Sep 30 20:15:06 localhost kernel: Linux version 5.14.0-617.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025
Sep 30 20:15:06 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Sep 30 20:15:06 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 20:15:06 localhost kernel: BIOS-provided physical RAM map:
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Sep 30 20:15:06 localhost kernel: NX (Execute Disable) protection: active
Sep 30 20:15:06 localhost kernel: APIC: Static calls initialized
Sep 30 20:15:06 localhost kernel: SMBIOS 2.8 present.
Sep 30 20:15:06 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Sep 30 20:15:06 localhost kernel: Hypervisor detected: KVM
Sep 30 20:15:06 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 30 20:15:06 localhost kernel: kvm-clock: using sched offset of 4416063171 cycles
Sep 30 20:15:06 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 30 20:15:06 localhost kernel: tsc: Detected 2799.998 MHz processor
Sep 30 20:15:06 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 30 20:15:06 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 30 20:15:06 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Sep 30 20:15:06 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 30 20:15:06 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Sep 30 20:15:06 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Sep 30 20:15:06 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Sep 30 20:15:06 localhost kernel: Using GB pages for direct mapping
Sep 30 20:15:06 localhost kernel: RAMDISK: [mem 0x2d7d0000-0x32bdffff]
Sep 30 20:15:06 localhost kernel: ACPI: Early table checksum verification disabled
Sep 30 20:15:06 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 30 20:15:06 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Sep 30 20:15:06 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Sep 30 20:15:06 localhost kernel: No NUMA configuration found
Sep 30 20:15:06 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Sep 30 20:15:06 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Sep 30 20:15:06 localhost kernel: Zone ranges:
Sep 30 20:15:06 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Sep 30 20:15:06 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Sep 30 20:15:06 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel:   Device   empty
Sep 30 20:15:06 localhost kernel: Movable zone start for each node
Sep 30 20:15:06 localhost kernel: Early memory node ranges
Sep 30 20:15:06 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Sep 30 20:15:06 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Sep 30 20:15:06 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 30 20:15:06 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 30 20:15:06 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Sep 30 20:15:06 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Sep 30 20:15:06 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 30 20:15:06 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 30 20:15:06 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 30 20:15:06 localhost kernel: TSC deadline timer available
Sep 30 20:15:06 localhost kernel: CPU topo: Max. logical packages:   8
Sep 30 20:15:06 localhost kernel: CPU topo: Max. logical dies:       8
Sep 30 20:15:06 localhost kernel: CPU topo: Max. dies per package:   1
Sep 30 20:15:06 localhost kernel: CPU topo: Max. threads per core:   1
Sep 30 20:15:06 localhost kernel: CPU topo: Num. cores per package:     1
Sep 30 20:15:06 localhost kernel: CPU topo: Num. threads per package:   1
Sep 30 20:15:06 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Sep 30 20:15:06 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Sep 30 20:15:06 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Sep 30 20:15:06 localhost kernel: Booting paravirtualized kernel on KVM
Sep 30 20:15:06 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 30 20:15:06 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Sep 30 20:15:06 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Sep 30 20:15:06 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Sep 30 20:15:06 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Sep 30 20:15:06 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 30 20:15:06 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 20:15:06 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64", will be passed to user space.
Sep 30 20:15:06 localhost kernel: random: crng init done
Sep 30 20:15:06 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 30 20:15:06 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 30 20:15:06 localhost kernel: Fallback order for Node 0: 0 
Sep 30 20:15:06 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Sep 30 20:15:06 localhost kernel: Policy zone: Normal
Sep 30 20:15:06 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 30 20:15:06 localhost kernel: software IO TLB: area num 8.
Sep 30 20:15:06 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Sep 30 20:15:06 localhost kernel: ftrace: allocating 49329 entries in 193 pages
Sep 30 20:15:06 localhost kernel: ftrace: allocated 193 pages with 3 groups
Sep 30 20:15:06 localhost kernel: Dynamic Preempt: voluntary
Sep 30 20:15:06 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 30 20:15:06 localhost kernel: rcu:         RCU event tracing is enabled.
Sep 30 20:15:06 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Sep 30 20:15:06 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Sep 30 20:15:06 localhost kernel:         Rude variant of Tasks RCU enabled.
Sep 30 20:15:06 localhost kernel:         Tracing variant of Tasks RCU enabled.
Sep 30 20:15:06 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 30 20:15:06 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Sep 30 20:15:06 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 20:15:06 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 20:15:06 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 20:15:06 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Sep 30 20:15:06 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 30 20:15:06 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Sep 30 20:15:06 localhost kernel: Console: colour VGA+ 80x25
Sep 30 20:15:06 localhost kernel: printk: console [ttyS0] enabled
Sep 30 20:15:06 localhost kernel: ACPI: Core revision 20230331
Sep 30 20:15:06 localhost kernel: APIC: Switch to symmetric I/O mode setup
Sep 30 20:15:06 localhost kernel: x2apic enabled
Sep 30 20:15:06 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Sep 30 20:15:06 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 30 20:15:06 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Sep 30 20:15:06 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 30 20:15:06 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 30 20:15:06 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 30 20:15:06 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 30 20:15:06 localhost kernel: Spectre V2 : Mitigation: Retpolines
Sep 30 20:15:06 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 30 20:15:06 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 30 20:15:06 localhost kernel: RETBleed: Mitigation: untrained return thunk
Sep 30 20:15:06 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 30 20:15:06 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 30 20:15:06 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 30 20:15:06 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 30 20:15:06 localhost kernel: x86/bugs: return thunk changed
Sep 30 20:15:06 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 30 20:15:06 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 30 20:15:06 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 30 20:15:06 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 30 20:15:06 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Sep 30 20:15:06 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 30 20:15:06 localhost kernel: Freeing SMP alternatives memory: 40K
Sep 30 20:15:06 localhost kernel: pid_max: default: 32768 minimum: 301
Sep 30 20:15:06 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Sep 30 20:15:06 localhost kernel: landlock: Up and running.
Sep 30 20:15:06 localhost kernel: Yama: becoming mindful.
Sep 30 20:15:06 localhost kernel: SELinux:  Initializing.
Sep 30 20:15:06 localhost kernel: LSM support for eBPF active
Sep 30 20:15:06 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 30 20:15:06 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 30 20:15:06 localhost kernel: ... version:                0
Sep 30 20:15:06 localhost kernel: ... bit width:              48
Sep 30 20:15:06 localhost kernel: ... generic registers:      6
Sep 30 20:15:06 localhost kernel: ... value mask:             0000ffffffffffff
Sep 30 20:15:06 localhost kernel: ... max period:             00007fffffffffff
Sep 30 20:15:06 localhost kernel: ... fixed-purpose events:   0
Sep 30 20:15:06 localhost kernel: ... event mask:             000000000000003f
Sep 30 20:15:06 localhost kernel: signal: max sigframe size: 1776
Sep 30 20:15:06 localhost kernel: rcu: Hierarchical SRCU implementation.
Sep 30 20:15:06 localhost kernel: rcu:         Max phase no-delay instances is 400.
Sep 30 20:15:06 localhost kernel: smp: Bringing up secondary CPUs ...
Sep 30 20:15:06 localhost kernel: smpboot: x86: Booting SMP configuration:
Sep 30 20:15:06 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Sep 30 20:15:06 localhost kernel: smp: Brought up 1 node, 8 CPUs
Sep 30 20:15:06 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Sep 30 20:15:06 localhost kernel: node 0 deferred pages initialised in 19ms
Sep 30 20:15:06 localhost kernel: Memory: 7765688K/8388068K available (16384K kernel code, 5784K rwdata, 13988K rodata, 4072K init, 7304K bss, 616480K reserved, 0K cma-reserved)
Sep 30 20:15:06 localhost kernel: devtmpfs: initialized
Sep 30 20:15:06 localhost kernel: x86/mm: Memory block size: 128MB
Sep 30 20:15:06 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 30 20:15:06 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: pinctrl core: initialized pinctrl subsystem
Sep 30 20:15:06 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 30 20:15:06 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Sep 30 20:15:06 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 30 20:15:06 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 30 20:15:06 localhost kernel: audit: initializing netlink subsys (disabled)
Sep 30 20:15:06 localhost kernel: audit: type=2000 audit(1759263305.484:1): state=initialized audit_enabled=0 res=1
Sep 30 20:15:06 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Sep 30 20:15:06 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 30 20:15:06 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 30 20:15:06 localhost kernel: cpuidle: using governor menu
Sep 30 20:15:06 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 30 20:15:06 localhost kernel: PCI: Using configuration type 1 for base access
Sep 30 20:15:06 localhost kernel: PCI: Using configuration type 1 for extended access
Sep 30 20:15:06 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 30 20:15:06 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 30 20:15:06 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 30 20:15:06 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 30 20:15:06 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 30 20:15:06 localhost kernel: Demotion targets for Node 0: null
Sep 30 20:15:06 localhost kernel: cryptd: max_cpu_qlen set to 1000
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(Module Device)
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(Processor Device)
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 30 20:15:06 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 30 20:15:06 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 30 20:15:06 localhost kernel: ACPI: Interpreter enabled
Sep 30 20:15:06 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Sep 30 20:15:06 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Sep 30 20:15:06 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 30 20:15:06 localhost kernel: PCI: Using E820 reservations for host bridge windows
Sep 30 20:15:06 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Sep 30 20:15:06 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 30 20:15:06 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [3] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [4] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [5] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [6] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [7] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [8] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [9] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [10] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [11] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [12] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [13] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [14] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [15] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [16] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [17] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [18] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [19] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [20] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [21] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [22] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [23] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [24] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [25] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [26] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [27] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [28] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [29] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [30] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [31] registered
Sep 30 20:15:06 localhost kernel: PCI host bridge to bus 0000:00
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 30 20:15:06 localhost kernel: iommu: Default domain type: Translated
Sep 30 20:15:06 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 30 20:15:06 localhost kernel: SCSI subsystem initialized
Sep 30 20:15:06 localhost kernel: ACPI: bus type USB registered
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver usbfs
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver hub
Sep 30 20:15:06 localhost kernel: usbcore: registered new device driver usb
Sep 30 20:15:06 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 30 20:15:06 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Sep 30 20:15:06 localhost kernel: PTP clock support registered
Sep 30 20:15:06 localhost kernel: EDAC MC: Ver: 3.0.0
Sep 30 20:15:06 localhost kernel: NetLabel: Initializing
Sep 30 20:15:06 localhost kernel: NetLabel:  domain hash size = 128
Sep 30 20:15:06 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Sep 30 20:15:06 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Sep 30 20:15:06 localhost kernel: PCI: Using ACPI for IRQ routing
Sep 30 20:15:06 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 30 20:15:06 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 30 20:15:06 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 30 20:15:06 localhost kernel: vgaarb: loaded
Sep 30 20:15:06 localhost kernel: clocksource: Switched to clocksource kvm-clock
Sep 30 20:15:06 localhost kernel: VFS: Disk quotas dquot_6.6.0
Sep 30 20:15:06 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 30 20:15:06 localhost kernel: pnp: PnP ACPI init
Sep 30 20:15:06 localhost kernel: pnp 00:03: [dma 2]
Sep 30 20:15:06 localhost kernel: pnp: PnP ACPI: found 5 devices
Sep 30 20:15:06 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 30 20:15:06 localhost kernel: NET: Registered PF_INET protocol family
Sep 30 20:15:06 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 30 20:15:06 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 30 20:15:06 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 30 20:15:06 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 30 20:15:06 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Sep 30 20:15:06 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 30 20:15:06 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Sep 30 20:15:06 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 30 20:15:06 localhost kernel: NET: Registered PF_XDP protocol family
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Sep 30 20:15:06 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 82875 usecs
Sep 30 20:15:06 localhost kernel: PCI: CLS 0 bytes, default 64
Sep 30 20:15:06 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 30 20:15:06 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Sep 30 20:15:06 localhost kernel: ACPI: bus type thunderbolt registered
Sep 30 20:15:06 localhost kernel: Trying to unpack rootfs image as initramfs...
Sep 30 20:15:06 localhost kernel: Initialise system trusted keyrings
Sep 30 20:15:06 localhost kernel: Key type blacklist registered
Sep 30 20:15:06 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Sep 30 20:15:06 localhost kernel: zbud: loaded
Sep 30 20:15:06 localhost kernel: integrity: Platform Keyring initialized
Sep 30 20:15:06 localhost kernel: integrity: Machine keyring initialized
Sep 30 20:15:06 localhost kernel: Freeing initrd memory: 86080K
Sep 30 20:15:06 localhost kernel: NET: Registered PF_ALG protocol family
Sep 30 20:15:06 localhost kernel: xor: automatically using best checksumming function   avx       
Sep 30 20:15:06 localhost kernel: Key type asymmetric registered
Sep 30 20:15:06 localhost kernel: Asymmetric key parser 'x509' registered
Sep 30 20:15:06 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Sep 30 20:15:06 localhost kernel: io scheduler mq-deadline registered
Sep 30 20:15:06 localhost kernel: io scheduler kyber registered
Sep 30 20:15:06 localhost kernel: io scheduler bfq registered
Sep 30 20:15:06 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Sep 30 20:15:06 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Sep 30 20:15:06 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Sep 30 20:15:06 localhost kernel: ACPI: button: Power Button [PWRF]
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 30 20:15:06 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 30 20:15:06 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 30 20:15:06 localhost kernel: Non-volatile memory driver v1.3
Sep 30 20:15:06 localhost kernel: rdac: device handler registered
Sep 30 20:15:06 localhost kernel: hp_sw: device handler registered
Sep 30 20:15:06 localhost kernel: emc: device handler registered
Sep 30 20:15:06 localhost kernel: alua: device handler registered
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Sep 30 20:15:06 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Sep 30 20:15:06 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Sep 30 20:15:06 localhost kernel: usb usb1: Product: UHCI Host Controller
Sep 30 20:15:06 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-617.el9.x86_64 uhci_hcd
Sep 30 20:15:06 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Sep 30 20:15:06 localhost kernel: hub 1-0:1.0: USB hub found
Sep 30 20:15:06 localhost kernel: hub 1-0:1.0: 2 ports detected
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver usbserial_generic
Sep 30 20:15:06 localhost kernel: usbserial: USB Serial support registered for generic
Sep 30 20:15:06 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 30 20:15:06 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 30 20:15:06 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 30 20:15:06 localhost kernel: mousedev: PS/2 mouse device common for all mice
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 30 20:15:06 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: registered as rtc0
Sep 30 20:15:06 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-09-30T20:15:05 UTC (1759263305)
Sep 30 20:15:06 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 30 20:15:06 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 30 20:15:06 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver usbhid
Sep 30 20:15:06 localhost kernel: usbhid: USB HID core driver
Sep 30 20:15:06 localhost kernel: drop_monitor: Initializing network drop monitor service
Sep 30 20:15:06 localhost kernel: Initializing XFRM netlink socket
Sep 30 20:15:06 localhost kernel: NET: Registered PF_INET6 protocol family
Sep 30 20:15:06 localhost kernel: Segment Routing with IPv6
Sep 30 20:15:06 localhost kernel: NET: Registered PF_PACKET protocol family
Sep 30 20:15:06 localhost kernel: mpls_gso: MPLS GSO support
Sep 30 20:15:06 localhost kernel: IPI shorthand broadcast: enabled
Sep 30 20:15:06 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Sep 30 20:15:06 localhost kernel: AES CTR mode by8 optimization enabled
Sep 30 20:15:06 localhost kernel: sched_clock: Marking stable (1298007512, 156946742)->(1574779984, -119825730)
Sep 30 20:15:06 localhost kernel: registered taskstats version 1
Sep 30 20:15:06 localhost kernel: Loading compiled-in X.509 certificates
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Sep 30 20:15:06 localhost kernel: Demotion targets for Node 0: null
Sep 30 20:15:06 localhost kernel: page_owner is disabled
Sep 30 20:15:06 localhost kernel: Key type .fscrypt registered
Sep 30 20:15:06 localhost kernel: Key type fscrypt-provisioning registered
Sep 30 20:15:06 localhost kernel: Key type big_key registered
Sep 30 20:15:06 localhost kernel: Key type encrypted registered
Sep 30 20:15:06 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 30 20:15:06 localhost kernel: Loading compiled-in module X.509 certificates
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 20:15:06 localhost kernel: ima: Allocated hash algorithm: sha256
Sep 30 20:15:06 localhost kernel: ima: No architecture policies found
Sep 30 20:15:06 localhost kernel: evm: Initialising EVM extended attributes:
Sep 30 20:15:06 localhost kernel: evm: security.selinux
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64 (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64EXEC (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64MMAP (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.apparmor (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.ima
Sep 30 20:15:06 localhost kernel: evm: security.capability
Sep 30 20:15:06 localhost kernel: evm: HMAC attrs: 0x1
Sep 30 20:15:06 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Sep 30 20:15:06 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Sep 30 20:15:06 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Sep 30 20:15:06 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Sep 30 20:15:06 localhost kernel: usb 1-1: Manufacturer: QEMU
Sep 30 20:15:06 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Sep 30 20:15:06 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Sep 30 20:15:06 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Sep 30 20:15:06 localhost kernel: Running certificate verification RSA selftest
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Sep 30 20:15:06 localhost kernel: Running certificate verification ECDSA selftest
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Sep 30 20:15:06 localhost kernel: clk: Disabling unused clocks
Sep 30 20:15:06 localhost kernel: Freeing unused decrypted memory: 2028K
Sep 30 20:15:06 localhost kernel: Freeing unused kernel image (initmem) memory: 4072K
Sep 30 20:15:06 localhost kernel: Write protecting the kernel read-only data: 30720k
Sep 30 20:15:06 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 348K
Sep 30 20:15:06 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Sep 30 20:15:06 localhost kernel: Run /init as init process
Sep 30 20:15:06 localhost kernel:   with arguments:
Sep 30 20:15:06 localhost kernel:     /init
Sep 30 20:15:06 localhost kernel:   with environment:
Sep 30 20:15:06 localhost kernel:     HOME=/
Sep 30 20:15:06 localhost kernel:     TERM=linux
Sep 30 20:15:06 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64
Sep 30 20:15:06 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 20:15:06 localhost systemd[1]: Detected virtualization kvm.
Sep 30 20:15:06 localhost systemd[1]: Detected architecture x86-64.
Sep 30 20:15:06 localhost systemd[1]: Running in initrd.
Sep 30 20:15:06 localhost systemd[1]: No hostname configured, using default hostname.
Sep 30 20:15:06 localhost systemd[1]: Hostname set to <localhost>.
Sep 30 20:15:06 localhost systemd[1]: Initializing machine ID from VM UUID.
Sep 30 20:15:06 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Sep 30 20:15:06 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 20:15:06 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 20:15:06 localhost systemd[1]: Reached target Initrd /usr File System.
Sep 30 20:15:06 localhost systemd[1]: Reached target Local File Systems.
Sep 30 20:15:06 localhost systemd[1]: Reached target Path Units.
Sep 30 20:15:06 localhost systemd[1]: Reached target Slice Units.
Sep 30 20:15:06 localhost systemd[1]: Reached target Swaps.
Sep 30 20:15:06 localhost systemd[1]: Reached target Timer Units.
Sep 30 20:15:06 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 20:15:06 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Sep 30 20:15:06 localhost systemd[1]: Listening on Journal Socket.
Sep 30 20:15:06 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 20:15:06 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 20:15:06 localhost systemd[1]: Reached target Socket Units.
Sep 30 20:15:06 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 20:15:06 localhost systemd[1]: Starting Journal Service...
Sep 30 20:15:06 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 20:15:06 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 20:15:06 localhost systemd[1]: Starting Create System Users...
Sep 30 20:15:06 localhost systemd[1]: Starting Setup Virtual Console...
Sep 30 20:15:06 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 20:15:06 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 20:15:06 localhost systemd[1]: Finished Create System Users.
Sep 30 20:15:06 localhost systemd-journald[308]: Journal started
Sep 30 20:15:06 localhost systemd-journald[308]: Runtime Journal (/run/log/journal/1cca553b9dfa48668e03ea5bf0299bf0) is 8.0M, max 153.5M, 145.5M free.
Sep 30 20:15:06 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Sep 30 20:15:06 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Sep 30 20:15:06 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Sep 30 20:15:06 localhost systemd[1]: Started Journal Service.
Sep 30 20:15:06 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 20:15:06 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 20:15:06 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 20:15:06 localhost systemd[1]: Finished Setup Virtual Console.
Sep 30 20:15:06 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Sep 30 20:15:06 localhost systemd[1]: Starting dracut cmdline hook...
Sep 30 20:15:06 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Sep 30 20:15:06 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 20:15:06 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 20:15:06 localhost systemd[1]: Finished dracut cmdline hook.
Sep 30 20:15:06 localhost systemd[1]: Starting dracut pre-udev hook...
Sep 30 20:15:06 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 30 20:15:06 localhost kernel: device-mapper: uevent: version 1.0.3
Sep 30 20:15:06 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Sep 30 20:15:06 localhost kernel: RPC: Registered named UNIX socket transport module.
Sep 30 20:15:06 localhost kernel: RPC: Registered udp transport module.
Sep 30 20:15:06 localhost kernel: RPC: Registered tcp transport module.
Sep 30 20:15:06 localhost kernel: RPC: Registered tcp-with-tls transport module.
Sep 30 20:15:06 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Sep 30 20:15:06 localhost rpc.statd[443]: Version 2.5.4 starting
Sep 30 20:15:06 localhost rpc.statd[443]: Initializing NSM state
Sep 30 20:15:07 localhost rpc.idmapd[448]: Setting log level to 0
Sep 30 20:15:07 localhost systemd[1]: Finished dracut pre-udev hook.
Sep 30 20:15:07 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 20:15:07 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 20:15:07 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 20:15:07 localhost systemd[1]: Starting dracut pre-trigger hook...
Sep 30 20:15:07 localhost systemd[1]: Finished dracut pre-trigger hook.
Sep 30 20:15:07 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 20:15:07 localhost systemd[1]: Created slice Slice /system/modprobe.
Sep 30 20:15:07 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 20:15:07 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 20:15:07 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 20:15:07 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 20:15:07 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 20:15:07 localhost systemd[1]: Reached target Network.
Sep 30 20:15:07 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 20:15:07 localhost systemd[1]: Starting dracut initqueue hook...
Sep 30 20:15:07 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Sep 30 20:15:07 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Sep 30 20:15:07 localhost kernel:  vda: vda1
Sep 30 20:15:07 localhost kernel: libata version 3.00 loaded.
Sep 30 20:15:07 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Sep 30 20:15:07 localhost systemd-udevd[473]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:15:07 localhost kernel: scsi host0: ata_piix
Sep 30 20:15:07 localhost kernel: scsi host1: ata_piix
Sep 30 20:15:07 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Sep 30 20:15:07 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Sep 30 20:15:07 localhost systemd[1]: Mounting Kernel Configuration File System...
Sep 30 20:15:07 localhost systemd[1]: Mounted Kernel Configuration File System.
Sep 30 20:15:07 localhost systemd[1]: Found device /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 20:15:07 localhost systemd[1]: Reached target Initrd Root Device.
Sep 30 20:15:07 localhost systemd[1]: Reached target System Initialization.
Sep 30 20:15:07 localhost systemd[1]: Reached target Basic System.
Sep 30 20:15:07 localhost kernel: ata1: found unknown device (class 0)
Sep 30 20:15:07 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 30 20:15:07 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Sep 30 20:15:07 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Sep 30 20:15:07 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 30 20:15:07 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 30 20:15:07 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 30 20:15:07 localhost systemd[1]: Finished dracut initqueue hook.
Sep 30 20:15:07 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 20:15:07 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Sep 30 20:15:07 localhost systemd[1]: Reached target Remote File Systems.
Sep 30 20:15:07 localhost systemd[1]: Starting dracut pre-mount hook...
Sep 30 20:15:07 localhost systemd[1]: Finished dracut pre-mount hook.
Sep 30 20:15:07 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8...
Sep 30 20:15:07 localhost systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Sep 30 20:15:07 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 20:15:07 localhost systemd[1]: Mounting /sysroot...
Sep 30 20:15:08 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Sep 30 20:15:08 localhost kernel: XFS (vda1): Mounting V5 Filesystem d6a81468-b74c-4055-b485-def635ab40f8
Sep 30 20:15:08 localhost kernel: XFS (vda1): Ending clean mount
Sep 30 20:15:08 localhost systemd[1]: Mounted /sysroot.
Sep 30 20:15:08 localhost systemd[1]: Reached target Initrd Root File System.
Sep 30 20:15:08 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Sep 30 20:15:08 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Sep 30 20:15:08 localhost systemd[1]: Reached target Initrd File Systems.
Sep 30 20:15:08 localhost systemd[1]: Reached target Initrd Default Target.
Sep 30 20:15:08 localhost systemd[1]: Starting dracut mount hook...
Sep 30 20:15:08 localhost systemd[1]: Finished dracut mount hook.
Sep 30 20:15:08 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Sep 30 20:15:08 localhost rpc.idmapd[448]: exiting on signal 15
Sep 30 20:15:08 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Sep 30 20:15:08 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Sep 30 20:15:08 localhost systemd[1]: Stopped target Network.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Timer Units.
Sep 30 20:15:08 localhost systemd[1]: dbus.socket: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Sep 30 20:15:08 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Initrd Default Target.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Basic System.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Initrd Root Device.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Initrd /usr File System.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Path Units.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Remote File Systems.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Slice Units.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Socket Units.
Sep 30 20:15:08 localhost systemd[1]: Stopped target System Initialization.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Local File Systems.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Swaps.
Sep 30 20:15:08 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped dracut mount hook.
Sep 30 20:15:08 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped dracut pre-mount hook.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Sep 30 20:15:08 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Sep 30 20:15:08 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped dracut initqueue hook.
Sep 30 20:15:08 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Apply Kernel Variables.
Sep 30 20:15:08 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Sep 30 20:15:08 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Coldplug All udev Devices.
Sep 30 20:15:08 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped dracut pre-trigger hook.
Sep 30 20:15:08 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Sep 30 20:15:08 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Setup Virtual Console.
Sep 30 20:15:08 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Sep 30 20:15:08 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Sep 30 20:15:08 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Closed udev Control Socket.
Sep 30 20:15:08 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Closed udev Kernel Socket.
Sep 30 20:15:08 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped dracut pre-udev hook.
Sep 30 20:15:08 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped dracut cmdline hook.
Sep 30 20:15:08 localhost systemd[1]: Starting Cleanup udev Database...
Sep 30 20:15:08 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Sep 30 20:15:08 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Sep 30 20:15:08 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Stopped Create System Users.
Sep 30 20:15:08 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Finished Cleanup udev Database.
Sep 30 20:15:08 localhost systemd[1]: Reached target Switch Root.
Sep 30 20:15:08 localhost systemd[1]: Starting Switch Root...
Sep 30 20:15:08 localhost systemd[1]: Switching root.
Sep 30 20:15:08 localhost systemd-journald[308]: Journal stopped
Sep 30 20:15:09 localhost systemd-journald[308]: Received SIGTERM from PID 1 (systemd).
Sep 30 20:15:09 localhost kernel: audit: type=1404 audit(1759263308.833:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Sep 30 20:15:09 localhost kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:15:09 localhost kernel: SELinux:  policy capability open_perms=1
Sep 30 20:15:09 localhost kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:15:09 localhost kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:15:09 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:15:09 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:15:09 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:15:09 localhost kernel: audit: type=1403 audit(1759263308.998:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 30 20:15:09 localhost systemd[1]: Successfully loaded SELinux policy in 169.588ms.
Sep 30 20:15:09 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.653ms.
Sep 30 20:15:09 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 20:15:09 localhost systemd[1]: Detected virtualization kvm.
Sep 30 20:15:09 localhost systemd[1]: Detected architecture x86-64.
Sep 30 20:15:09 localhost systemd-rc-local-generator[635]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:15:09 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Switch Root.
Sep 30 20:15:09 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 30 20:15:09 localhost systemd[1]: Created slice Slice /system/getty.
Sep 30 20:15:09 localhost systemd[1]: Created slice Slice /system/serial-getty.
Sep 30 20:15:09 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Sep 30 20:15:09 localhost systemd[1]: Created slice User and Session Slice.
Sep 30 20:15:09 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 20:15:09 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Sep 30 20:15:09 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Sep 30 20:15:09 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Switch Root.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Initrd File Systems.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Initrd Root File System.
Sep 30 20:15:09 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Sep 30 20:15:09 localhost systemd[1]: Reached target Path Units.
Sep 30 20:15:09 localhost systemd[1]: Reached target rpc_pipefs.target.
Sep 30 20:15:09 localhost systemd[1]: Reached target Slice Units.
Sep 30 20:15:09 localhost systemd[1]: Reached target Swaps.
Sep 30 20:15:09 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Sep 30 20:15:09 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Sep 30 20:15:09 localhost systemd[1]: Reached target RPC Port Mapper.
Sep 30 20:15:09 localhost systemd[1]: Listening on Process Core Dump Socket.
Sep 30 20:15:09 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Sep 30 20:15:09 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 20:15:09 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 20:15:09 localhost systemd[1]: Mounting Huge Pages File System...
Sep 30 20:15:09 localhost systemd[1]: Mounting POSIX Message Queue File System...
Sep 30 20:15:09 localhost systemd[1]: Mounting Kernel Debug File System...
Sep 30 20:15:09 localhost systemd[1]: Mounting Kernel Trace File System...
Sep 30 20:15:09 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 20:15:09 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 20:15:09 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 20:15:09 localhost systemd[1]: Starting Load Kernel Module drm...
Sep 30 20:15:09 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Sep 30 20:15:09 localhost systemd[1]: Starting Load Kernel Module fuse...
Sep 30 20:15:09 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Sep 30 20:15:09 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped File System Check on Root Device.
Sep 30 20:15:09 localhost systemd[1]: Stopped Journal Service.
Sep 30 20:15:09 localhost systemd[1]: Starting Journal Service...
Sep 30 20:15:09 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 20:15:09 localhost systemd[1]: Starting Generate network units from Kernel command line...
Sep 30 20:15:09 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 20:15:09 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Sep 30 20:15:09 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 30 20:15:09 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 20:15:09 localhost kernel: fuse: init (API version 7.37)
Sep 30 20:15:09 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 20:15:09 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Sep 30 20:15:09 localhost systemd[1]: Mounted Huge Pages File System.
Sep 30 20:15:09 localhost systemd[1]: Mounted POSIX Message Queue File System.
Sep 30 20:15:09 localhost systemd[1]: Mounted Kernel Debug File System.
Sep 30 20:15:09 localhost systemd[1]: Mounted Kernel Trace File System.
Sep 30 20:15:09 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 20:15:09 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 20:15:09 localhost systemd-journald[676]: Journal started
Sep 30 20:15:09 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 20:15:09 localhost systemd[1]: Queued start job for default target Multi-User System.
Sep 30 20:15:09 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Started Journal Service.
Sep 30 20:15:09 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Sep 30 20:15:09 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Finished Load Kernel Module fuse.
Sep 30 20:15:09 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Sep 30 20:15:09 localhost systemd[1]: Finished Generate network units from Kernel command line.
Sep 30 20:15:09 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Sep 30 20:15:09 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 20:15:09 localhost systemd[1]: Mounting FUSE Control File System...
Sep 30 20:15:09 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 20:15:09 localhost systemd[1]: Starting Rebuild Hardware Database...
Sep 30 20:15:09 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Sep 30 20:15:09 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 30 20:15:09 localhost systemd[1]: Starting Load/Save OS Random Seed...
Sep 30 20:15:09 localhost kernel: ACPI: bus type drm_connector registered
Sep 30 20:15:09 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 20:15:09 localhost systemd-journald[676]: Received client request to flush runtime journal.
Sep 30 20:15:09 localhost systemd[1]: Starting Create System Users...
Sep 30 20:15:09 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Finished Load Kernel Module drm.
Sep 30 20:15:09 localhost systemd[1]: Mounted FUSE Control File System.
Sep 30 20:15:09 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Sep 30 20:15:09 localhost systemd[1]: Finished Load/Save OS Random Seed.
Sep 30 20:15:09 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 20:15:09 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 20:15:09 localhost systemd[1]: Finished Create System Users.
Sep 30 20:15:09 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 20:15:09 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 20:15:09 localhost systemd[1]: Reached target Preparation for Local File Systems.
Sep 30 20:15:09 localhost systemd[1]: Reached target Local File Systems.
Sep 30 20:15:09 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Sep 30 20:15:09 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Sep 30 20:15:09 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 30 20:15:09 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Sep 30 20:15:09 localhost systemd[1]: Starting Automatic Boot Loader Update...
Sep 30 20:15:09 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Sep 30 20:15:09 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 20:15:09 localhost bootctl[694]: Couldn't find EFI system partition, skipping.
Sep 30 20:15:09 localhost systemd[1]: Finished Automatic Boot Loader Update.
Sep 30 20:15:10 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 20:15:10 localhost systemd[1]: Starting Security Auditing Service...
Sep 30 20:15:10 localhost systemd[1]: Starting RPC Bind...
Sep 30 20:15:10 localhost systemd[1]: Starting Rebuild Journal Catalog...
Sep 30 20:15:10 localhost auditd[700]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Sep 30 20:15:10 localhost auditd[700]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Sep 30 20:15:10 localhost systemd[1]: Started RPC Bind.
Sep 30 20:15:10 localhost systemd[1]: Finished Rebuild Journal Catalog.
Sep 30 20:15:10 localhost augenrules[705]: /sbin/augenrules: No change
Sep 30 20:15:10 localhost augenrules[720]: No rules
Sep 30 20:15:10 localhost augenrules[720]: enabled 1
Sep 30 20:15:10 localhost augenrules[720]: failure 1
Sep 30 20:15:10 localhost augenrules[720]: pid 700
Sep 30 20:15:10 localhost augenrules[720]: rate_limit 0
Sep 30 20:15:10 localhost augenrules[720]: backlog_limit 8192
Sep 30 20:15:10 localhost augenrules[720]: lost 0
Sep 30 20:15:10 localhost augenrules[720]: backlog 3
Sep 30 20:15:10 localhost augenrules[720]: backlog_wait_time 60000
Sep 30 20:15:10 localhost augenrules[720]: backlog_wait_time_actual 0
Sep 30 20:15:10 localhost augenrules[720]: enabled 1
Sep 30 20:15:10 localhost augenrules[720]: failure 1
Sep 30 20:15:10 localhost augenrules[720]: pid 700
Sep 30 20:15:10 localhost augenrules[720]: rate_limit 0
Sep 30 20:15:10 localhost augenrules[720]: backlog_limit 8192
Sep 30 20:15:10 localhost augenrules[720]: lost 0
Sep 30 20:15:10 localhost augenrules[720]: backlog 0
Sep 30 20:15:10 localhost augenrules[720]: backlog_wait_time 60000
Sep 30 20:15:10 localhost augenrules[720]: backlog_wait_time_actual 0
Sep 30 20:15:10 localhost augenrules[720]: enabled 1
Sep 30 20:15:10 localhost augenrules[720]: failure 1
Sep 30 20:15:10 localhost augenrules[720]: pid 700
Sep 30 20:15:10 localhost augenrules[720]: rate_limit 0
Sep 30 20:15:10 localhost augenrules[720]: backlog_limit 8192
Sep 30 20:15:10 localhost augenrules[720]: lost 0
Sep 30 20:15:10 localhost augenrules[720]: backlog 3
Sep 30 20:15:10 localhost augenrules[720]: backlog_wait_time 60000
Sep 30 20:15:10 localhost augenrules[720]: backlog_wait_time_actual 0
Sep 30 20:15:10 localhost systemd[1]: Started Security Auditing Service.
Sep 30 20:15:10 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Sep 30 20:15:10 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Sep 30 20:15:10 localhost systemd[1]: Finished Rebuild Hardware Database.
Sep 30 20:15:10 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 20:15:10 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Sep 30 20:15:10 localhost systemd[1]: Starting Update is Completed...
Sep 30 20:15:10 localhost systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 20:15:10 localhost systemd[1]: Finished Update is Completed.
Sep 30 20:15:10 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 20:15:10 localhost systemd[1]: Reached target System Initialization.
Sep 30 20:15:10 localhost systemd[1]: Started dnf makecache --timer.
Sep 30 20:15:10 localhost systemd[1]: Started Daily rotation of log files.
Sep 30 20:15:10 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Sep 30 20:15:10 localhost systemd[1]: Reached target Timer Units.
Sep 30 20:15:10 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 20:15:10 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Sep 30 20:15:10 localhost systemd[1]: Reached target Socket Units.
Sep 30 20:15:10 localhost systemd[1]: Starting D-Bus System Message Bus...
Sep 30 20:15:10 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 20:15:10 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Sep 30 20:15:10 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 20:15:10 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 20:15:10 localhost systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:15:10 localhost systemd[1]: Started D-Bus System Message Bus.
Sep 30 20:15:10 localhost systemd[1]: Reached target Basic System.
Sep 30 20:15:10 localhost dbus-broker-lau[764]: Ready
Sep 30 20:15:10 localhost systemd[1]: Starting NTP client/server...
Sep 30 20:15:10 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Sep 30 20:15:10 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Sep 30 20:15:10 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Sep 30 20:15:10 localhost systemd[1]: Starting IPv4 firewall with iptables...
Sep 30 20:15:10 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Sep 30 20:15:10 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 30 20:15:10 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 30 20:15:10 localhost systemd[1]: Started irqbalance daemon.
Sep 30 20:15:10 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Sep 30 20:15:10 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 20:15:10 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 20:15:10 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 20:15:10 localhost systemd[1]: Reached target sshd-keygen.target.
Sep 30 20:15:10 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Sep 30 20:15:10 localhost systemd[1]: Reached target User and Group Name Lookups.
Sep 30 20:15:10 localhost chronyd[797]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 20:15:10 localhost chronyd[797]: Loaded 0 symmetric keys
Sep 30 20:15:10 localhost chronyd[797]: Using right/UTC timezone to obtain leap second data
Sep 30 20:15:10 localhost chronyd[797]: Loaded seccomp filter (level 2)
Sep 30 20:15:10 localhost systemd[1]: Starting User Login Management...
Sep 30 20:15:10 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Sep 30 20:15:10 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Sep 30 20:15:10 localhost systemd[1]: Started NTP client/server.
Sep 30 20:15:10 localhost kernel: Console: switching to colour dummy device 80x25
Sep 30 20:15:10 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 30 20:15:10 localhost kernel: [drm] features: -context_init
Sep 30 20:15:10 localhost kernel: [drm] number of scanouts: 1
Sep 30 20:15:10 localhost kernel: [drm] number of cap sets: 0
Sep 30 20:15:10 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Sep 30 20:15:10 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Sep 30 20:15:10 localhost kernel: Console: switching to colour frame buffer device 128x48
Sep 30 20:15:10 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 30 20:15:10 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Sep 30 20:15:10 localhost systemd-logind[793]: New seat seat0.
Sep 30 20:15:10 localhost systemd-logind[793]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 20:15:10 localhost systemd-logind[793]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 20:15:10 localhost systemd[1]: Started User Login Management.
Sep 30 20:15:10 localhost kernel: kvm_amd: TSC scaling supported
Sep 30 20:15:10 localhost kernel: kvm_amd: Nested Virtualization enabled
Sep 30 20:15:10 localhost kernel: kvm_amd: Nested Paging enabled
Sep 30 20:15:10 localhost kernel: kvm_amd: LBR virtualization supported
Sep 30 20:15:10 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Sep 30 20:15:10 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Sep 30 20:15:10 localhost iptables.init[786]: iptables: Applying firewall rules: [  OK  ]
Sep 30 20:15:10 localhost systemd[1]: Finished IPv4 firewall with iptables.
Sep 30 20:15:11 localhost cloud-init[840]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 30 Sep 2025 20:15:11 +0000. Up 7.10 seconds.
Sep 30 20:15:11 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Sep 30 20:15:11 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Sep 30 20:15:11 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpfub9o9a7.mount: Deactivated successfully.
Sep 30 20:15:11 localhost systemd[1]: Starting Hostname Service...
Sep 30 20:15:11 localhost systemd[1]: Started Hostname Service.
Sep 30 20:15:11 np0005463581.novalocal systemd-hostnamed[854]: Hostname set to <np0005463581.novalocal> (static)
Sep 30 20:15:11 np0005463581.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Sep 30 20:15:11 np0005463581.novalocal systemd[1]: Reached target Preparation for Network.
Sep 30 20:15:11 np0005463581.novalocal systemd[1]: Starting Network Manager...
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0521] NetworkManager (version 1.54.1-1.el9) is starting... (boot:3eba4184-9928-4449-a716-6939f6e53713)
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0527] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0706] manager[0x5596c77eb080]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0774] hostname: hostname: using hostnamed
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0775] hostname: static hostname changed from (none) to "np0005463581.novalocal"
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0781] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0945] manager[0x5596c77eb080]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.0945] manager[0x5596c77eb080]: rfkill: WWAN hardware radio set enabled
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1055] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1055] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1056] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1056] manager: Networking is enabled by state file
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1062] settings: Loaded settings plugin: keyfile (internal)
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1100] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1135] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1169] dhcp: init: Using DHCP client 'internal'
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1174] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1194] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1208] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1220] device (lo): Activation: starting connection 'lo' (e0c59d25-6b1e-4298-8ada-e0f1bea62f04)
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1234] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1238] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Started Network Manager.
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1284] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1289] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1292] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1295] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Reached target Network.
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1298] device (eth0): carrier: link connected
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1302] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1312] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1322] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1329] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1330] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1333] manager: NetworkManager state is now CONNECTING
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1335] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1345] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1348] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1414] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1440] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1480] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1499] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1507] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1521] device (lo): Activation: successful, device activated.
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1540] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1546] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1562] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1574] device (eth0): Activation: successful, device activated.
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1586] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 20:15:12 np0005463581.novalocal NetworkManager[858]: <info>  [1759263312.1593] manager: startup complete
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Reached target NFS client services.
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Reached target Remote File Systems.
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 20:15:12 np0005463581.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 30 Sep 2025 20:15:12 +0000. Up 8.23 seconds.
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |  eth0  | True |         38.102.83.50         | 255.255.255.0 | global | fa:16:3e:80:33:fc |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe80:33fc/64 |       .       |  link  | fa:16:3e:80:33:fc |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Sep 30 20:15:12 np0005463581.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 20:15:13 np0005463581.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Sep 30 20:15:13 np0005463581.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Sep 30 20:15:13 np0005463581.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Sep 30 20:15:13 np0005463581.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Sep 30 20:15:13 np0005463581.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Sep 30 20:15:13 np0005463581.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Generating public/private rsa key pair.
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: The key fingerprint is:
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: SHA256:XZqAREH7515CdDBWb/MGQHjM79RH9PJjkBAMDxebwfs root@np0005463581.novalocal
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: The key's randomart image is:
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: +---[RSA 3072]----+
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |     o=. oB%*  ..|
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |     . o .==Oo. o|
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |      o . .=++*o.|
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |       . + * .+*o|
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |        S * .o o=|
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |         +   Eo..|
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |          o .    |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |         . o     |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |          .      |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: +----[SHA256]-----+
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: The key fingerprint is:
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: SHA256:ll4u9igOeIt3cm+J4I1ziMEIH7A/B1Uu9E9JBrRVfms root@np0005463581.novalocal
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: The key's randomart image is:
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: +---[ECDSA 256]---+
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |    .o+.+..      |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |.  ..o = o       |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: | o .. + o . .    |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |o o  . o . . .   |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |.= o    S . E    |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |. *.o  o o .     |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |  .*o= .+..      |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |  .oBoB.o+       |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |  ...B.+o .      |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: +----[SHA256]-----+
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: The key fingerprint is:
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: SHA256:GtjV44ajmbAwSSrqkeoEBh/6U/kJnfNa1DE4I9EI65M root@np0005463581.novalocal
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: The key's randomart image is:
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: +--[ED25519 256]--+
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |    ...o         |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |     ....o       |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |. o . . = =      |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |.= + * + * +     |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |=.= E * S +      |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |=..+ * @ o       |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |.+o . B o        |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |+ ..   o         |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: |oo    .          |
Sep 30 20:15:13 np0005463581.novalocal cloud-init[922]: +----[SHA256]-----+
Sep 30 20:15:13 np0005463581.novalocal sm-notify[1004]: Version 2.5.4 starting
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Sep 30 20:15:13 np0005463581.novalocal sshd[1006]: Server listening on 0.0.0.0 port 22.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Reached target Cloud-config availability.
Sep 30 20:15:13 np0005463581.novalocal sshd[1006]: Server listening on :: port 22.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Reached target Network is Online.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Starting System Logging Service...
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Starting OpenSSH server daemon...
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Starting Permit User Sessions...
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Started Notify NFS peers of a restart.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Started OpenSSH server daemon.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Finished Permit User Sessions.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Started Command Scheduler.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Started Getty on tty1.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Started Serial Getty on ttyS0.
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Reached target Login Prompts.
Sep 30 20:15:13 np0005463581.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Sep 30 20:15:13 np0005463581.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Sep 30 20:15:13 np0005463581.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 56% if used.)
Sep 30 20:15:13 np0005463581.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Started System Logging Service.
Sep 30 20:15:13 np0005463581.novalocal rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Sep 30 20:15:13 np0005463581.novalocal rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Reached target Multi-User System.
Sep 30 20:15:13 np0005463581.novalocal sshd-session[1014]: Unable to negotiate with 38.102.83.114 port 55996: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Sep 30 20:15:13 np0005463581.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Sep 30 20:15:14 np0005463581.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Sep 30 20:15:14 np0005463581.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1022]: Unable to negotiate with 38.102.83.114 port 38682: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1024]: Unable to negotiate with 38.102.83.114 port 38690: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Sep 30 20:15:14 np0005463581.novalocal rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1008]: Connection closed by 38.102.83.114 port 55990 [preauth]
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1020]: Connection closed by 38.102.83.114 port 38672 [preauth]
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1028]: Connection closed by 38.102.83.114 port 38700 [preauth]
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1032]: Unable to negotiate with 38.102.83.114 port 38716: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1034]: Unable to negotiate with 38.102.83.114 port 38720: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1036]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 30 Sep 2025 20:15:14 +0000. Up 9.88 seconds.
Sep 30 20:15:14 np0005463581.novalocal sshd-session[1026]: Connection closed by 38.102.83.114 port 38694 [preauth]
Sep 30 20:15:14 np0005463581.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Sep 30 20:15:14 np0005463581.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1040]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 30 Sep 2025 20:15:14 +0000. Up 10.25 seconds.
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1042]: #############################################################
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1043]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1045]: 256 SHA256:ll4u9igOeIt3cm+J4I1ziMEIH7A/B1Uu9E9JBrRVfms root@np0005463581.novalocal (ECDSA)
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1047]: 256 SHA256:GtjV44ajmbAwSSrqkeoEBh/6U/kJnfNa1DE4I9EI65M root@np0005463581.novalocal (ED25519)
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1049]: 3072 SHA256:XZqAREH7515CdDBWb/MGQHjM79RH9PJjkBAMDxebwfs root@np0005463581.novalocal (RSA)
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1050]: -----END SSH HOST KEY FINGERPRINTS-----
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1051]: #############################################################
Sep 30 20:15:14 np0005463581.novalocal cloud-init[1040]: Cloud-init v. 24.4-7.el9 finished at Tue, 30 Sep 2025 20:15:14 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.49 seconds
Sep 30 20:15:14 np0005463581.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Sep 30 20:15:14 np0005463581.novalocal systemd[1]: Reached target Cloud-init target.
Sep 30 20:15:14 np0005463581.novalocal systemd[1]: Startup finished in 1.817s (kernel) + 2.736s (initrd) + 6.006s (userspace) = 10.561s.
Sep 30 20:15:16 np0005463581.novalocal chronyd[797]: Selected source 162.159.200.1 (2.centos.pool.ntp.org)
Sep 30 20:15:16 np0005463581.novalocal chronyd[797]: System clock TAI offset set to 37 seconds
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: Cannot change IRQ 25 affinity: Operation not permitted
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: IRQ 25 affinity is now unmanaged
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: Cannot change IRQ 31 affinity: Operation not permitted
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: IRQ 31 affinity is now unmanaged
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: Cannot change IRQ 28 affinity: Operation not permitted
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: IRQ 28 affinity is now unmanaged
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: Cannot change IRQ 32 affinity: Operation not permitted
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: IRQ 32 affinity is now unmanaged
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: Cannot change IRQ 30 affinity: Operation not permitted
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: IRQ 30 affinity is now unmanaged
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: Cannot change IRQ 29 affinity: Operation not permitted
Sep 30 20:15:21 np0005463581.novalocal irqbalance[788]: IRQ 29 affinity is now unmanaged
Sep 30 20:15:22 np0005463581.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:15:41 np0005463581.novalocal sshd-session[1055]: Accepted publickey for zuul from 38.102.83.114 port 38252 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Sep 30 20:15:41 np0005463581.novalocal systemd[1]: Created slice User Slice of UID 1000.
Sep 30 20:15:41 np0005463581.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Sep 30 20:15:41 np0005463581.novalocal systemd-logind[793]: New session 1 of user zuul.
Sep 30 20:15:41 np0005463581.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Sep 30 20:15:41 np0005463581.novalocal systemd[1]: Starting User Manager for UID 1000...
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Queued start job for default target Main User Target.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Created slice User Application Slice.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Reached target Paths.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Reached target Timers.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Starting D-Bus User Message Bus Socket...
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Starting Create User's Volatile Files and Directories...
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Listening on D-Bus User Message Bus Socket.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Reached target Sockets.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Finished Create User's Volatile Files and Directories.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Reached target Basic System.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Reached target Main User Target.
Sep 30 20:15:41 np0005463581.novalocal systemd[1059]: Startup finished in 140ms.
Sep 30 20:15:41 np0005463581.novalocal systemd[1]: Started User Manager for UID 1000.
Sep 30 20:15:41 np0005463581.novalocal systemd[1]: Started Session 1 of User zuul.
Sep 30 20:15:41 np0005463581.novalocal sshd-session[1055]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:15:42 np0005463581.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:15:42 np0005463581.novalocal python3[1141]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:15:45 np0005463581.novalocal python3[1171]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:15:53 np0005463581.novalocal python3[1229]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:15:54 np0005463581.novalocal python3[1269]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Sep 30 20:15:56 np0005463581.novalocal python3[1295]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzZ/XL27VGgCAJBiqlI6HKkfnRsk1V+FDeDk6XlKIPmzzSPxdfYbN8SC3V4Szi81J+zjdz4In1dc3Hx+bw/GZY+bYflSKUGZLyYdkaicR/Y3qYu3tj4Nt71877eD1Tim1HczV9VrNXM2AjNdeTYpQfp80ysnEqPHeMdwWGq2tFVRcOzH7JYXxSqZDmklwzzrohmXY5x2oEcELxovEfbdEGP0NpGKVEKSW8ITqvpBXJ1dg5OAgwT5JcxE2tGjt91ndS681iepQpWzRNkEDvLhWMSB8YqJIf03rzB9tOSiWnPvuBxvq1mNiHUpIquweTpseBqrt5m5kH371FTgjx086ATpjRmXxmMapRtjOtqIOzMfjuiG0R5W4Y+WFnrvWupXF3THo6Tio85Fxqu6JV/IhfmF0dq9gJSsAeOPPx+Th7fExTwDPRz3zIKNDocBZJAskdrkMYEhr4/6qaKlASm1WPxRQ4aJKGpWZFLWdRVz9MjDwfLt/dYpYQS0a+AeTQ5HU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:15:56 np0005463581.novalocal python3[1319]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:15:57 np0005463581.novalocal python3[1418]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:15:57 np0005463581.novalocal python3[1489]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759263356.7879162-252-221275290449891/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=7f1af54a767e4de98c60b8f1f39de2f1_id_rsa follow=False checksum=50656a9e54d1b7e57b54f7402d0dd79ba3dcaae0 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:15:58 np0005463581.novalocal python3[1612]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:15:58 np0005463581.novalocal python3[1683]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759263357.837888-307-166026059761401/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=7f1af54a767e4de98c60b8f1f39de2f1_id_rsa.pub follow=False checksum=9ff1c1ce825bc16e71c494cc60ffaad2f27c83d2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:00 np0005463581.novalocal python3[1731]: ansible-ping Invoked with data=pong
Sep 30 20:16:01 np0005463581.novalocal python3[1755]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:16:03 np0005463581.novalocal python3[1813]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Sep 30 20:16:04 np0005463581.novalocal python3[1845]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:04 np0005463581.novalocal python3[1869]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:04 np0005463581.novalocal python3[1893]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:05 np0005463581.novalocal python3[1917]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:05 np0005463581.novalocal python3[1941]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:05 np0005463581.novalocal python3[1965]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:07 np0005463581.novalocal sudo[1989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hywwnglbvasqmwbgtkiruowvemxlamrl ; /usr/bin/python3'
Sep 30 20:16:07 np0005463581.novalocal sudo[1989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:07 np0005463581.novalocal python3[1991]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:07 np0005463581.novalocal sudo[1989]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:08 np0005463581.novalocal sudo[2067]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjmtkhiczmrdjslmbvrarfgohnztmmar ; /usr/bin/python3'
Sep 30 20:16:08 np0005463581.novalocal sudo[2067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:08 np0005463581.novalocal python3[2069]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:08 np0005463581.novalocal sudo[2067]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:08 np0005463581.novalocal sudo[2140]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rurlvehzemqtkgysokplxofozdzousrx ; /usr/bin/python3'
Sep 30 20:16:08 np0005463581.novalocal sudo[2140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:08 np0005463581.novalocal python3[2142]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759263367.7886217-32-238072036518612/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:08 np0005463581.novalocal sudo[2140]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:09 np0005463581.novalocal python3[2190]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:09 np0005463581.novalocal python3[2214]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:10 np0005463581.novalocal python3[2238]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:10 np0005463581.novalocal python3[2262]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:10 np0005463581.novalocal python3[2286]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:10 np0005463581.novalocal python3[2310]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:11 np0005463581.novalocal python3[2334]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:11 np0005463581.novalocal python3[2358]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:11 np0005463581.novalocal python3[2382]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:12 np0005463581.novalocal python3[2406]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:12 np0005463581.novalocal python3[2430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:12 np0005463581.novalocal python3[2454]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:12 np0005463581.novalocal python3[2478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:13 np0005463581.novalocal python3[2502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:13 np0005463581.novalocal python3[2526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:13 np0005463581.novalocal python3[2550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:14 np0005463581.novalocal python3[2574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:14 np0005463581.novalocal python3[2598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:14 np0005463581.novalocal python3[2622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463581.novalocal python3[2646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463581.novalocal python3[2670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463581.novalocal python3[2694]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463581.novalocal python3[2718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:16 np0005463581.novalocal python3[2742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:16 np0005463581.novalocal python3[2766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:16 np0005463581.novalocal python3[2790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:18 np0005463581.novalocal sudo[2814]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kscgiiafcdbfrxrnfnnuaagyfmjbcwij ; /usr/bin/python3'
Sep 30 20:16:18 np0005463581.novalocal sudo[2814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:19 np0005463581.novalocal python3[2816]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 20:16:19 np0005463581.novalocal systemd[1]: Starting Time & Date Service...
Sep 30 20:16:19 np0005463581.novalocal systemd[1]: Started Time & Date Service.
Sep 30 20:16:19 np0005463581.novalocal systemd-timedated[2818]: Changed time zone to 'UTC' (UTC).
Sep 30 20:16:19 np0005463581.novalocal sudo[2814]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:20 np0005463581.novalocal sudo[2845]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krujurffdqipzznwfvwtrolfoxextrvw ; /usr/bin/python3'
Sep 30 20:16:20 np0005463581.novalocal sudo[2845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:20 np0005463581.novalocal python3[2847]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:20 np0005463581.novalocal sudo[2845]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:21 np0005463581.novalocal python3[2923]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:21 np0005463581.novalocal python3[2994]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759263380.9460166-252-273151610624065/source _original_basename=tmpcp4hszjy follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:22 np0005463581.novalocal python3[3094]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:22 np0005463581.novalocal python3[3165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759263381.9633543-302-217501416950032/source _original_basename=tmpc827ewel follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:23 np0005463581.novalocal sudo[3265]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptlnyzyxgxwtvafeufpagngyrznkgzku ; /usr/bin/python3'
Sep 30 20:16:23 np0005463581.novalocal sudo[3265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:23 np0005463581.novalocal python3[3267]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:23 np0005463581.novalocal sudo[3265]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:23 np0005463581.novalocal sudo[3338]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkzjscrbavbdqacbgflvodpfvtzereit ; /usr/bin/python3'
Sep 30 20:16:23 np0005463581.novalocal sudo[3338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:24 np0005463581.novalocal python3[3340]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759263383.311852-382-202390585756807/source _original_basename=tmpq8j_cvr8 follow=False checksum=9dc2039529c0f35ddba9b5f747501467f5135778 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:24 np0005463581.novalocal sudo[3338]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:24 np0005463581.novalocal python3[3388]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:16:25 np0005463581.novalocal python3[3414]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:16:25 np0005463581.novalocal sudo[3492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tisvlqrjvxuzkahhzoenzfvxqvhdglii ; /usr/bin/python3'
Sep 30 20:16:25 np0005463581.novalocal sudo[3492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:25 np0005463581.novalocal python3[3494]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:25 np0005463581.novalocal sudo[3492]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:25 np0005463581.novalocal sudo[3565]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obhlqwgnieelmzflurjllrawyarrmiuo ; /usr/bin/python3'
Sep 30 20:16:25 np0005463581.novalocal sudo[3565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:25 np0005463581.novalocal python3[3567]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759263385.196667-453-38303848267101/source _original_basename=tmp1prhbeiz follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:25 np0005463581.novalocal sudo[3565]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:26 np0005463581.novalocal sudo[3616]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lljmfevycwcbnymlobznynhklshghujy ; /usr/bin/python3'
Sep 30 20:16:26 np0005463581.novalocal sudo[3616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:26 np0005463581.novalocal python3[3618]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-48f2-6e73-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:16:26 np0005463581.novalocal sudo[3616]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:27 np0005463581.novalocal python3[3646]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-48f2-6e73-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Sep 30 20:16:28 np0005463581.novalocal python3[3674]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:46 np0005463581.novalocal sudo[3698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqjyhwepetxvcinnopnsjmwbuorytjbh ; /usr/bin/python3'
Sep 30 20:16:46 np0005463581.novalocal sudo[3698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:46 np0005463581.novalocal python3[3700]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:46 np0005463581.novalocal sudo[3698]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:49 np0005463581.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 20:17:03 np0005463581.novalocal sshd-session[3703]: Invalid user default from 80.94.95.116 port 45676
Sep 30 20:17:03 np0005463581.novalocal sshd-session[3703]: Connection closed by invalid user default 80.94.95.116 port 45676 [preauth]
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Sep 30 20:17:46 np0005463581.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Sep 30 20:17:46 np0005463581.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7558] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 20:17:46 np0005463581.novalocal systemd-udevd[3705]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7763] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7792] settings: (eth1): created default wired connection 'Wired connection 1'
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7798] device (eth1): carrier: link connected
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7801] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7809] policy: auto-activating connection 'Wired connection 1' (4847d465-cf78-3ee4-ab52-42c2398330d2)
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7815] device (eth1): Activation: starting connection 'Wired connection 1' (4847d465-cf78-3ee4-ab52-42c2398330d2)
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7817] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7821] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7827] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:17:46 np0005463581.novalocal NetworkManager[858]: <info>  [1759263466.7835] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:46 np0005463581.novalocal systemd[1059]: Starting Mark boot as successful...
Sep 30 20:17:46 np0005463581.novalocal systemd[1059]: Finished Mark boot as successful.
Sep 30 20:17:46 np0005463581.novalocal sshd-session[1068]: Received disconnect from 38.102.83.114 port 38252:11: disconnected by user
Sep 30 20:17:46 np0005463581.novalocal sshd-session[1068]: Disconnected from user zuul 38.102.83.114 port 38252
Sep 30 20:17:46 np0005463581.novalocal sshd-session[1055]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:17:46 np0005463581.novalocal systemd-logind[793]: Session 1 logged out. Waiting for processes to exit.
Sep 30 20:17:47 np0005463581.novalocal sshd-session[3710]: Accepted publickey for zuul from 38.102.83.114 port 37784 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:17:47 np0005463581.novalocal systemd-logind[793]: New session 3 of user zuul.
Sep 30 20:17:47 np0005463581.novalocal systemd[1]: Started Session 3 of User zuul.
Sep 30 20:17:47 np0005463581.novalocal sshd-session[3710]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:17:47 np0005463581.novalocal python3[3737]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-ae5f-fbcf-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:17:54 np0005463581.novalocal sudo[3817]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klbrjjrycjyfqdzfyqqrbegobelqdmni ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:17:54 np0005463581.novalocal sudo[3817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:17:54 np0005463581.novalocal python3[3819]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:17:54 np0005463581.novalocal sudo[3817]: pam_unix(sudo:session): session closed for user root
Sep 30 20:17:55 np0005463581.novalocal sudo[3890]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhqoghgionkoaainxmatdfkrtcbbjjtf ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:17:55 np0005463581.novalocal sudo[3890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:17:55 np0005463581.novalocal python3[3892]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759263474.421239-155-109863207861700/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=53ba91fc42f539cfd5ee8cda2318ff19033a47c6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:17:55 np0005463581.novalocal sudo[3890]: pam_unix(sudo:session): session closed for user root
Sep 30 20:17:55 np0005463581.novalocal sudo[3940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coqwqzixyhqgfvzzasgywvbtgipkfiko ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:17:55 np0005463581.novalocal sudo[3940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:17:55 np0005463581.novalocal sshd-session[3740]: Invalid user 0 from 185.217.1.246 port 22177
Sep 30 20:17:55 np0005463581.novalocal python3[3942]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Stopped Network Manager Wait Online.
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Stopping Network Manager Wait Online...
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Stopping Network Manager...
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8220] caught SIGTERM, shutting down normally.
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8239] dhcp4 (eth0): canceled DHCP transaction
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8239] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8239] dhcp4 (eth0): state changed no lease
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8247] manager: NetworkManager state is now CONNECTING
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8343] dhcp4 (eth1): canceled DHCP transaction
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8343] dhcp4 (eth1): state changed no lease
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[858]: <info>  [1759263475.8404] exiting (success)
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Stopped Network Manager.
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: NetworkManager.service: Consumed 1.478s CPU time, 10.0M memory peak.
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Starting Network Manager...
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263475.9116] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3eba4184-9928-4449-a716-6939f6e53713)
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263475.9119] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 20:17:55 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263475.9187] manager[0x55e9639d8070]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 20:17:55 np0005463581.novalocal systemd[1]: Starting Hostname Service...
Sep 30 20:17:56 np0005463581.novalocal systemd[1]: Started Hostname Service.
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0327] hostname: hostname: using hostnamed
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0328] hostname: static hostname changed from (none) to "np0005463581.novalocal"
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0336] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0344] manager[0x55e9639d8070]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0345] manager[0x55e9639d8070]: rfkill: WWAN hardware radio set enabled
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0391] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0391] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0392] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0394] manager: Networking is enabled by state file
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0397] settings: Loaded settings plugin: keyfile (internal)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0404] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0443] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0458] dhcp: init: Using DHCP client 'internal'
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0463] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0471] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0481] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0494] device (lo): Activation: starting connection 'lo' (e0c59d25-6b1e-4298-8ada-e0f1bea62f04)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0505] device (eth0): carrier: link connected
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0512] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0520] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0521] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0531] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0542] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0553] device (eth1): carrier: link connected
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0559] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0568] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (4847d465-cf78-3ee4-ab52-42c2398330d2) (indicated)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0568] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0578] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0590] device (eth1): Activation: starting connection 'Wired connection 1' (4847d465-cf78-3ee4-ab52-42c2398330d2)
Sep 30 20:17:56 np0005463581.novalocal systemd[1]: Started Network Manager.
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0599] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0605] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0610] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0612] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0616] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0620] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0624] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0628] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0633] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0644] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0647] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0659] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0663] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0686] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0692] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0700] device (lo): Activation: successful, device activated.
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0710] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0720] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0807] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0901] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0905] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0909] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0914] device (eth0): Activation: successful, device activated.
Sep 30 20:17:56 np0005463581.novalocal sudo[3940]: pam_unix(sudo:session): session closed for user root
Sep 30 20:17:56 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263476.0921] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 20:17:56 np0005463581.novalocal python3[4026]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-ae5f-fbcf-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:17:56 np0005463581.novalocal sshd-session[3740]: Disconnecting invalid user 0 185.217.1.246 port 22177: Change of username or service not allowed: (0,ssh-connection) -> (thomas,ssh-connection) [preauth]
Sep 30 20:18:02 np0005463581.novalocal sshd-session[4029]: Invalid user thomas from 185.217.1.246 port 44439
Sep 30 20:18:02 np0005463581.novalocal sshd-session[4029]: Disconnecting invalid user thomas 185.217.1.246 port 44439: Change of username or service not allowed: (thomas,ssh-connection) -> (uniswap,ssh-connection) [preauth]
Sep 30 20:18:04 np0005463581.novalocal sshd-session[4031]: Invalid user uniswap from 185.217.1.246 port 36956
Sep 30 20:18:05 np0005463581.novalocal sshd-session[4031]: Disconnecting invalid user uniswap 185.217.1.246 port 36956: Change of username or service not allowed: (uniswap,ssh-connection) -> (dspace,ssh-connection) [preauth]
Sep 30 20:18:06 np0005463581.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:18:08 np0005463581.novalocal sshd-session[4033]: Invalid user dspace from 185.217.1.246 port 54765
Sep 30 20:18:16 np0005463581.novalocal sshd-session[4033]: error: maximum authentication attempts exceeded for invalid user dspace from 185.217.1.246 port 54765 ssh2 [preauth]
Sep 30 20:18:16 np0005463581.novalocal sshd-session[4033]: Disconnecting invalid user dspace 185.217.1.246 port 54765: Too many authentication failures [preauth]
Sep 30 20:18:18 np0005463581.novalocal sshd-session[4035]: Invalid user dspace from 185.217.1.246 port 15319
Sep 30 20:18:23 np0005463581.novalocal sshd-session[4035]: Disconnecting invalid user dspace 185.217.1.246 port 15319: Change of username or service not allowed: (dspace,ssh-connection) -> (backup,ssh-connection) [preauth]
Sep 30 20:18:26 np0005463581.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:18:27 np0005463581.novalocal sshd-session[4037]: Invalid user backup from 185.217.1.246 port 20872
Sep 30 20:18:29 np0005463581.novalocal sshd-session[4037]: error: maximum authentication attempts exceeded for invalid user backup from 185.217.1.246 port 20872 ssh2 [preauth]
Sep 30 20:18:29 np0005463581.novalocal sshd-session[4037]: Disconnecting invalid user backup 185.217.1.246 port 20872: Too many authentication failures [preauth]
Sep 30 20:18:36 np0005463581.novalocal sshd-session[4041]: Invalid user backup from 185.217.1.246 port 6979
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.2677] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:18:41 np0005463581.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:18:41 np0005463581.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.2947] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.2951] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.2964] device (eth1): Activation: successful, device activated.
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.2973] manager: startup complete
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.2975] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <warn>  [1759263521.2984] device (eth1): Activation: failed for connection 'Wired connection 1'
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.2995] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3164] dhcp4 (eth1): canceled DHCP transaction
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3165] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3165] dhcp4 (eth1): state changed no lease
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3179] policy: auto-activating connection 'ci-private-network' (e5996ec0-9f62-5678-a1f9-72d8777d08c6)
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3183] device (eth1): Activation: starting connection 'ci-private-network' (e5996ec0-9f62-5678-a1f9-72d8777d08c6)
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3184] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3186] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3191] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3197] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3234] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3235] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:18:41 np0005463581.novalocal NetworkManager[3950]: <info>  [1759263521.3238] device (eth1): Activation: successful, device activated.
Sep 30 20:18:41 np0005463581.novalocal sshd-session[4041]: Disconnecting invalid user backup 185.217.1.246 port 6979: Change of username or service not allowed: (backup,ssh-connection) -> (mina,ssh-connection) [preauth]
Sep 30 20:18:44 np0005463581.novalocal sshd-session[4066]: Invalid user mina from 185.217.1.246 port 30300
Sep 30 20:18:44 np0005463581.novalocal sshd-session[4066]: Disconnecting invalid user mina 185.217.1.246 port 30300: Change of username or service not allowed: (mina,ssh-connection) -> (ubuntu,ssh-connection) [preauth]
Sep 30 20:18:47 np0005463581.novalocal sshd-session[4068]: Invalid user ubuntu from 185.217.1.246 port 60409
Sep 30 20:18:49 np0005463581.novalocal sshd-session[4068]: error: maximum authentication attempts exceeded for invalid user ubuntu from 185.217.1.246 port 60409 ssh2 [preauth]
Sep 30 20:18:49 np0005463581.novalocal sshd-session[4068]: Disconnecting invalid user ubuntu 185.217.1.246 port 60409: Too many authentication failures [preauth]
Sep 30 20:18:51 np0005463581.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:18:56 np0005463581.novalocal sshd-session[3713]: Received disconnect from 38.102.83.114 port 37784:11: disconnected by user
Sep 30 20:18:56 np0005463581.novalocal sshd-session[3713]: Disconnected from user zuul 38.102.83.114 port 37784
Sep 30 20:18:56 np0005463581.novalocal sshd-session[3710]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:18:56 np0005463581.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Sep 30 20:18:56 np0005463581.novalocal systemd[1]: session-3.scope: Consumed 1.922s CPU time.
Sep 30 20:18:56 np0005463581.novalocal systemd-logind[793]: Session 3 logged out. Waiting for processes to exit.
Sep 30 20:18:56 np0005463581.novalocal systemd-logind[793]: Removed session 3.
Sep 30 20:19:00 np0005463581.novalocal sshd-session[4070]: Invalid user ubuntu from 185.217.1.246 port 62417
Sep 30 20:19:05 np0005463581.novalocal sshd-session[4070]: Disconnecting invalid user ubuntu 185.217.1.246 port 62417: Change of username or service not allowed: (ubuntu,ssh-connection) -> (ltc,ssh-connection) [preauth]
Sep 30 20:19:15 np0005463581.novalocal sshd-session[4072]: Invalid user ltc from 185.217.1.246 port 42712
Sep 30 20:19:15 np0005463581.novalocal sshd-session[4072]: Disconnecting invalid user ltc 185.217.1.246 port 42712: Change of username or service not allowed: (ltc,ssh-connection) -> (lido,ssh-connection) [preauth]
Sep 30 20:19:21 np0005463581.novalocal sshd-session[4074]: Invalid user lido from 185.217.1.246 port 64823
Sep 30 20:19:23 np0005463581.novalocal sshd-session[4074]: Disconnecting invalid user lido 185.217.1.246 port 64823: Change of username or service not allowed: (lido,ssh-connection) -> (okx,ssh-connection) [preauth]
Sep 30 20:19:25 np0005463581.novalocal sshd-session[4078]: Accepted publickey for zuul from 38.102.83.114 port 55724 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:19:25 np0005463581.novalocal systemd-logind[793]: New session 4 of user zuul.
Sep 30 20:19:25 np0005463581.novalocal systemd[1]: Started Session 4 of User zuul.
Sep 30 20:19:25 np0005463581.novalocal sshd-session[4078]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:19:25 np0005463581.novalocal sshd-session[4076]: Invalid user okx from 185.217.1.246 port 39628
Sep 30 20:19:25 np0005463581.novalocal sudo[4157]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdvfyppmvkngowlmtqypbkdwgonzxfkr ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:19:25 np0005463581.novalocal sudo[4157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:19:25 np0005463581.novalocal python3[4159]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:19:25 np0005463581.novalocal sudo[4157]: pam_unix(sudo:session): session closed for user root
Sep 30 20:19:25 np0005463581.novalocal sudo[4230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmmtkuhjodlesgchjmyiszhzhkuopxdo ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:19:25 np0005463581.novalocal sudo[4230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:19:25 np0005463581.novalocal python3[4232]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759263565.204321-365-164937855792575/source _original_basename=tmpq5yy3ybj follow=False checksum=fce3c23802257d751453fb1742d790946575ef6f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:19:25 np0005463581.novalocal sudo[4230]: pam_unix(sudo:session): session closed for user root
Sep 30 20:19:26 np0005463581.novalocal sshd-session[4076]: Disconnecting invalid user okx 185.217.1.246 port 39628: Change of username or service not allowed: (okx,ssh-connection) -> (james,ssh-connection) [preauth]
Sep 30 20:19:29 np0005463581.novalocal sshd-session[4081]: Connection closed by 38.102.83.114 port 55724
Sep 30 20:19:29 np0005463581.novalocal sshd-session[4078]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:19:29 np0005463581.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Sep 30 20:19:29 np0005463581.novalocal systemd-logind[793]: Session 4 logged out. Waiting for processes to exit.
Sep 30 20:19:29 np0005463581.novalocal systemd-logind[793]: Removed session 4.
Sep 30 20:19:30 np0005463581.novalocal sshd-session[4257]: Invalid user james from 185.217.1.246 port 18087
Sep 30 20:19:30 np0005463581.novalocal sshd-session[4257]: Disconnecting invalid user james 185.217.1.246 port 18087: Change of username or service not allowed: (james,ssh-connection) -> (mine,ssh-connection) [preauth]
Sep 30 20:19:36 np0005463581.novalocal sshd-session[4259]: Invalid user mine from 185.217.1.246 port 51725
Sep 30 20:19:38 np0005463581.novalocal sshd-session[4259]: Disconnecting invalid user mine 185.217.1.246 port 51725: Change of username or service not allowed: (mine,ssh-connection) -> (operator,ssh-connection) [preauth]
Sep 30 20:19:46 np0005463581.novalocal sshd-session[4261]: error: maximum authentication attempts exceeded for operator from 185.217.1.246 port 49168 ssh2 [preauth]
Sep 30 20:19:46 np0005463581.novalocal sshd-session[4261]: Disconnecting authenticating user operator 185.217.1.246 port 49168: Too many authentication failures [preauth]
Sep 30 20:19:50 np0005463581.novalocal sshd-session[4263]: Disconnecting authenticating user operator 185.217.1.246 port 40208: Change of username or service not allowed: (operator,ssh-connection) -> (terraform,ssh-connection) [preauth]
Sep 30 20:19:58 np0005463581.novalocal sshd-session[4265]: Invalid user terraform from 185.217.1.246 port 1155
Sep 30 20:19:58 np0005463581.novalocal sshd-session[4265]: Disconnecting invalid user terraform 185.217.1.246 port 1155: Change of username or service not allowed: (terraform,ssh-connection) -> (usdt,ssh-connection) [preauth]
Sep 30 20:20:04 np0005463581.novalocal sshd-session[4267]: Invalid user usdt from 185.217.1.246 port 17750
Sep 30 20:20:07 np0005463581.novalocal sshd-session[4267]: Disconnecting invalid user usdt 185.217.1.246 port 17750: Change of username or service not allowed: (usdt,ssh-connection) -> (ftp,ssh-connection) [preauth]
Sep 30 20:20:20 np0005463581.novalocal sshd-session[4269]: error: maximum authentication attempts exceeded for ftp from 185.217.1.246 port 28774 ssh2 [preauth]
Sep 30 20:20:20 np0005463581.novalocal sshd-session[4269]: Disconnecting authenticating user ftp 185.217.1.246 port 28774: Too many authentication failures [preauth]
Sep 30 20:20:26 np0005463581.novalocal sshd-session[4271]: Disconnecting authenticating user ftp 185.217.1.246 port 60614: Change of username or service not allowed: (ftp,ssh-connection) -> (joseph,ssh-connection) [preauth]
Sep 30 20:20:30 np0005463581.novalocal sshd-session[4273]: Invalid user joseph from 185.217.1.246 port 40841
Sep 30 20:20:30 np0005463581.novalocal sshd-session[4273]: Disconnecting invalid user joseph 185.217.1.246 port 40841: Change of username or service not allowed: (joseph,ssh-connection) -> (tomcat,ssh-connection) [preauth]
Sep 30 20:20:36 np0005463581.novalocal sshd-session[4275]: Invalid user tomcat from 185.217.1.246 port 19152
Sep 30 20:20:37 np0005463581.novalocal sshd-session[4275]: error: maximum authentication attempts exceeded for invalid user tomcat from 185.217.1.246 port 19152 ssh2 [preauth]
Sep 30 20:20:37 np0005463581.novalocal sshd-session[4275]: Disconnecting invalid user tomcat 185.217.1.246 port 19152: Too many authentication failures [preauth]
Sep 30 20:20:40 np0005463581.novalocal sshd-session[4278]: Invalid user tomcat from 185.217.1.246 port 13168
Sep 30 20:20:42 np0005463581.novalocal sshd-session[4278]: Disconnecting invalid user tomcat 185.217.1.246 port 13168: Change of username or service not allowed: (tomcat,ssh-connection) -> (test,ssh-connection) [preauth]
Sep 30 20:20:48 np0005463581.novalocal sshd-session[4280]: Invalid user test from 185.217.1.246 port 48283
Sep 30 20:20:48 np0005463581.novalocal sshd-session[4280]: Disconnecting invalid user test 185.217.1.246 port 48283: Change of username or service not allowed: (test,ssh-connection) -> (oracle,ssh-connection) [preauth]
Sep 30 20:20:49 np0005463581.novalocal sshd-session[4282]: Invalid user oracle from 185.217.1.246 port 25663
Sep 30 20:20:52 np0005463581.novalocal sshd-session[4282]: Disconnecting invalid user oracle 185.217.1.246 port 25663: Change of username or service not allowed: (oracle,ssh-connection) -> (git,ssh-connection) [preauth]
Sep 30 20:20:57 np0005463581.novalocal sshd-session[4284]: Invalid user git from 185.217.1.246 port 49192
Sep 30 20:21:00 np0005463581.novalocal sshd-session[4284]: error: maximum authentication attempts exceeded for invalid user git from 185.217.1.246 port 49192 ssh2 [preauth]
Sep 30 20:21:00 np0005463581.novalocal sshd-session[4284]: Disconnecting invalid user git 185.217.1.246 port 49192: Too many authentication failures [preauth]
Sep 30 20:21:06 np0005463581.novalocal sshd-session[4286]: Invalid user git from 185.217.1.246 port 63869
Sep 30 20:21:06 np0005463581.novalocal sshd-session[4286]: Disconnecting invalid user git 185.217.1.246 port 63869: Change of username or service not allowed: (git,ssh-connection) -> (validate,ssh-connection) [preauth]
Sep 30 20:21:10 np0005463581.novalocal sshd-session[4288]: Invalid user validate from 185.217.1.246 port 41745
Sep 30 20:21:10 np0005463581.novalocal sshd-session[4288]: Disconnecting invalid user validate 185.217.1.246 port 41745: Change of username or service not allowed: (validate,ssh-connection) -> (monitor,ssh-connection) [preauth]
Sep 30 20:21:18 np0005463581.novalocal sshd-session[4290]: Invalid user monitor from 185.217.1.246 port 18704
Sep 30 20:21:20 np0005463581.novalocal sshd-session[4290]: Disconnecting invalid user monitor 185.217.1.246 port 18704: Change of username or service not allowed: (monitor,ssh-connection) -> (jenkins,ssh-connection) [preauth]
Sep 30 20:21:28 np0005463581.novalocal sshd-session[4292]: Invalid user jenkins from 185.217.1.246 port 30700
Sep 30 20:21:33 np0005463581.novalocal sshd-session[4292]: error: maximum authentication attempts exceeded for invalid user jenkins from 185.217.1.246 port 30700 ssh2 [preauth]
Sep 30 20:21:33 np0005463581.novalocal sshd-session[4292]: Disconnecting invalid user jenkins 185.217.1.246 port 30700: Too many authentication failures [preauth]
Sep 30 20:21:38 np0005463581.novalocal systemd[1059]: Created slice User Background Tasks Slice.
Sep 30 20:21:38 np0005463581.novalocal systemd[1059]: Starting Cleanup of User's Temporary Files and Directories...
Sep 30 20:21:38 np0005463581.novalocal systemd[1059]: Finished Cleanup of User's Temporary Files and Directories.
Sep 30 20:21:39 np0005463581.novalocal sshd-session[4294]: Invalid user jenkins from 185.217.1.246 port 1798
Sep 30 20:21:40 np0005463581.novalocal sshd-session[4294]: Disconnecting invalid user jenkins 185.217.1.246 port 1798: Change of username or service not allowed: (jenkins,ssh-connection) -> (ada,ssh-connection) [preauth]
Sep 30 20:21:42 np0005463581.novalocal sshd-session[4298]: Invalid user ada from 185.217.1.246 port 5540
Sep 30 20:21:43 np0005463581.novalocal sshd-session[4298]: Disconnecting invalid user ada 185.217.1.246 port 5540: Change of username or service not allowed: (ada,ssh-connection) -> (fa,ssh-connection) [preauth]
Sep 30 20:21:51 np0005463581.novalocal sshd-session[4301]: Invalid user fa from 185.217.1.246 port 28257
Sep 30 20:21:53 np0005463581.novalocal sshd-session[4301]: Disconnecting invalid user fa 185.217.1.246 port 28257: Change of username or service not allowed: (fa,ssh-connection) -> (pi,ssh-connection) [preauth]
Sep 30 20:21:58 np0005463581.novalocal sshd-session[4304]: Invalid user pi from 185.217.1.246 port 36004
Sep 30 20:21:59 np0005463581.novalocal sshd-session[4304]: Disconnecting invalid user pi 185.217.1.246 port 36004: Change of username or service not allowed: (pi,ssh-connection) -> (admin,ssh-connection) [preauth]
Sep 30 20:22:01 np0005463581.novalocal sshd-session[4306]: Invalid user admin from 185.217.1.246 port 18584
Sep 30 20:22:04 np0005463581.novalocal sshd-session[4306]: error: maximum authentication attempts exceeded for invalid user admin from 185.217.1.246 port 18584 ssh2 [preauth]
Sep 30 20:22:04 np0005463581.novalocal sshd-session[4306]: Disconnecting invalid user admin 185.217.1.246 port 18584: Too many authentication failures [preauth]
Sep 30 20:22:07 np0005463581.novalocal sshd-session[4308]: Invalid user admin from 185.217.1.246 port 1100
Sep 30 20:22:09 np0005463581.novalocal sshd-session[4308]: error: maximum authentication attempts exceeded for invalid user admin from 185.217.1.246 port 1100 ssh2 [preauth]
Sep 30 20:22:09 np0005463581.novalocal sshd-session[4308]: Disconnecting invalid user admin 185.217.1.246 port 1100: Too many authentication failures [preauth]
Sep 30 20:22:13 np0005463581.novalocal sshd-session[4310]: Invalid user admin from 185.217.1.246 port 44386
Sep 30 20:22:13 np0005463581.novalocal sshd-session[4310]: Disconnecting invalid user admin 185.217.1.246 port 44386: Change of username or service not allowed: (admin,ssh-connection) -> (authority,ssh-connection) [preauth]
Sep 30 20:22:24 np0005463581.novalocal sshd-session[4312]: Invalid user authority from 185.217.1.246 port 33142
Sep 30 20:22:24 np0005463581.novalocal sshd-session[4312]: Disconnecting invalid user authority 185.217.1.246 port 33142: Change of username or service not allowed: (authority,ssh-connection) -> (partimag,ssh-connection) [preauth]
Sep 30 20:22:30 np0005463581.novalocal sshd-session[4315]: Invalid user partimag from 185.217.1.246 port 54612
Sep 30 20:22:30 np0005463581.novalocal sshd-session[4315]: Disconnecting invalid user partimag 185.217.1.246 port 54612: Change of username or service not allowed: (partimag,ssh-connection) -> (michael,ssh-connection) [preauth]
Sep 30 20:22:34 np0005463581.novalocal sshd-session[4317]: Invalid user michael from 185.217.1.246 port 41185
Sep 30 20:22:34 np0005463581.novalocal sshd-session[4317]: Disconnecting invalid user michael 185.217.1.246 port 41185: Change of username or service not allowed: (michael,ssh-connection) -> (user12,ssh-connection) [preauth]
Sep 30 20:22:38 np0005463581.novalocal sshd-session[4319]: Invalid user user12 from 185.217.1.246 port 52156
Sep 30 20:22:39 np0005463581.novalocal sshd-session[4319]: Disconnecting invalid user user12 185.217.1.246 port 52156: Change of username or service not allowed: (user12,ssh-connection) -> (user15,ssh-connection) [preauth]
Sep 30 20:22:48 np0005463581.novalocal sshd-session[4321]: Invalid user user15 from 185.217.1.246 port 44302
Sep 30 20:22:51 np0005463581.novalocal sshd-session[4321]: Disconnecting invalid user user15 185.217.1.246 port 44302: Change of username or service not allowed: (user15,ssh-connection) -> (kraken,ssh-connection) [preauth]
Sep 30 20:22:58 np0005463581.novalocal sshd-session[4323]: Invalid user kraken from 185.217.1.246 port 20392
Sep 30 20:22:59 np0005463581.novalocal sshd-session[4323]: Disconnecting invalid user kraken 185.217.1.246 port 20392: Change of username or service not allowed: (kraken,ssh-connection) -> (USER1,ssh-connection) [preauth]
Sep 30 20:23:04 np0005463581.novalocal sshd-session[4325]: Invalid user USER1 from 185.217.1.246 port 20088
Sep 30 20:23:05 np0005463581.novalocal sshd-session[4325]: Disconnecting invalid user USER1 185.217.1.246 port 20088: Change of username or service not allowed: (USER1,ssh-connection) -> (user20,ssh-connection) [preauth]
Sep 30 20:23:11 np0005463581.novalocal sshd-session[4327]: Invalid user user20 from 185.217.1.246 port 54460
Sep 30 20:23:11 np0005463581.novalocal sshd-session[4327]: Disconnecting invalid user user20 185.217.1.246 port 54460: Change of username or service not allowed: (user20,ssh-connection) -> (delegate,ssh-connection) [preauth]
Sep 30 20:23:20 np0005463581.novalocal sshd-session[4329]: Invalid user delegate from 185.217.1.246 port 55639
Sep 30 20:23:20 np0005463581.novalocal sshd-session[4329]: Disconnecting invalid user delegate 185.217.1.246 port 55639: Change of username or service not allowed: (delegate,ssh-connection) -> (doge,ssh-connection) [preauth]
Sep 30 20:23:24 np0005463581.novalocal sshd-session[4331]: Invalid user doge from 185.217.1.246 port 58125
Sep 30 20:23:24 np0005463581.novalocal sshd-session[4331]: Disconnecting invalid user doge 185.217.1.246 port 58125: Change of username or service not allowed: (doge,ssh-connection) -> (user-1,ssh-connection) [preauth]
Sep 30 20:23:26 np0005463581.novalocal sshd-session[4333]: Invalid user user-1 from 185.217.1.246 port 20820
Sep 30 20:23:27 np0005463581.novalocal sshd-session[4333]: Disconnecting invalid user user-1 185.217.1.246 port 20820: Change of username or service not allowed: (user-1,ssh-connection) -> (miner,ssh-connection) [preauth]
Sep 30 20:23:31 np0005463581.novalocal sshd-session[4335]: Invalid user miner from 185.217.1.246 port 52501
Sep 30 20:23:31 np0005463581.novalocal sshd-session[4335]: Disconnecting invalid user miner 185.217.1.246 port 52501: Change of username or service not allowed: (miner,ssh-connection) -> (robert,ssh-connection) [preauth]
Sep 30 20:23:35 np0005463581.novalocal sshd-session[4337]: Invalid user robert from 185.217.1.246 port 13889
Sep 30 20:23:36 np0005463581.novalocal sshd-session[4337]: Disconnecting invalid user robert 185.217.1.246 port 13889: Change of username or service not allowed: (robert,ssh-connection) -> (teamspeak,ssh-connection) [preauth]
Sep 30 20:23:41 np0005463581.novalocal sshd-session[4339]: Invalid user teamspeak from 185.217.1.246 port 56208
Sep 30 20:23:42 np0005463581.novalocal sshd-session[4339]: Disconnecting invalid user teamspeak 185.217.1.246 port 56208: Change of username or service not allowed: (teamspeak,ssh-connection) -> (debian,ssh-connection) [preauth]
Sep 30 20:23:47 np0005463581.novalocal sshd-session[4341]: Invalid user debian from 185.217.1.246 port 50683
Sep 30 20:23:52 np0005463581.novalocal sshd-session[4341]: Disconnecting invalid user debian 185.217.1.246 port 50683: Change of username or service not allowed: (debian,ssh-connection) -> (user1,ssh-connection) [preauth]
Sep 30 20:23:55 np0005463581.novalocal sshd[1006]: Timeout before authentication for connection from 192.140.173.109 to 38.102.83.50, pid = 4303
Sep 30 20:23:56 np0005463581.novalocal sshd-session[4343]: Invalid user user1 from 185.217.1.246 port 8395
Sep 30 20:24:01 np0005463581.novalocal sshd-session[4343]: error: maximum authentication attempts exceeded for invalid user user1 from 185.217.1.246 port 8395 ssh2 [preauth]
Sep 30 20:24:01 np0005463581.novalocal sshd-session[4343]: Disconnecting invalid user user1 185.217.1.246 port 8395: Too many authentication failures [preauth]
Sep 30 20:24:05 np0005463581.novalocal sshd-session[4345]: Invalid user user1 from 185.217.1.246 port 11008
Sep 30 20:24:07 np0005463581.novalocal sshd-session[4345]: error: maximum authentication attempts exceeded for invalid user user1 from 185.217.1.246 port 11008 ssh2 [preauth]
Sep 30 20:24:07 np0005463581.novalocal sshd-session[4345]: Disconnecting invalid user user1 185.217.1.246 port 11008: Too many authentication failures [preauth]
Sep 30 20:24:10 np0005463581.novalocal sshd-session[4347]: Invalid user user1 from 185.217.1.246 port 52815
Sep 30 20:24:14 np0005463581.novalocal sshd-session[4347]: error: maximum authentication attempts exceeded for invalid user user1 from 185.217.1.246 port 52815 ssh2 [preauth]
Sep 30 20:24:14 np0005463581.novalocal sshd-session[4347]: Disconnecting invalid user user1 185.217.1.246 port 52815: Too many authentication failures [preauth]
Sep 30 20:24:16 np0005463581.novalocal sshd-session[4349]: Invalid user user1 from 185.217.1.246 port 52930
Sep 30 20:24:22 np0005463581.novalocal sshd-session[4349]: Disconnecting invalid user user1 185.217.1.246 port 52930: Change of username or service not allowed: (user1,ssh-connection) -> (stake,ssh-connection) [preauth]
Sep 30 20:24:26 np0005463581.novalocal sshd-session[4351]: Invalid user stake from 185.217.1.246 port 3625
Sep 30 20:24:26 np0005463581.novalocal sshd-session[4351]: Disconnecting invalid user stake 185.217.1.246 port 3625: Change of username or service not allowed: (stake,ssh-connection) -> (binance,ssh-connection) [preauth]
Sep 30 20:24:28 np0005463581.novalocal sshd-session[4353]: Invalid user binance from 185.217.1.246 port 20108
Sep 30 20:24:28 np0005463581.novalocal sshd-session[4353]: Disconnecting invalid user binance 185.217.1.246 port 20108: Change of username or service not allowed: (binance,ssh-connection) -> (usdc,ssh-connection) [preauth]
Sep 30 20:24:35 np0005463581.novalocal sshd-session[4355]: Invalid user usdc from 185.217.1.246 port 2142
Sep 30 20:24:35 np0005463581.novalocal sshd-session[4355]: Disconnecting invalid user usdc 185.217.1.246 port 2142: Change of username or service not allowed: (usdc,ssh-connection) -> (ether,ssh-connection) [preauth]
Sep 30 20:24:38 np0005463581.novalocal sshd-session[4357]: Invalid user ether from 185.217.1.246 port 29297
Sep 30 20:24:43 np0005463581.novalocal sshd-session[4357]: Disconnecting invalid user ether 185.217.1.246 port 29297: Change of username or service not allowed: (ether,ssh-connection) -> (user14,ssh-connection) [preauth]
Sep 30 20:24:51 np0005463581.novalocal sshd-session[4359]: Invalid user user14 from 185.217.1.246 port 55344
Sep 30 20:24:52 np0005463581.novalocal sshd-session[4359]: Disconnecting invalid user user14 185.217.1.246 port 55344: Change of username or service not allowed: (user14,ssh-connection) -> (user_1,ssh-connection) [preauth]
Sep 30 20:24:58 np0005463581.novalocal sshd-session[4361]: Invalid user user_1 from 185.217.1.246 port 50719
Sep 30 20:24:59 np0005463581.novalocal sshd-session[4361]: Disconnecting invalid user user_1 185.217.1.246 port 50719: Change of username or service not allowed: (user_1,ssh-connection) -> (useradmin,ssh-connection) [preauth]
Sep 30 20:25:05 np0005463581.novalocal sshd-session[4363]: Invalid user useradmin from 185.217.1.246 port 57550
Sep 30 20:25:06 np0005463581.novalocal sshd-session[4363]: Disconnecting invalid user useradmin 185.217.1.246 port 57550: Change of username or service not allowed: (useradmin,ssh-connection) -> (polkadot,ssh-connection) [preauth]
Sep 30 20:25:11 np0005463581.novalocal sshd-session[4365]: Invalid user polkadot from 185.217.1.246 port 38059
Sep 30 20:25:12 np0005463581.novalocal sshd-session[4365]: Disconnecting invalid user polkadot 185.217.1.246 port 38059: Change of username or service not allowed: (polkadot,ssh-connection) -> (trx,ssh-connection) [preauth]
Sep 30 20:25:15 np0005463581.novalocal sshd-session[4370]: Accepted publickey for zuul from 38.102.83.114 port 42368 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:25:15 np0005463581.novalocal systemd-logind[793]: New session 5 of user zuul.
Sep 30 20:25:15 np0005463581.novalocal systemd[1]: Started Session 5 of User zuul.
Sep 30 20:25:15 np0005463581.novalocal sshd-session[4370]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:25:15 np0005463581.novalocal sudo[4397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sharmjhksqycwkxwguyjuqurxcfsvrxz ; /usr/bin/python3'
Sep 30 20:25:15 np0005463581.novalocal sudo[4397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:15 np0005463581.novalocal sshd-session[4367]: Invalid user trx from 185.217.1.246 port 17801
Sep 30 20:25:15 np0005463581.novalocal python3[4399]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e3b-3c83-e819-e42a-000000000ca0-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:15 np0005463581.novalocal sudo[4397]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:15 np0005463581.novalocal sshd-session[4367]: Disconnecting invalid user trx 185.217.1.246 port 17801: Change of username or service not allowed: (trx,ssh-connection) -> (user19,ssh-connection) [preauth]
Sep 30 20:25:16 np0005463581.novalocal sudo[4427]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuwntelmujcwrfyqkuuwltwsbfticgxa ; /usr/bin/python3'
Sep 30 20:25:16 np0005463581.novalocal sudo[4427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:16 np0005463581.novalocal python3[4429]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:16 np0005463581.novalocal sudo[4427]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:16 np0005463581.novalocal sudo[4454]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyluxqnpljvtteoiryeiabjdbylspzyo ; /usr/bin/python3'
Sep 30 20:25:16 np0005463581.novalocal sudo[4454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:16 np0005463581.novalocal python3[4456]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:16 np0005463581.novalocal sudo[4454]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:16 np0005463581.novalocal sudo[4480]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqanqkygvvzmltemwaygbwvsgwebgjx ; /usr/bin/python3'
Sep 30 20:25:16 np0005463581.novalocal sudo[4480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:17 np0005463581.novalocal python3[4482]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:17 np0005463581.novalocal sudo[4480]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:17 np0005463581.novalocal sudo[4506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pazntunuygqmczlkpbpqpjkwjvborsbq ; /usr/bin/python3'
Sep 30 20:25:17 np0005463581.novalocal sudo[4506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:17 np0005463581.novalocal sshd-session[4409]: Invalid user user19 from 185.217.1.246 port 41842
Sep 30 20:25:17 np0005463581.novalocal python3[4508]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:17 np0005463581.novalocal sudo[4506]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:17 np0005463581.novalocal sshd-session[4409]: Disconnecting invalid user user19 185.217.1.246 port 41842: Change of username or service not allowed: (user19,ssh-connection) -> (gwei,ssh-connection) [preauth]
Sep 30 20:25:17 np0005463581.novalocal sudo[4532]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slyyeiywqzdtxpgxbaagqixdxhebdndq ; /usr/bin/python3'
Sep 30 20:25:17 np0005463581.novalocal sudo[4532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:17 np0005463581.novalocal python3[4534]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:17 np0005463581.novalocal python3[4534]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Sep 30 20:25:17 np0005463581.novalocal sudo[4532]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:18 np0005463581.novalocal sudo[4558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnkcsmqhquekadigodkakfqdhhvijbcn ; /usr/bin/python3'
Sep 30 20:25:18 np0005463581.novalocal sudo[4558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:18 np0005463581.novalocal python3[4560]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 20:25:18 np0005463581.novalocal systemd[1]: Reloading.
Sep 30 20:25:18 np0005463581.novalocal systemd-rc-local-generator[4582]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:25:18 np0005463581.novalocal sudo[4558]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:20 np0005463581.novalocal sudo[4616]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inaqrhvmsgamnywaxgeddyxecudsbtfu ; /usr/bin/python3'
Sep 30 20:25:20 np0005463581.novalocal sudo[4616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:20 np0005463581.novalocal python3[4618]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Sep 30 20:25:20 np0005463581.novalocal sudo[4616]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:20 np0005463581.novalocal sudo[4642]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdzfbxfhksyeamfqydlhdwuyadmppaew ; /usr/bin/python3'
Sep 30 20:25:20 np0005463581.novalocal sudo[4642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:20 np0005463581.novalocal python3[4644]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:20 np0005463581.novalocal sudo[4642]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:21 np0005463581.novalocal sudo[4670]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqwyrigptwbfvhbgjivuekkicmdzuucq ; /usr/bin/python3'
Sep 30 20:25:21 np0005463581.novalocal sudo[4670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:21 np0005463581.novalocal python3[4672]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:21 np0005463581.novalocal sudo[4670]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:21 np0005463581.novalocal sudo[4698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clsnfyaipkzpgpxgfinfrgigblsoukkw ; /usr/bin/python3'
Sep 30 20:25:21 np0005463581.novalocal sudo[4698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:21 np0005463581.novalocal python3[4700]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:21 np0005463581.novalocal sudo[4698]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:21 np0005463581.novalocal sudo[4726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgsykupwoqackgicsxkyaqtwwxwkqike ; /usr/bin/python3'
Sep 30 20:25:21 np0005463581.novalocal sudo[4726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:21 np0005463581.novalocal python3[4728]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:21 np0005463581.novalocal sudo[4726]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:22 np0005463581.novalocal python3[4755]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-e819-e42a-000000000ca6-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:22 np0005463581.novalocal python3[4785]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:25:24 np0005463581.novalocal sshd-session[4561]: Invalid user gwei from 185.217.1.246 port 58938
Sep 30 20:25:25 np0005463581.novalocal sshd-session[4373]: Connection closed by 38.102.83.114 port 42368
Sep 30 20:25:25 np0005463581.novalocal sshd-session[4370]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:25:25 np0005463581.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Sep 30 20:25:25 np0005463581.novalocal systemd[1]: session-5.scope: Consumed 3.999s CPU time.
Sep 30 20:25:25 np0005463581.novalocal systemd-logind[793]: Session 5 logged out. Waiting for processes to exit.
Sep 30 20:25:25 np0005463581.novalocal systemd-logind[793]: Removed session 5.
Sep 30 20:25:26 np0005463581.novalocal sshd-session[4561]: Disconnecting invalid user gwei 185.217.1.246 port 58938: Change of username or service not allowed: (gwei,ssh-connection) -> (hpc-riscv,ssh-connection) [preauth]
Sep 30 20:25:27 np0005463581.novalocal sshd-session[4794]: Accepted publickey for zuul from 38.102.83.114 port 56748 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:25:27 np0005463581.novalocal systemd-logind[793]: New session 6 of user zuul.
Sep 30 20:25:27 np0005463581.novalocal systemd[1]: Started Session 6 of User zuul.
Sep 30 20:25:27 np0005463581.novalocal sshd-session[4794]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:25:27 np0005463581.novalocal sudo[4821]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzkqsfrcpcgneiadxnomexkrgasitltf ; /usr/bin/python3'
Sep 30 20:25:27 np0005463581.novalocal sudo[4821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:27 np0005463581.novalocal python3[4823]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Sep 30 20:25:28 np0005463581.novalocal sshd-session[4792]: Invalid user hpc-riscv from 185.217.1.246 port 62882
Sep 30 20:25:28 np0005463581.novalocal sshd-session[4792]: Disconnecting invalid user hpc-riscv 185.217.1.246 port 62882: Change of username or service not allowed: (hpc-riscv,ssh-connection) -> (sync,ssh-connection) [preauth]
Sep 30 20:25:34 np0005463581.novalocal sshd-session[4833]: Disconnecting authenticating user sync 185.217.1.246 port 23660: Change of username or service not allowed: (sync,ssh-connection) -> (cirros,ssh-connection) [preauth]
Sep 30 20:25:38 np0005463581.novalocal sshd-session[4865]: Invalid user cirros from 185.217.1.246 port 13047
Sep 30 20:25:39 np0005463581.novalocal sshd-session[4865]: Disconnecting invalid user cirros 185.217.1.246 port 13047: Change of username or service not allowed: (cirros,ssh-connection) -> (volumio,ssh-connection) [preauth]
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  Converting 363 SID table entries...
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:25:44 np0005463581.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:25:45 np0005463581.novalocal sshd-session[4867]: Invalid user volumio from 185.217.1.246 port 36876
Sep 30 20:25:45 np0005463581.novalocal sshd-session[4867]: Disconnecting invalid user volumio 185.217.1.246 port 36876: Change of username or service not allowed: (volumio,ssh-connection) -> (kali,ssh-connection) [preauth]
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  Converting 363 SID table entries...
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:25:54 np0005463581.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:25:57 np0005463581.novalocal sshd-session[4879]: Invalid user kali from 185.217.1.246 port 57929
Sep 30 20:25:57 np0005463581.novalocal sshd-session[4879]: Disconnecting invalid user kali 185.217.1.246 port 57929: Change of username or service not allowed: (kali,ssh-connection) -> (root,ssh-connection) [preauth]
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  Converting 363 SID table entries...
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:26:04 np0005463581.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:26:06 np0005463581.novalocal setsebool[4898]: The virt_use_nfs policy boolean was changed to 1 by root
Sep 30 20:26:06 np0005463581.novalocal setsebool[4898]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Sep 30 20:26:07 np0005463581.novalocal sshd-session[4888]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 62716 ssh2 [preauth]
Sep 30 20:26:07 np0005463581.novalocal sshd-session[4888]: Disconnecting authenticating user root 185.217.1.246 port 62716: Too many authentication failures [preauth]
Sep 30 20:26:14 np0005463581.novalocal sshd-session[4906]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14338 ssh2 [preauth]
Sep 30 20:26:14 np0005463581.novalocal sshd-session[4906]: Disconnecting authenticating user root 185.217.1.246 port 14338: Too many authentication failures [preauth]
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  Converting 366 SID table entries...
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:26:18 np0005463581.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:26:23 np0005463581.novalocal sshd-session[4910]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 2134 ssh2 [preauth]
Sep 30 20:26:23 np0005463581.novalocal sshd-session[4910]: Disconnecting authenticating user root 185.217.1.246 port 2134: Too many authentication failures [preauth]
Sep 30 20:26:29 np0005463581.novalocal sshd-session[5617]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 30546 ssh2 [preauth]
Sep 30 20:26:29 np0005463581.novalocal sshd-session[5617]: Disconnecting authenticating user root 185.217.1.246 port 30546: Too many authentication failures [preauth]
Sep 30 20:26:36 np0005463581.novalocal dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 20:26:36 np0005463581.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:26:36 np0005463581.novalocal systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:26:36 np0005463581.novalocal systemd[1]: Reloading.
Sep 30 20:26:36 np0005463581.novalocal systemd-rc-local-generator[5663]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:26:36 np0005463581.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:26:37 np0005463581.novalocal systemd[1]: Starting PackageKit Daemon...
Sep 30 20:26:37 np0005463581.novalocal PackageKit[6441]: daemon start
Sep 30 20:26:37 np0005463581.novalocal systemd[1]: Starting Authorization Manager...
Sep 30 20:26:37 np0005463581.novalocal polkitd[6517]: Started polkitd version 0.117
Sep 30 20:26:37 np0005463581.novalocal polkitd[6517]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 20:26:37 np0005463581.novalocal polkitd[6517]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 20:26:37 np0005463581.novalocal polkitd[6517]: Finished loading, compiling and executing 3 rules
Sep 30 20:26:37 np0005463581.novalocal systemd[1]: Started Authorization Manager.
Sep 30 20:26:37 np0005463581.novalocal polkitd[6517]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Sep 30 20:26:37 np0005463581.novalocal systemd[1]: Started PackageKit Daemon.
Sep 30 20:26:38 np0005463581.novalocal sudo[4821]: pam_unix(sudo:session): session closed for user root
Sep 30 20:26:39 np0005463581.novalocal sshd-session[5619]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 9211 ssh2 [preauth]
Sep 30 20:26:39 np0005463581.novalocal sshd-session[5619]: Disconnecting authenticating user root 185.217.1.246 port 9211: Too many authentication failures [preauth]
Sep 30 20:26:42 np0005463581.novalocal sshd-session[8023]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14644 ssh2 [preauth]
Sep 30 20:26:42 np0005463581.novalocal sshd-session[8023]: Disconnecting authenticating user root 185.217.1.246 port 14644: Too many authentication failures [preauth]
Sep 30 20:26:50 np0005463581.novalocal sshd-session[11469]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 12089 ssh2 [preauth]
Sep 30 20:26:50 np0005463581.novalocal sshd-session[11469]: Disconnecting authenticating user root 185.217.1.246 port 12089: Too many authentication failures [preauth]
Sep 30 20:26:58 np0005463581.novalocal sshd-session[12795]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 42749 ssh2 [preauth]
Sep 30 20:26:58 np0005463581.novalocal sshd-session[12795]: Disconnecting authenticating user root 185.217.1.246 port 42749: Too many authentication failures [preauth]
Sep 30 20:27:01 np0005463581.novalocal irqbalance[788]: Cannot change IRQ 27 affinity: Operation not permitted
Sep 30 20:27:01 np0005463581.novalocal irqbalance[788]: IRQ 27 affinity is now unmanaged
Sep 30 20:27:07 np0005463581.novalocal python3[16576]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e4-86ab-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:27:08 np0005463581.novalocal sshd-session[14818]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 54631 ssh2 [preauth]
Sep 30 20:27:08 np0005463581.novalocal sshd-session[14818]: Disconnecting authenticating user root 185.217.1.246 port 54631: Too many authentication failures [preauth]
Sep 30 20:27:08 np0005463581.novalocal kernel: evm: overlay not supported
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: Starting D-Bus User Message Bus...
Sep 30 20:27:09 np0005463581.novalocal dbus-broker-launch[16983]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Sep 30 20:27:09 np0005463581.novalocal dbus-broker-launch[16983]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: Started D-Bus User Message Bus.
Sep 30 20:27:09 np0005463581.novalocal dbus-broker-lau[16983]: Ready
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: Created slice Slice /user.
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: podman-16869.scope: unit configures an IP firewall, but not running as root.
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: (This warning is only shown for the first unit using IP firewalling.)
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: Started podman-16869.scope.
Sep 30 20:27:09 np0005463581.novalocal systemd[1059]: Started podman-pause-66e2949a.scope.
Sep 30 20:27:10 np0005463581.novalocal sudo[17482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnugmuhxhngcgvimcfhvatstrkseobom ; /usr/bin/python3'
Sep 30 20:27:10 np0005463581.novalocal sudo[17482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:10 np0005463581.novalocal python3[17494]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.98:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.98:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:27:11 np0005463581.novalocal sudo[17482]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:11 np0005463581.novalocal sshd-session[4797]: Connection closed by 38.102.83.114 port 56748
Sep 30 20:27:11 np0005463581.novalocal sshd-session[4794]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:27:11 np0005463581.novalocal systemd-logind[793]: Session 6 logged out. Waiting for processes to exit.
Sep 30 20:27:11 np0005463581.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Sep 30 20:27:11 np0005463581.novalocal systemd[1]: session-6.scope: Consumed 1min 6.069s CPU time.
Sep 30 20:27:11 np0005463581.novalocal systemd-logind[793]: Removed session 6.
Sep 30 20:27:21 np0005463581.novalocal sshd-session[17070]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 8598 ssh2 [preauth]
Sep 30 20:27:21 np0005463581.novalocal sshd-session[17070]: Disconnecting authenticating user root 185.217.1.246 port 8598: Too many authentication failures [preauth]
Sep 30 20:27:30 np0005463581.novalocal sshd-session[20540]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 38759 ssh2 [preauth]
Sep 30 20:27:30 np0005463581.novalocal sshd-session[20540]: Disconnecting authenticating user root 185.217.1.246 port 38759: Too many authentication failures [preauth]
Sep 30 20:27:33 np0005463581.novalocal sshd-session[23732]: Connection closed by 38.102.83.65 port 54912 [preauth]
Sep 30 20:27:33 np0005463581.novalocal sshd-session[23734]: Connection closed by 38.102.83.65 port 54922 [preauth]
Sep 30 20:27:33 np0005463581.novalocal sshd-session[23737]: Unable to negotiate with 38.102.83.65 port 54950: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 20:27:33 np0005463581.novalocal sshd-session[23735]: Unable to negotiate with 38.102.83.65 port 54964: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 20:27:33 np0005463581.novalocal sshd-session[23738]: Unable to negotiate with 38.102.83.65 port 54934: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 20:27:38 np0005463581.novalocal sshd-session[24999]: Accepted publickey for zuul from 38.102.83.114 port 47788 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:27:38 np0005463581.novalocal systemd-logind[793]: New session 7 of user zuul.
Sep 30 20:27:38 np0005463581.novalocal systemd[1]: Started Session 7 of User zuul.
Sep 30 20:27:38 np0005463581.novalocal sshd-session[24999]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:27:39 np0005463581.novalocal python3[25082]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKP9FKOWW9m598mLycZPRqlsVgMJVsU+fwfdzM9eBK+TwOcguPrf/EtkmgDfbYrAdUCisCCNyF1sODVm7Os50jE= zuul@np0005463579.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:27:39 np0005463581.novalocal sudo[25215]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmsrlzqehkvefsikzteahcgvjvskvksm ; /usr/bin/python3'
Sep 30 20:27:39 np0005463581.novalocal sudo[25215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:39 np0005463581.novalocal python3[25221]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKP9FKOWW9m598mLycZPRqlsVgMJVsU+fwfdzM9eBK+TwOcguPrf/EtkmgDfbYrAdUCisCCNyF1sODVm7Os50jE= zuul@np0005463579.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:27:39 np0005463581.novalocal sudo[25215]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:40 np0005463581.novalocal sudo[25461]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqopvbyhtnasbizqbmmclutrkajakiph ; /usr/bin/python3'
Sep 30 20:27:40 np0005463581.novalocal sudo[25461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:40 np0005463581.novalocal sshd-session[23430]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 7764 ssh2 [preauth]
Sep 30 20:27:40 np0005463581.novalocal sshd-session[23430]: Disconnecting authenticating user root 185.217.1.246 port 7764: Too many authentication failures [preauth]
Sep 30 20:27:40 np0005463581.novalocal python3[25470]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005463581.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Sep 30 20:27:40 np0005463581.novalocal useradd[25527]: new group: name=cloud-admin, GID=1002
Sep 30 20:27:40 np0005463581.novalocal useradd[25527]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Sep 30 20:27:40 np0005463581.novalocal sudo[25461]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:40 np0005463581.novalocal sudo[25640]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bakvypauzjotvocuhczzsnrqpvjmiegz ; /usr/bin/python3'
Sep 30 20:27:40 np0005463581.novalocal sudo[25640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:41 np0005463581.novalocal python3[25648]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKP9FKOWW9m598mLycZPRqlsVgMJVsU+fwfdzM9eBK+TwOcguPrf/EtkmgDfbYrAdUCisCCNyF1sODVm7Os50jE= zuul@np0005463579.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:27:41 np0005463581.novalocal sudo[25640]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:41 np0005463581.novalocal sudo[25835]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcwgnxzqvdxbailrtwdxobddgimgqlza ; /usr/bin/python3'
Sep 30 20:27:41 np0005463581.novalocal sudo[25835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:41 np0005463581.novalocal python3[25841]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:27:41 np0005463581.novalocal sudo[25835]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:41 np0005463581.novalocal sudo[26012]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfkoojugarscypovnrznnmxrwdaevauo ; /usr/bin/python3'
Sep 30 20:27:41 np0005463581.novalocal sudo[26012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:42 np0005463581.novalocal python3[26019]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759264061.303668-168-97462096776081/source _original_basename=tmpjfz994vz follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:27:42 np0005463581.novalocal sudo[26012]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:42 np0005463581.novalocal sudo[26274]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhmujfghznxfwrcndeihjgkzhbevyatp ; /usr/bin/python3'
Sep 30 20:27:42 np0005463581.novalocal sudo[26274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:43 np0005463581.novalocal python3[26285]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Sep 30 20:27:43 np0005463581.novalocal systemd[1]: Starting Hostname Service...
Sep 30 20:27:43 np0005463581.novalocal systemd[1]: Started Hostname Service.
Sep 30 20:27:43 np0005463581.novalocal systemd-hostnamed[26368]: Changed pretty hostname to 'compute-1'
Sep 30 20:27:43 compute-1 systemd-hostnamed[26368]: Hostname set to <compute-1> (static)
Sep 30 20:27:43 compute-1 NetworkManager[3950]: <info>  [1759264063.3000] hostname: static hostname changed from "np0005463581.novalocal" to "compute-1"
Sep 30 20:27:43 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:27:43 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:27:43 compute-1 sudo[26274]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:43 compute-1 sshd-session[25034]: Connection closed by 38.102.83.114 port 47788
Sep 30 20:27:43 compute-1 sshd-session[24999]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:27:43 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Sep 30 20:27:43 compute-1 systemd[1]: session-7.scope: Consumed 2.921s CPU time.
Sep 30 20:27:43 compute-1 systemd-logind[793]: Session 7 logged out. Waiting for processes to exit.
Sep 30 20:27:43 compute-1 systemd-logind[793]: Removed session 7.
Sep 30 20:27:44 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:27:44 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:27:44 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1min 18.554s CPU time.
Sep 30 20:27:44 compute-1 systemd[1]: run-r798da2efb2394fa79fe9aaefb1d8f8c6.service: Deactivated successfully.
Sep 30 20:27:49 compute-1 sshd-session[25609]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 12027 ssh2 [preauth]
Sep 30 20:27:49 compute-1 sshd-session[25609]: Disconnecting authenticating user root 185.217.1.246 port 12027: Too many authentication failures [preauth]
Sep 30 20:27:53 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:28:02 compute-1 sshd-session[26703]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 34373 ssh2 [preauth]
Sep 30 20:28:02 compute-1 sshd-session[26703]: Disconnecting authenticating user root 185.217.1.246 port 34373: Too many authentication failures [preauth]
Sep 30 20:28:10 compute-1 sshd-session[26705]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 9794 ssh2 [preauth]
Sep 30 20:28:10 compute-1 sshd-session[26705]: Disconnecting authenticating user root 185.217.1.246 port 9794: Too many authentication failures [preauth]
Sep 30 20:28:13 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:28:17 compute-1 sshd-session[26707]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 10449 ssh2 [preauth]
Sep 30 20:28:17 compute-1 sshd-session[26707]: Disconnecting authenticating user root 185.217.1.246 port 10449: Too many authentication failures [preauth]
Sep 30 20:28:20 compute-1 sshd-session[26713]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 4107 ssh2 [preauth]
Sep 30 20:28:20 compute-1 sshd-session[26713]: Disconnecting authenticating user root 185.217.1.246 port 4107: Too many authentication failures [preauth]
Sep 30 20:28:30 compute-1 sshd-session[26715]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 42783 ssh2 [preauth]
Sep 30 20:28:30 compute-1 sshd-session[26715]: Disconnecting authenticating user root 185.217.1.246 port 42783: Too many authentication failures [preauth]
Sep 30 20:28:34 compute-1 sshd-session[26717]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 50220 ssh2 [preauth]
Sep 30 20:28:34 compute-1 sshd-session[26717]: Disconnecting authenticating user root 185.217.1.246 port 50220: Too many authentication failures [preauth]
Sep 30 20:28:43 compute-1 sshd-session[26719]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 36147 ssh2 [preauth]
Sep 30 20:28:43 compute-1 sshd-session[26719]: Disconnecting authenticating user root 185.217.1.246 port 36147: Too many authentication failures [preauth]
Sep 30 20:28:51 compute-1 sshd-session[26721]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 31089 ssh2 [preauth]
Sep 30 20:28:51 compute-1 sshd-session[26721]: Disconnecting authenticating user root 185.217.1.246 port 31089: Too many authentication failures [preauth]
Sep 30 20:28:54 compute-1 sshd-session[26723]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 32592 ssh2 [preauth]
Sep 30 20:28:54 compute-1 sshd-session[26723]: Disconnecting authenticating user root 185.217.1.246 port 32592: Too many authentication failures [preauth]
Sep 30 20:29:05 compute-1 sshd-session[26727]: Invalid user manager from 194.0.234.93 port 36064
Sep 30 20:29:06 compute-1 sshd-session[26727]: Connection closed by invalid user manager 194.0.234.93 port 36064 [preauth]
Sep 30 20:29:14 compute-1 sshd-session[26725]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 57142 ssh2 [preauth]
Sep 30 20:29:14 compute-1 sshd-session[26725]: Disconnecting authenticating user root 185.217.1.246 port 57142: Too many authentication failures [preauth]
Sep 30 20:29:24 compute-1 sshd-session[26731]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 38477 ssh2 [preauth]
Sep 30 20:29:24 compute-1 sshd-session[26731]: Disconnecting authenticating user root 185.217.1.246 port 38477: Too many authentication failures [preauth]
Sep 30 20:29:30 compute-1 sshd-session[26734]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 60921 ssh2 [preauth]
Sep 30 20:29:30 compute-1 sshd-session[26734]: Disconnecting authenticating user root 185.217.1.246 port 60921: Too many authentication failures [preauth]
Sep 30 20:29:40 compute-1 sshd-session[26736]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 58108 ssh2 [preauth]
Sep 30 20:29:40 compute-1 sshd-session[26736]: Disconnecting authenticating user root 185.217.1.246 port 58108: Too many authentication failures [preauth]
Sep 30 20:29:50 compute-1 sshd-session[26738]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 22244 ssh2 [preauth]
Sep 30 20:29:50 compute-1 sshd-session[26738]: Disconnecting authenticating user root 185.217.1.246 port 22244: Too many authentication failures [preauth]
Sep 30 20:29:59 compute-1 sshd-session[26740]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15888 ssh2 [preauth]
Sep 30 20:29:59 compute-1 sshd-session[26740]: Disconnecting authenticating user root 185.217.1.246 port 15888: Too many authentication failures [preauth]
Sep 30 20:30:03 compute-1 sshd-session[26742]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 17473 ssh2 [preauth]
Sep 30 20:30:03 compute-1 sshd-session[26742]: Disconnecting authenticating user root 185.217.1.246 port 17473: Too many authentication failures [preauth]
Sep 30 20:30:10 compute-1 sshd-session[26744]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 58464 ssh2 [preauth]
Sep 30 20:30:10 compute-1 sshd-session[26744]: Disconnecting authenticating user root 185.217.1.246 port 58464: Too many authentication failures [preauth]
Sep 30 20:30:10 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Sep 30 20:30:10 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Sep 30 20:30:10 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Sep 30 20:30:10 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Sep 30 20:30:14 compute-1 sshd-session[26748]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 49717 ssh2 [preauth]
Sep 30 20:30:14 compute-1 sshd-session[26748]: Disconnecting authenticating user root 185.217.1.246 port 49717: Too many authentication failures [preauth]
Sep 30 20:30:20 compute-1 sshd-session[26752]: Invalid user ubuntu from 114.66.3.37 port 49584
Sep 30 20:30:20 compute-1 sshd-session[26752]: Received disconnect from 114.66.3.37 port 49584:11:  [preauth]
Sep 30 20:30:20 compute-1 sshd-session[26752]: Disconnected from invalid user ubuntu 114.66.3.37 port 49584 [preauth]
Sep 30 20:30:23 compute-1 sshd-session[26750]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 35326 ssh2 [preauth]
Sep 30 20:30:23 compute-1 sshd-session[26750]: Disconnecting authenticating user root 185.217.1.246 port 35326: Too many authentication failures [preauth]
Sep 30 20:30:25 compute-1 sshd-session[26754]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 30864 ssh2 [preauth]
Sep 30 20:30:25 compute-1 sshd-session[26754]: Disconnecting authenticating user root 185.217.1.246 port 30864: Too many authentication failures [preauth]
Sep 30 20:30:41 compute-1 sshd-session[26756]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 57134 ssh2 [preauth]
Sep 30 20:30:41 compute-1 sshd-session[26756]: Disconnecting authenticating user root 185.217.1.246 port 57134: Too many authentication failures [preauth]
Sep 30 20:30:45 compute-1 sshd-session[26758]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 64871 ssh2 [preauth]
Sep 30 20:30:45 compute-1 sshd-session[26758]: Disconnecting authenticating user root 185.217.1.246 port 64871: Too many authentication failures [preauth]
Sep 30 20:31:01 compute-1 sshd-session[26760]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 31479 ssh2 [preauth]
Sep 30 20:31:01 compute-1 sshd-session[26760]: Disconnecting authenticating user root 185.217.1.246 port 31479: Too many authentication failures [preauth]
Sep 30 20:31:05 compute-1 sshd-session[26762]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 22373 ssh2 [preauth]
Sep 30 20:31:05 compute-1 sshd-session[26762]: Disconnecting authenticating user root 185.217.1.246 port 22373: Too many authentication failures [preauth]
Sep 30 20:31:14 compute-1 sshd-session[26764]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 54248 ssh2 [preauth]
Sep 30 20:31:14 compute-1 sshd-session[26764]: Disconnecting authenticating user root 185.217.1.246 port 54248: Too many authentication failures [preauth]
Sep 30 20:31:24 compute-1 sshd-session[26766]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6761 ssh2 [preauth]
Sep 30 20:31:24 compute-1 sshd-session[26766]: Disconnecting authenticating user root 185.217.1.246 port 6761: Too many authentication failures [preauth]
Sep 30 20:31:28 compute-1 sshd-session[26768]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 21833 ssh2 [preauth]
Sep 30 20:31:28 compute-1 sshd-session[26768]: Disconnecting authenticating user root 185.217.1.246 port 21833: Too many authentication failures [preauth]
Sep 30 20:31:37 compute-1 sshd-session[26770]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 28971 ssh2 [preauth]
Sep 30 20:31:37 compute-1 sshd-session[26770]: Disconnecting authenticating user root 185.217.1.246 port 28971: Too many authentication failures [preauth]
Sep 30 20:31:43 compute-1 PackageKit[6441]: daemon quit
Sep 30 20:31:43 compute-1 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 20:31:48 compute-1 sshd-session[26773]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 16467 ssh2 [preauth]
Sep 30 20:31:48 compute-1 sshd-session[26773]: Disconnecting authenticating user root 185.217.1.246 port 16467: Too many authentication failures [preauth]
Sep 30 20:31:56 compute-1 sshd-session[26778]: Accepted publickey for zuul from 38.102.83.65 port 35498 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:31:56 compute-1 systemd-logind[793]: New session 8 of user zuul.
Sep 30 20:31:56 compute-1 systemd[1]: Started Session 8 of User zuul.
Sep 30 20:31:56 compute-1 sshd-session[26778]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:31:57 compute-1 python3[26854]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:31:59 compute-1 sshd-session[26776]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 22706 ssh2 [preauth]
Sep 30 20:31:59 compute-1 sshd-session[26776]: Disconnecting authenticating user root 185.217.1.246 port 22706: Too many authentication failures [preauth]
Sep 30 20:31:59 compute-1 sudo[26968]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msnicqorxfuqkfnztqoyvkrsmhhjkmaj ; /usr/bin/python3'
Sep 30 20:31:59 compute-1 sudo[26968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:31:59 compute-1 python3[26970]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:31:59 compute-1 sudo[26968]: pam_unix(sudo:session): session closed for user root
Sep 30 20:31:59 compute-1 sudo[27041]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbkttiufvtydcbmhkginbevxjszztgcf ; /usr/bin/python3'
Sep 30 20:31:59 compute-1 sudo[27041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:31:59 compute-1 python3[27043]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.276445-30950-172808162820332/source mode=0755 _original_basename=delorean.repo follow=False checksum=fdbc451c7e16efca2444f90fdb72f8eb1c12a1b5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:31:59 compute-1 sudo[27041]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:00 compute-1 sudo[27067]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hadyctoylhejyahlgnfbajtvzioilwuo ; /usr/bin/python3'
Sep 30 20:32:00 compute-1 sudo[27067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:00 compute-1 python3[27069]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:00 compute-1 sudo[27067]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:00 compute-1 sudo[27140]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkvqsjvoixlnmlwnfbprarxylulawjrc ; /usr/bin/python3'
Sep 30 20:32:00 compute-1 sudo[27140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:00 compute-1 python3[27142]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.276445-30950-172808162820332/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:00 compute-1 sudo[27140]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:00 compute-1 sudo[27166]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjomcoztyokipbmbhvygjxinmmhihdnz ; /usr/bin/python3'
Sep 30 20:32:00 compute-1 sudo[27166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:00 compute-1 python3[27168]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:00 compute-1 sudo[27166]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:01 compute-1 sudo[27240]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkhkbrjykdudwziydkplqlvuhfmpvlzw ; /usr/bin/python3'
Sep 30 20:32:01 compute-1 sudo[27240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:01 compute-1 python3[27242]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.276445-30950-172808162820332/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:01 compute-1 sudo[27240]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:01 compute-1 sudo[27266]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxqxytlbkwgnvtcikkexmwrfdecqdrnn ; /usr/bin/python3'
Sep 30 20:32:01 compute-1 sudo[27266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:01 compute-1 python3[27268]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:01 compute-1 sudo[27266]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:01 compute-1 sudo[27339]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxrbbxehohxboajhjiahhcyejmxtuqxd ; /usr/bin/python3'
Sep 30 20:32:01 compute-1 sudo[27339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:01 compute-1 python3[27341]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.276445-30950-172808162820332/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:01 compute-1 sudo[27339]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:02 compute-1 sudo[27365]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcvlaellsccozdpuwyxkzijjbtawlpwv ; /usr/bin/python3'
Sep 30 20:32:02 compute-1 sudo[27365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:02 compute-1 python3[27367]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:02 compute-1 sudo[27365]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:02 compute-1 sudo[27438]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kftjocowtpmqcflsiylmjqhthvkimcdb ; /usr/bin/python3'
Sep 30 20:32:02 compute-1 sudo[27438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:02 compute-1 python3[27440]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.276445-30950-172808162820332/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:02 compute-1 sudo[27438]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:02 compute-1 sudo[27464]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxzczkvavgbssuxvluotavyjkbtkkjcl ; /usr/bin/python3'
Sep 30 20:32:02 compute-1 sudo[27464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:02 compute-1 python3[27466]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:02 compute-1 sudo[27464]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:03 compute-1 sudo[27537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyxnroqtgvdvktivvpiarwzuqdtuupse ; /usr/bin/python3'
Sep 30 20:32:03 compute-1 sudo[27537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:03 compute-1 python3[27539]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.276445-30950-172808162820332/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:03 compute-1 sudo[27537]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:03 compute-1 sudo[27564]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzwhjrtbsxgnccqddvysulfwcqukjxta ; /usr/bin/python3'
Sep 30 20:32:03 compute-1 sudo[27564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:03 compute-1 python3[27566]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:03 compute-1 sudo[27564]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:03 compute-1 sudo[27637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shdztzsxshkvrgabhlrmoaghpsdpqchs ; /usr/bin/python3'
Sep 30 20:32:03 compute-1 sudo[27637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:03 compute-1 python3[27639]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.276445-30950-172808162820332/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=3193b2329e025492c2ae01f1388d5694c4facea6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:03 compute-1 sudo[27637]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:08 compute-1 sshd-session[27169]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 59490 ssh2 [preauth]
Sep 30 20:32:08 compute-1 sshd-session[27169]: Disconnecting authenticating user root 185.217.1.246 port 59490: Too many authentication failures [preauth]
Sep 30 20:32:12 compute-1 python3[27689]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:32:17 compute-1 sshd-session[27664]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 55029 ssh2 [preauth]
Sep 30 20:32:17 compute-1 sshd-session[27664]: Disconnecting authenticating user root 185.217.1.246 port 55029: Too many authentication failures [preauth]
Sep 30 20:32:30 compute-1 sshd-session[27691]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15997 ssh2 [preauth]
Sep 30 20:32:30 compute-1 sshd-session[27691]: Disconnecting authenticating user root 185.217.1.246 port 15997: Too many authentication failures [preauth]
Sep 30 20:32:43 compute-1 sshd-session[27693]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6388 ssh2 [preauth]
Sep 30 20:32:43 compute-1 sshd-session[27693]: Disconnecting authenticating user root 185.217.1.246 port 6388: Too many authentication failures [preauth]
Sep 30 20:32:54 compute-1 sshd-session[27695]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 24795 ssh2 [preauth]
Sep 30 20:32:54 compute-1 sshd-session[27695]: Disconnecting authenticating user root 185.217.1.246 port 24795: Too many authentication failures [preauth]
Sep 30 20:32:57 compute-1 sshd-session[27697]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 50582 ssh2 [preauth]
Sep 30 20:32:57 compute-1 sshd-session[27697]: Disconnecting authenticating user root 185.217.1.246 port 50582: Too many authentication failures [preauth]
Sep 30 20:33:05 compute-1 sshd-session[27699]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14609 ssh2 [preauth]
Sep 30 20:33:05 compute-1 sshd-session[27699]: Disconnecting authenticating user root 185.217.1.246 port 14609: Too many authentication failures [preauth]
Sep 30 20:33:09 compute-1 sshd-session[27701]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 20387 ssh2 [preauth]
Sep 30 20:33:09 compute-1 sshd-session[27701]: Disconnecting authenticating user root 185.217.1.246 port 20387: Too many authentication failures [preauth]
Sep 30 20:33:14 compute-1 sshd-session[27706]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 20:33:14 compute-1 sshd-session[27706]: Connection reset by 45.140.17.97 port 62027
Sep 30 20:33:17 compute-1 sshd-session[27703]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 42712 ssh2 [preauth]
Sep 30 20:33:17 compute-1 sshd-session[27703]: Disconnecting authenticating user root 185.217.1.246 port 42712: Too many authentication failures [preauth]
Sep 30 20:33:27 compute-1 sshd-session[27707]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 51740 ssh2 [preauth]
Sep 30 20:33:27 compute-1 sshd-session[27707]: Disconnecting authenticating user root 185.217.1.246 port 51740: Too many authentication failures [preauth]
Sep 30 20:33:35 compute-1 sshd-session[27710]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 3140 ssh2 [preauth]
Sep 30 20:33:35 compute-1 sshd-session[27710]: Disconnecting authenticating user root 185.217.1.246 port 3140: Too many authentication failures [preauth]
Sep 30 20:33:44 compute-1 sshd-session[27712]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 3042 ssh2 [preauth]
Sep 30 20:33:44 compute-1 sshd-session[27712]: Disconnecting authenticating user root 185.217.1.246 port 3042: Too many authentication failures [preauth]
Sep 30 20:33:48 compute-1 sshd-session[27714]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 22505 ssh2 [preauth]
Sep 30 20:33:48 compute-1 sshd-session[27714]: Disconnecting authenticating user root 185.217.1.246 port 22505: Too many authentication failures [preauth]
Sep 30 20:33:55 compute-1 sshd-session[27716]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 47599 ssh2 [preauth]
Sep 30 20:33:55 compute-1 sshd-session[27716]: Disconnecting authenticating user root 185.217.1.246 port 47599: Too many authentication failures [preauth]
Sep 30 20:34:00 compute-1 sshd-session[27718]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 37045 ssh2 [preauth]
Sep 30 20:34:00 compute-1 sshd-session[27718]: Disconnecting authenticating user root 185.217.1.246 port 37045: Too many authentication failures [preauth]
Sep 30 20:34:07 compute-1 sshd-session[27720]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 20051 ssh2 [preauth]
Sep 30 20:34:07 compute-1 sshd-session[27720]: Disconnecting authenticating user root 185.217.1.246 port 20051: Too many authentication failures [preauth]
Sep 30 20:34:10 compute-1 sshd-session[27722]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 17767 ssh2 [preauth]
Sep 30 20:34:10 compute-1 sshd-session[27722]: Disconnecting authenticating user root 185.217.1.246 port 17767: Too many authentication failures [preauth]
Sep 30 20:34:20 compute-1 sshd-session[27724]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 31461 ssh2 [preauth]
Sep 30 20:34:20 compute-1 sshd-session[27724]: Disconnecting authenticating user root 185.217.1.246 port 31461: Too many authentication failures [preauth]
Sep 30 20:34:31 compute-1 sshd-session[27726]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 57775 ssh2 [preauth]
Sep 30 20:34:31 compute-1 sshd-session[27726]: Disconnecting authenticating user root 185.217.1.246 port 57775: Too many authentication failures [preauth]
Sep 30 20:34:40 compute-1 sshd-session[27728]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 36329 ssh2 [preauth]
Sep 30 20:34:40 compute-1 sshd-session[27728]: Disconnecting authenticating user root 185.217.1.246 port 36329: Too many authentication failures [preauth]
Sep 30 20:34:51 compute-1 sshd-session[27730]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 33257 ssh2 [preauth]
Sep 30 20:34:51 compute-1 sshd-session[27730]: Disconnecting authenticating user root 185.217.1.246 port 33257: Too many authentication failures [preauth]
Sep 30 20:34:58 compute-1 sshd-session[27732]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6261 ssh2 [preauth]
Sep 30 20:34:58 compute-1 sshd-session[27732]: Disconnecting authenticating user root 185.217.1.246 port 6261: Too many authentication failures [preauth]
Sep 30 20:35:00 compute-1 sshd-session[27734]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 47875 ssh2 [preauth]
Sep 30 20:35:00 compute-1 sshd-session[27734]: Disconnecting authenticating user root 185.217.1.246 port 47875: Too many authentication failures [preauth]
Sep 30 20:35:07 compute-1 sshd-session[27736]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15879 ssh2 [preauth]
Sep 30 20:35:07 compute-1 sshd-session[27736]: Disconnecting authenticating user root 185.217.1.246 port 15879: Too many authentication failures [preauth]
Sep 30 20:35:11 compute-1 sshd-session[27738]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 64518 ssh2 [preauth]
Sep 30 20:35:11 compute-1 sshd-session[27738]: Disconnecting authenticating user root 185.217.1.246 port 64518: Too many authentication failures [preauth]
Sep 30 20:35:21 compute-1 sshd-session[27740]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 45939 ssh2 [preauth]
Sep 30 20:35:21 compute-1 sshd-session[27740]: Disconnecting authenticating user root 185.217.1.246 port 45939: Too many authentication failures [preauth]
Sep 30 20:35:32 compute-1 sshd-session[27742]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14234 ssh2 [preauth]
Sep 30 20:35:32 compute-1 sshd-session[27742]: Disconnecting authenticating user root 185.217.1.246 port 14234: Too many authentication failures [preauth]
Sep 30 20:35:41 compute-1 sshd-session[27744]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 32739 ssh2 [preauth]
Sep 30 20:35:41 compute-1 sshd-session[27744]: Disconnecting authenticating user root 185.217.1.246 port 32739: Too many authentication failures [preauth]
Sep 30 20:35:44 compute-1 sshd-session[27746]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 28629 ssh2 [preauth]
Sep 30 20:35:44 compute-1 sshd-session[27746]: Disconnecting authenticating user root 185.217.1.246 port 28629: Too many authentication failures [preauth]
Sep 30 20:35:53 compute-1 sshd-session[27748]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15369 ssh2 [preauth]
Sep 30 20:35:53 compute-1 sshd-session[27748]: Disconnecting authenticating user root 185.217.1.246 port 15369: Too many authentication failures [preauth]
Sep 30 20:36:04 compute-1 sshd-session[27750]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15324 ssh2 [preauth]
Sep 30 20:36:04 compute-1 sshd-session[27750]: Disconnecting authenticating user root 185.217.1.246 port 15324: Too many authentication failures [preauth]
Sep 30 20:36:12 compute-1 sshd-session[27752]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 34978 ssh2 [preauth]
Sep 30 20:36:12 compute-1 sshd-session[27752]: Disconnecting authenticating user root 185.217.1.246 port 34978: Too many authentication failures [preauth]
Sep 30 20:36:15 compute-1 sshd-session[27754]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 32603 ssh2 [preauth]
Sep 30 20:36:15 compute-1 sshd-session[27754]: Disconnecting authenticating user root 185.217.1.246 port 32603: Too many authentication failures [preauth]
Sep 30 20:36:22 compute-1 sshd-session[27756]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 57336 ssh2 [preauth]
Sep 30 20:36:22 compute-1 sshd-session[27756]: Disconnecting authenticating user root 185.217.1.246 port 57336: Too many authentication failures [preauth]
Sep 30 20:36:29 compute-1 sshd-session[27758]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 57907 ssh2 [preauth]
Sep 30 20:36:29 compute-1 sshd-session[27758]: Disconnecting authenticating user root 185.217.1.246 port 57907: Too many authentication failures [preauth]
Sep 30 20:36:33 compute-1 sshd-session[27760]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 51607 ssh2 [preauth]
Sep 30 20:36:33 compute-1 sshd-session[27760]: Disconnecting authenticating user root 185.217.1.246 port 51607: Too many authentication failures [preauth]
Sep 30 20:36:36 compute-1 sshd-session[27762]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14050 ssh2 [preauth]
Sep 30 20:36:36 compute-1 sshd-session[27762]: Disconnecting authenticating user root 185.217.1.246 port 14050: Too many authentication failures [preauth]
Sep 30 20:36:53 compute-1 sshd-session[27764]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6871 ssh2 [preauth]
Sep 30 20:36:53 compute-1 sshd-session[27764]: Disconnecting authenticating user root 185.217.1.246 port 6871: Too many authentication failures [preauth]
Sep 30 20:37:02 compute-1 sshd-session[27766]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 25992 ssh2 [preauth]
Sep 30 20:37:02 compute-1 sshd-session[27766]: Disconnecting authenticating user root 185.217.1.246 port 25992: Too many authentication failures [preauth]
Sep 30 20:37:10 compute-1 sshd-session[27769]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 11496 ssh2 [preauth]
Sep 30 20:37:10 compute-1 sshd-session[27769]: Disconnecting authenticating user root 185.217.1.246 port 11496: Too many authentication failures [preauth]
Sep 30 20:37:12 compute-1 sshd-session[26781]: Received disconnect from 38.102.83.65 port 35498:11: disconnected by user
Sep 30 20:37:12 compute-1 sshd-session[26781]: Disconnected from user zuul 38.102.83.65 port 35498
Sep 30 20:37:12 compute-1 sshd-session[26778]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:37:12 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Sep 30 20:37:12 compute-1 systemd[1]: session-8.scope: Consumed 5.204s CPU time.
Sep 30 20:37:12 compute-1 systemd-logind[793]: Session 8 logged out. Waiting for processes to exit.
Sep 30 20:37:12 compute-1 systemd-logind[793]: Removed session 8.
Sep 30 20:37:16 compute-1 sshd-session[27771]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 8455 ssh2 [preauth]
Sep 30 20:37:16 compute-1 sshd-session[27771]: Disconnecting authenticating user root 185.217.1.246 port 8455: Too many authentication failures [preauth]
Sep 30 20:37:27 compute-1 sshd-session[27773]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 53662 ssh2 [preauth]
Sep 30 20:37:27 compute-1 sshd-session[27773]: Disconnecting authenticating user root 185.217.1.246 port 53662: Too many authentication failures [preauth]
Sep 30 20:37:43 compute-1 sshd-session[27775]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 48087 ssh2 [preauth]
Sep 30 20:37:43 compute-1 sshd-session[27775]: Disconnecting authenticating user root 185.217.1.246 port 48087: Too many authentication failures [preauth]
Sep 30 20:37:47 compute-1 sshd-session[27777]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 28219 ssh2 [preauth]
Sep 30 20:37:47 compute-1 sshd-session[27777]: Disconnecting authenticating user root 185.217.1.246 port 28219: Too many authentication failures [preauth]
Sep 30 20:37:52 compute-1 sshd-session[27779]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 61796 ssh2 [preauth]
Sep 30 20:37:52 compute-1 sshd-session[27779]: Disconnecting authenticating user root 185.217.1.246 port 61796: Too many authentication failures [preauth]
Sep 30 20:37:58 compute-1 sshd-session[27781]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 27768 ssh2 [preauth]
Sep 30 20:37:58 compute-1 sshd-session[27781]: Disconnecting authenticating user root 185.217.1.246 port 27768: Too many authentication failures [preauth]
Sep 30 20:38:06 compute-1 sshd-session[27783]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 36154 ssh2 [preauth]
Sep 30 20:38:06 compute-1 sshd-session[27783]: Disconnecting authenticating user root 185.217.1.246 port 36154: Too many authentication failures [preauth]
Sep 30 20:38:12 compute-1 sshd-session[27785]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 25624 ssh2 [preauth]
Sep 30 20:38:12 compute-1 sshd-session[27785]: Disconnecting authenticating user root 185.217.1.246 port 25624: Too many authentication failures [preauth]
Sep 30 20:38:22 compute-1 sshd-session[27787]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 2890 ssh2 [preauth]
Sep 30 20:38:22 compute-1 sshd-session[27787]: Disconnecting authenticating user root 185.217.1.246 port 2890: Too many authentication failures [preauth]
Sep 30 20:38:29 compute-1 sshd-session[27791]: Connection closed by authenticating user root 194.0.234.19 port 32402 [preauth]
Sep 30 20:38:31 compute-1 sshd-session[27789]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 25605 ssh2 [preauth]
Sep 30 20:38:31 compute-1 sshd-session[27789]: Disconnecting authenticating user root 185.217.1.246 port 25605: Too many authentication failures [preauth]
Sep 30 20:38:38 compute-1 sshd-session[27793]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 27143 ssh2 [preauth]
Sep 30 20:38:38 compute-1 sshd-session[27793]: Disconnecting authenticating user root 185.217.1.246 port 27143: Too many authentication failures [preauth]
Sep 30 20:38:45 compute-1 sshd-session[27795]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14582 ssh2 [preauth]
Sep 30 20:38:45 compute-1 sshd-session[27795]: Disconnecting authenticating user root 185.217.1.246 port 14582: Too many authentication failures [preauth]
Sep 30 20:38:47 compute-1 sshd-session[27797]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 13572 ssh2 [preauth]
Sep 30 20:38:47 compute-1 sshd-session[27797]: Disconnecting authenticating user root 185.217.1.246 port 13572: Too many authentication failures [preauth]
Sep 30 20:38:54 compute-1 sshd-session[27799]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 31918 ssh2 [preauth]
Sep 30 20:38:54 compute-1 sshd-session[27799]: Disconnecting authenticating user root 185.217.1.246 port 31918: Too many authentication failures [preauth]
Sep 30 20:39:01 compute-1 chronyd[797]: Selected source 50.43.156.177 (2.centos.pool.ntp.org)
Sep 30 20:39:01 compute-1 sshd-session[27801]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 31969 ssh2 [preauth]
Sep 30 20:39:01 compute-1 sshd-session[27801]: Disconnecting authenticating user root 185.217.1.246 port 31969: Too many authentication failures [preauth]
Sep 30 20:39:04 compute-1 sshd-session[27803]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 16218 ssh2 [preauth]
Sep 30 20:39:04 compute-1 sshd-session[27803]: Disconnecting authenticating user root 185.217.1.246 port 16218: Too many authentication failures [preauth]
Sep 30 20:39:09 compute-1 sshd-session[27805]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 49066 ssh2 [preauth]
Sep 30 20:39:09 compute-1 sshd-session[27805]: Disconnecting authenticating user root 185.217.1.246 port 49066: Too many authentication failures [preauth]
Sep 30 20:39:17 compute-1 sshd-session[27807]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 48164 ssh2 [preauth]
Sep 30 20:39:17 compute-1 sshd-session[27807]: Disconnecting authenticating user root 185.217.1.246 port 48164: Too many authentication failures [preauth]
Sep 30 20:39:23 compute-1 sshd-session[27809]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 20736 ssh2 [preauth]
Sep 30 20:39:23 compute-1 sshd-session[27809]: Disconnecting authenticating user root 185.217.1.246 port 20736: Too many authentication failures [preauth]
Sep 30 20:39:28 compute-1 sshd-session[27811]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14428 ssh2 [preauth]
Sep 30 20:39:28 compute-1 sshd-session[27811]: Disconnecting authenticating user root 185.217.1.246 port 14428: Too many authentication failures [preauth]
Sep 30 20:39:34 compute-1 sshd-session[27813]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 61811 ssh2 [preauth]
Sep 30 20:39:34 compute-1 sshd-session[27813]: Disconnecting authenticating user root 185.217.1.246 port 61811: Too many authentication failures [preauth]
Sep 30 20:39:45 compute-1 sshd-session[27815]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 39993 ssh2 [preauth]
Sep 30 20:39:45 compute-1 sshd-session[27815]: Disconnecting authenticating user root 185.217.1.246 port 39993: Too many authentication failures [preauth]
Sep 30 20:39:54 compute-1 sshd-session[27819]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15682 ssh2 [preauth]
Sep 30 20:39:54 compute-1 sshd-session[27819]: Disconnecting authenticating user root 185.217.1.246 port 15682: Too many authentication failures [preauth]
Sep 30 20:40:06 compute-1 sshd-session[27821]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 20245 ssh2 [preauth]
Sep 30 20:40:06 compute-1 sshd-session[27821]: Disconnecting authenticating user root 185.217.1.246 port 20245: Too many authentication failures [preauth]
Sep 30 20:40:08 compute-1 sshd-session[27823]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 51177 ssh2 [preauth]
Sep 30 20:40:08 compute-1 sshd-session[27823]: Disconnecting authenticating user root 185.217.1.246 port 51177: Too many authentication failures [preauth]
Sep 30 20:40:13 compute-1 sshd-session[27825]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 17090 ssh2 [preauth]
Sep 30 20:40:13 compute-1 sshd-session[27825]: Disconnecting authenticating user root 185.217.1.246 port 17090: Too many authentication failures [preauth]
Sep 30 20:40:19 compute-1 sshd-session[27827]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 56919 ssh2 [preauth]
Sep 30 20:40:19 compute-1 sshd-session[27827]: Disconnecting authenticating user root 185.217.1.246 port 56919: Too many authentication failures [preauth]
Sep 30 20:40:24 compute-1 sshd-session[27829]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 43319 ssh2 [preauth]
Sep 30 20:40:24 compute-1 sshd-session[27829]: Disconnecting authenticating user root 185.217.1.246 port 43319: Too many authentication failures [preauth]
Sep 30 20:40:28 compute-1 sshd-session[27831]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 18594 ssh2 [preauth]
Sep 30 20:40:28 compute-1 sshd-session[27831]: Disconnecting authenticating user root 185.217.1.246 port 18594: Too many authentication failures [preauth]
Sep 30 20:40:34 compute-1 sshd-session[27833]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 50875 ssh2 [preauth]
Sep 30 20:40:34 compute-1 sshd-session[27833]: Disconnecting authenticating user root 185.217.1.246 port 50875: Too many authentication failures [preauth]
Sep 30 20:40:38 compute-1 sshd-session[27835]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 39720 ssh2 [preauth]
Sep 30 20:40:38 compute-1 sshd-session[27835]: Disconnecting authenticating user root 185.217.1.246 port 39720: Too many authentication failures [preauth]
Sep 30 20:40:49 compute-1 sshd-session[27837]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 10915 ssh2 [preauth]
Sep 30 20:40:49 compute-1 sshd-session[27837]: Disconnecting authenticating user root 185.217.1.246 port 10915: Too many authentication failures [preauth]
Sep 30 20:40:56 compute-1 sshd-session[27839]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 46279 ssh2 [preauth]
Sep 30 20:40:56 compute-1 sshd-session[27839]: Disconnecting authenticating user root 185.217.1.246 port 46279: Too many authentication failures [preauth]
Sep 30 20:41:02 compute-1 sshd-session[27841]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 27276 ssh2 [preauth]
Sep 30 20:41:02 compute-1 sshd-session[27841]: Disconnecting authenticating user root 185.217.1.246 port 27276: Too many authentication failures [preauth]
Sep 30 20:41:18 compute-1 sshd-session[27843]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 37529 ssh2 [preauth]
Sep 30 20:41:18 compute-1 sshd-session[27843]: Disconnecting authenticating user root 185.217.1.246 port 37529: Too many authentication failures [preauth]
Sep 30 20:41:27 compute-1 sshd-session[27845]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 31368 ssh2 [preauth]
Sep 30 20:41:27 compute-1 sshd-session[27845]: Disconnecting authenticating user root 185.217.1.246 port 31368: Too many authentication failures [preauth]
Sep 30 20:41:31 compute-1 sshd-session[27847]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 30437 ssh2 [preauth]
Sep 30 20:41:31 compute-1 sshd-session[27847]: Disconnecting authenticating user root 185.217.1.246 port 30437: Too many authentication failures [preauth]
Sep 30 20:41:35 compute-1 sshd-session[27849]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6839 ssh2 [preauth]
Sep 30 20:41:35 compute-1 sshd-session[27849]: Disconnecting authenticating user root 185.217.1.246 port 6839: Too many authentication failures [preauth]
Sep 30 20:41:37 compute-1 sshd-session[27851]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 41031 ssh2 [preauth]
Sep 30 20:41:37 compute-1 sshd-session[27851]: Disconnecting authenticating user root 185.217.1.246 port 41031: Too many authentication failures [preauth]
Sep 30 20:41:44 compute-1 sshd[1006]: Timeout before authentication for connection from 89.223.35.4 to 38.102.83.50, pid = 27818
Sep 30 20:41:45 compute-1 sshd-session[27853]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 63721 ssh2 [preauth]
Sep 30 20:41:45 compute-1 sshd-session[27853]: Disconnecting authenticating user root 185.217.1.246 port 63721: Too many authentication failures [preauth]
Sep 30 20:41:52 compute-1 sshd-session[27855]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 12054 ssh2 [preauth]
Sep 30 20:41:52 compute-1 sshd-session[27855]: Disconnecting authenticating user root 185.217.1.246 port 12054: Too many authentication failures [preauth]
Sep 30 20:42:01 compute-1 sshd-session[27857]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 60476 ssh2 [preauth]
Sep 30 20:42:01 compute-1 sshd-session[27857]: Disconnecting authenticating user root 185.217.1.246 port 60476: Too many authentication failures [preauth]
Sep 30 20:42:05 compute-1 sshd-session[27859]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 63770 ssh2 [preauth]
Sep 30 20:42:05 compute-1 sshd-session[27859]: Disconnecting authenticating user root 185.217.1.246 port 63770: Too many authentication failures [preauth]
Sep 30 20:42:09 compute-1 sshd-session[27861]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 34666 ssh2 [preauth]
Sep 30 20:42:09 compute-1 sshd-session[27861]: Disconnecting authenticating user root 185.217.1.246 port 34666: Too many authentication failures [preauth]
Sep 30 20:42:20 compute-1 sshd-session[27863]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 8291 ssh2 [preauth]
Sep 30 20:42:20 compute-1 sshd-session[27863]: Disconnecting authenticating user root 185.217.1.246 port 8291: Too many authentication failures [preauth]
Sep 30 20:42:31 compute-1 sshd-session[27866]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 33939 ssh2 [preauth]
Sep 30 20:42:31 compute-1 sshd-session[27866]: Disconnecting authenticating user root 185.217.1.246 port 33939: Too many authentication failures [preauth]
Sep 30 20:42:38 compute-1 sshd-session[27869]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15377 ssh2 [preauth]
Sep 30 20:42:38 compute-1 sshd-session[27869]: Disconnecting authenticating user root 185.217.1.246 port 15377: Too many authentication failures [preauth]
Sep 30 20:42:44 compute-1 sshd-session[27871]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 53772 ssh2 [preauth]
Sep 30 20:42:44 compute-1 sshd-session[27871]: Disconnecting authenticating user root 185.217.1.246 port 53772: Too many authentication failures [preauth]
Sep 30 20:42:49 compute-1 sshd-session[27873]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 37754 ssh2 [preauth]
Sep 30 20:42:49 compute-1 sshd-session[27873]: Disconnecting authenticating user root 185.217.1.246 port 37754: Too many authentication failures [preauth]
Sep 30 20:42:53 compute-1 sshd-session[27876]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 13184 ssh2 [preauth]
Sep 30 20:42:53 compute-1 sshd-session[27876]: Disconnecting authenticating user root 185.217.1.246 port 13184: Too many authentication failures [preauth]
Sep 30 20:43:02 compute-1 sshd-session[27878]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 53819 ssh2 [preauth]
Sep 30 20:43:02 compute-1 sshd-session[27878]: Disconnecting authenticating user root 185.217.1.246 port 53819: Too many authentication failures [preauth]
Sep 30 20:43:09 compute-1 sshd-session[27880]: Disconnecting authenticating user root 185.217.1.246 port 9195: Change of username or service not allowed: (root,ssh-connection) -> (root;yuchen!))&gt,ssh-connectio [preauth]
Sep 30 20:43:11 compute-1 sshd-session[27882]: Invalid user root;yuchen!))&gt from 185.217.1.246 port 62176
Sep 30 20:43:11 compute-1 sshd-session[27882]: Disconnecting invalid user root;yuchen!))&gt 185.217.1.246 port 62176: Change of username or service not allowed: (root;yuchen!))&gt,ssh-connection) -> (tazos,ssh-connecti [preauth]
Sep 30 20:43:15 compute-1 sshd-session[27884]: Invalid user tazos from 185.217.1.246 port 8022
Sep 30 20:43:17 compute-1 sshd-session[27884]: Disconnecting invalid user tazos 185.217.1.246 port 8022: Change of username or service not allowed: (tazos,ssh-connection) -> (ethereumdocker,ssh-connection) [preauth]
Sep 30 20:43:19 compute-1 sshd-session[27886]: Invalid user ethereumdocker from 185.217.1.246 port 53622
Sep 30 20:43:19 compute-1 sshd-session[27886]: Disconnecting invalid user ethereumdocker 185.217.1.246 port 53622: Change of username or service not allowed: (ethereumdocker,ssh-connection) -> (btc,ssh-connection) [preauth]
Sep 30 20:43:22 compute-1 sshd-session[27888]: Invalid user btc from 185.217.1.246 port 18702
Sep 30 20:43:22 compute-1 sshd-session[27888]: Disconnecting invalid user btc 185.217.1.246 port 18702: Change of username or service not allowed: (btc,ssh-connection) -> (guest,ssh-connection) [preauth]
Sep 30 20:43:28 compute-1 sshd-session[27890]: Invalid user guest from 185.217.1.246 port 34438
Sep 30 20:43:31 compute-1 sshd-session[27890]: error: maximum authentication attempts exceeded for invalid user guest from 185.217.1.246 port 34438 ssh2 [preauth]
Sep 30 20:43:31 compute-1 sshd-session[27890]: Disconnecting invalid user guest 185.217.1.246 port 34438: Too many authentication failures [preauth]
Sep 30 20:43:33 compute-1 sshd-session[27892]: Invalid user guest from 185.217.1.246 port 44616
Sep 30 20:43:33 compute-1 sshd-session[27892]: Disconnecting invalid user guest 185.217.1.246 port 44616: Change of username or service not allowed: (guest,ssh-connection) -> (sol,ssh-connection) [preauth]
Sep 30 20:43:34 compute-1 sshd-session[27894]: Invalid user sol from 185.217.1.246 port 53501
Sep 30 20:43:35 compute-1 sshd-session[27894]: Disconnecting invalid user sol 185.217.1.246 port 53501: Change of username or service not allowed: (sol,ssh-connection) -> (ethereum,ssh-connection) [preauth]
Sep 30 20:43:39 compute-1 sshd-session[27896]: Invalid user ethereum from 185.217.1.246 port 18479
Sep 30 20:43:40 compute-1 sshd-session[27896]: Disconnecting invalid user ethereum 185.217.1.246 port 18479: Change of username or service not allowed: (ethereum,ssh-connection) -> (user2,ssh-connection) [preauth]
Sep 30 20:43:42 compute-1 sshd-session[27898]: Invalid user user2 from 185.217.1.246 port 41003
Sep 30 20:43:42 compute-1 sshd-session[27898]: Disconnecting invalid user user2 185.217.1.246 port 41003: Change of username or service not allowed: (user2,ssh-connection) -> (alarm,ssh-connection) [preauth]
Sep 30 20:43:46 compute-1 sshd-session[27900]: Invalid user alarm from 185.217.1.246 port 6318
Sep 30 20:43:49 compute-1 sshd-session[27900]: Disconnecting invalid user alarm 185.217.1.246 port 6318: Change of username or service not allowed: (alarm,ssh-connection) -> (dot,ssh-connection) [preauth]
Sep 30 20:43:52 compute-1 sshd-session[27902]: Invalid user dot from 185.217.1.246 port 51202
Sep 30 20:43:52 compute-1 sshd-session[27902]: Disconnecting invalid user dot 185.217.1.246 port 51202: Change of username or service not allowed: (dot,ssh-connection) -> (User,ssh-connection) [preauth]
Sep 30 20:43:55 compute-1 sshd-session[27904]: Invalid user User from 185.217.1.246 port 22273
Sep 30 20:43:59 compute-1 sshd-session[27904]: Disconnecting invalid user User 185.217.1.246 port 22273: Change of username or service not allowed: (User,ssh-connection) -> (user21,ssh-connection) [preauth]
Sep 30 20:44:01 compute-1 sshd-session[27906]: Invalid user user21 from 185.217.1.246 port 10010
Sep 30 20:44:02 compute-1 sshd-session[27906]: Disconnecting invalid user user21 185.217.1.246 port 10010: Change of username or service not allowed: (user21,ssh-connection) -> (xbmc,ssh-connection) [preauth]
Sep 30 20:44:04 compute-1 sshd-session[27908]: Invalid user xbmc from 185.217.1.246 port 42926
Sep 30 20:44:05 compute-1 sshd-session[27908]: Disconnecting invalid user xbmc 185.217.1.246 port 42926: Change of username or service not allowed: (xbmc,ssh-connection) -> (riscv,ssh-connection) [preauth]
Sep 30 20:44:10 compute-1 sshd-session[27910]: Invalid user riscv from 185.217.1.246 port 4561
Sep 30 20:44:12 compute-1 sshd-session[27910]: Disconnecting invalid user riscv 185.217.1.246 port 4561: Change of username or service not allowed: (riscv,ssh-connection) -> (david,ssh-connection) [preauth]
Sep 30 20:44:14 compute-1 sshd-session[27912]: Invalid user david from 185.217.1.246 port 58631
Sep 30 20:44:14 compute-1 sshd-session[27912]: Disconnecting invalid user david 185.217.1.246 port 58631: Change of username or service not allowed: (david,ssh-connection) -> (xrp,ssh-connection) [preauth]
Sep 30 20:44:17 compute-1 sshd-session[27914]: Invalid user xrp from 185.217.1.246 port 13762
Sep 30 20:44:19 compute-1 sshd-session[27914]: Disconnecting invalid user xrp 185.217.1.246 port 13762: Change of username or service not allowed: (xrp,ssh-connection) -> (william,ssh-connection) [preauth]
Sep 30 20:44:24 compute-1 sshd-session[27916]: Invalid user william from 185.217.1.246 port 54659
Sep 30 20:44:24 compute-1 sshd-session[27916]: Disconnecting invalid user william 185.217.1.246 port 54659: Change of username or service not allowed: (william,ssh-connection) -> (bnb,ssh-connection) [preauth]
Sep 30 20:44:27 compute-1 sshd-session[27918]: Invalid user bnb from 185.217.1.246 port 32573
Sep 30 20:44:27 compute-1 sshd-session[27918]: Disconnecting invalid user bnb 185.217.1.246 port 32573: Change of username or service not allowed: (bnb,ssh-connection) -> (web3,ssh-connection) [preauth]
Sep 30 20:44:34 compute-1 sshd-session[27920]: Invalid user web3 from 185.217.1.246 port 11688
Sep 30 20:44:34 compute-1 sshd-session[27920]: Disconnecting invalid user web3 185.217.1.246 port 11688: Change of username or service not allowed: (web3,ssh-connection) -> (user,ssh-connection) [preauth]
Sep 30 20:44:36 compute-1 sshd-session[27922]: Invalid user user from 185.217.1.246 port 42515
Sep 30 20:44:41 compute-1 sshd-session[27922]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 42515 ssh2 [preauth]
Sep 30 20:44:41 compute-1 sshd-session[27922]: Disconnecting invalid user user 185.217.1.246 port 42515: Too many authentication failures [preauth]
Sep 30 20:44:44 compute-1 sshd-session[27924]: Invalid user user from 185.217.1.246 port 49082
Sep 30 20:44:50 compute-1 sshd[1006]: Timeout before authentication for connection from 94.180.130.38 to 38.102.83.50, pid = 27874
Sep 30 20:44:50 compute-1 sshd-session[27924]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 49082 ssh2 [preauth]
Sep 30 20:44:50 compute-1 sshd-session[27924]: Disconnecting invalid user user 185.217.1.246 port 49082: Too many authentication failures [preauth]
Sep 30 20:44:53 compute-1 sshd-session[27926]: Invalid user user from 185.217.1.246 port 42355
Sep 30 20:44:58 compute-1 sshd-session[27926]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 42355 ssh2 [preauth]
Sep 30 20:44:58 compute-1 sshd-session[27926]: Disconnecting invalid user user 185.217.1.246 port 42355: Too many authentication failures [preauth]
Sep 30 20:45:03 compute-1 sshd-session[27928]: Invalid user user from 185.217.1.246 port 56207
Sep 30 20:45:05 compute-1 sshd-session[27928]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 56207 ssh2 [preauth]
Sep 30 20:45:05 compute-1 sshd-session[27928]: Disconnecting invalid user user 185.217.1.246 port 56207: Too many authentication failures [preauth]
Sep 30 20:45:08 compute-1 sshd-session[27930]: Invalid user user from 185.217.1.246 port 32967
Sep 30 20:45:13 compute-1 sshd-session[27930]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 32967 ssh2 [preauth]
Sep 30 20:45:13 compute-1 sshd-session[27930]: Disconnecting invalid user user 185.217.1.246 port 32967: Too many authentication failures [preauth]
Sep 30 20:45:17 compute-1 sshd-session[27932]: Invalid user user from 185.217.1.246 port 47312
Sep 30 20:45:21 compute-1 sshd-session[27932]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 47312 ssh2 [preauth]
Sep 30 20:45:21 compute-1 sshd-session[27932]: Disconnecting invalid user user 185.217.1.246 port 47312: Too many authentication failures [preauth]
Sep 30 20:45:25 compute-1 sshd-session[27935]: Invalid user user from 185.217.1.246 port 42176
Sep 30 20:45:26 compute-1 sshd-session[27935]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 42176 ssh2 [preauth]
Sep 30 20:45:26 compute-1 sshd-session[27935]: Disconnecting invalid user user 185.217.1.246 port 42176: Too many authentication failures [preauth]
Sep 30 20:45:27 compute-1 sshd-session[27937]: Invalid user user from 185.217.1.246 port 7784
Sep 30 20:45:30 compute-1 sshd-session[27937]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 7784 ssh2 [preauth]
Sep 30 20:45:30 compute-1 sshd-session[27937]: Disconnecting invalid user user 185.217.1.246 port 7784: Too many authentication failures [preauth]
Sep 30 20:45:34 compute-1 sshd-session[27939]: Invalid user user from 185.217.1.246 port 46083
Sep 30 20:45:36 compute-1 sshd-session[27939]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 46083 ssh2 [preauth]
Sep 30 20:45:36 compute-1 sshd-session[27939]: Disconnecting invalid user user 185.217.1.246 port 46083: Too many authentication failures [preauth]
Sep 30 20:45:38 compute-1 sshd-session[27941]: Invalid user user from 185.217.1.246 port 29595
Sep 30 20:45:43 compute-1 sshd-session[27943]: Accepted publickey for zuul from 192.168.122.30 port 59312 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:45:43 compute-1 systemd-logind[793]: New session 9 of user zuul.
Sep 30 20:45:43 compute-1 systemd[1]: Started Session 9 of User zuul.
Sep 30 20:45:43 compute-1 sshd-session[27943]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:45:45 compute-1 python3.9[28096]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:45:46 compute-1 sudo[28275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdfmaiwbtmbzuylahmrmmwromhhpnbyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265145.6709702-62-233092993777981/AnsiballZ_command.py'
Sep 30 20:45:46 compute-1 sudo[28275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:45:46 compute-1 python3.9[28277]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:45:47 compute-1 sshd-session[27941]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 29595 ssh2 [preauth]
Sep 30 20:45:47 compute-1 sshd-session[27941]: Disconnecting invalid user user 185.217.1.246 port 29595: Too many authentication failures [preauth]
Sep 30 20:45:53 compute-1 sshd-session[28290]: Invalid user user from 185.217.1.246 port 50351
Sep 30 20:45:53 compute-1 sudo[28275]: pam_unix(sudo:session): session closed for user root
Sep 30 20:45:53 compute-1 sshd-session[27946]: Connection closed by 192.168.122.30 port 59312
Sep 30 20:45:53 compute-1 sshd-session[27943]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:45:53 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Sep 30 20:45:53 compute-1 systemd[1]: session-9.scope: Consumed 8.220s CPU time.
Sep 30 20:45:53 compute-1 systemd-logind[793]: Session 9 logged out. Waiting for processes to exit.
Sep 30 20:45:53 compute-1 systemd-logind[793]: Removed session 9.
Sep 30 20:45:56 compute-1 sshd-session[28290]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 50351 ssh2 [preauth]
Sep 30 20:45:56 compute-1 sshd-session[28290]: Disconnecting invalid user user 185.217.1.246 port 50351: Too many authentication failures [preauth]
Sep 30 20:46:05 compute-1 sshd-session[28336]: Invalid user user from 185.217.1.246 port 5954
Sep 30 20:46:07 compute-1 sshd-session[28336]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 5954 ssh2 [preauth]
Sep 30 20:46:07 compute-1 sshd-session[28336]: Disconnecting invalid user user 185.217.1.246 port 5954: Too many authentication failures [preauth]
Sep 30 20:46:10 compute-1 sshd-session[28340]: Accepted publickey for zuul from 192.168.122.30 port 32834 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:46:10 compute-1 systemd-logind[793]: New session 10 of user zuul.
Sep 30 20:46:10 compute-1 systemd[1]: Started Session 10 of User zuul.
Sep 30 20:46:10 compute-1 sshd-session[28340]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:46:11 compute-1 python3.9[28493]: ansible-ansible.legacy.ping Invoked with data=pong
Sep 30 20:46:12 compute-1 sshd-session[28338]: Invalid user user from 185.217.1.246 port 27610
Sep 30 20:46:12 compute-1 python3.9[28667]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:13 compute-1 sudo[28817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkutqfhmaodguqasjymcgepngtyuzbkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265172.8060484-99-41530835209169/AnsiballZ_command.py'
Sep 30 20:46:13 compute-1 sudo[28817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:13 compute-1 python3.9[28819]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:46:13 compute-1 sudo[28817]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:13 compute-1 sshd-session[28338]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 27610 ssh2 [preauth]
Sep 30 20:46:13 compute-1 sshd-session[28338]: Disconnecting invalid user user 185.217.1.246 port 27610: Too many authentication failures [preauth]
Sep 30 20:46:14 compute-1 sudo[28972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgeeedaxcklclyxaihcmlnrpotxfqgox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265173.9074667-135-224850275726055/AnsiballZ_stat.py'
Sep 30 20:46:14 compute-1 sudo[28972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:14 compute-1 python3.9[28974]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:46:14 compute-1 sudo[28972]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:15 compute-1 sudo[29124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohyaregkqrjlcnfililuqldemkumzgxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265174.8839443-159-151095711096270/AnsiballZ_file.py'
Sep 30 20:46:15 compute-1 sudo[29124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:15 compute-1 python3.9[29126]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:46:15 compute-1 sudo[29124]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:16 compute-1 sudo[29276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwafwnwjkyezxkvsarxsostfcvsulgtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265175.811807-183-172816605807641/AnsiballZ_stat.py'
Sep 30 20:46:16 compute-1 sudo[29276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:16 compute-1 python3.9[29278]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:46:16 compute-1 sudo[29276]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:16 compute-1 sshd-session[28927]: Invalid user user from 185.217.1.246 port 7083
Sep 30 20:46:17 compute-1 sudo[29399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noutyghykoiwflxymojbwlfcijegeerm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265175.811807-183-172816605807641/AnsiballZ_copy.py'
Sep 30 20:46:17 compute-1 sudo[29399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:17 compute-1 python3.9[29401]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265175.811807-183-172816605807641/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:46:17 compute-1 sudo[29399]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:17 compute-1 sudo[29551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoyhgugbjmhdkmfqwrjuhxmkpznsnuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265177.475149-228-69485355608279/AnsiballZ_setup.py'
Sep 30 20:46:17 compute-1 sudo[29551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:18 compute-1 python3.9[29553]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:18 compute-1 sudo[29551]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:18 compute-1 sshd-session[28927]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 7083 ssh2 [preauth]
Sep 30 20:46:18 compute-1 sshd-session[28927]: Disconnecting invalid user user 185.217.1.246 port 7083: Too many authentication failures [preauth]
Sep 30 20:46:18 compute-1 sudo[29707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcfvlogboqofaosdzhsicybhnneqcoga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265178.6341288-252-235260109500879/AnsiballZ_file.py'
Sep 30 20:46:18 compute-1 sudo[29707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:19 compute-1 python3.9[29709]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:46:19 compute-1 sudo[29707]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:20 compute-1 python3.9[29861]: ansible-ansible.builtin.service_facts Invoked
Sep 30 20:46:20 compute-1 sshd-session[29710]: Invalid user user from 185.217.1.246 port 49298
Sep 30 20:46:26 compute-1 sshd-session[29710]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 49298 ssh2 [preauth]
Sep 30 20:46:26 compute-1 sshd-session[29710]: Disconnecting invalid user user 185.217.1.246 port 49298: Too many authentication failures [preauth]
Sep 30 20:46:27 compute-1 python3.9[30118]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:46:28 compute-1 sshd-session[30066]: Invalid user user from 185.217.1.246 port 55828
Sep 30 20:46:28 compute-1 python3.9[30268]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:30 compute-1 python3.9[30422]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:30 compute-1 sudo[30578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gurndwfzbhesmokqbvzvlmegyxufypci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265190.652117-396-126984541943447/AnsiballZ_setup.py'
Sep 30 20:46:30 compute-1 sudo[30578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:31 compute-1 python3.9[30580]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:46:31 compute-1 sudo[30578]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:31 compute-1 sudo[30662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxtkeraxtpndtbwmswqvwnfmwoenzihx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265190.652117-396-126984541943447/AnsiballZ_dnf.py'
Sep 30 20:46:31 compute-1 sudo[30662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:32 compute-1 sshd-session[30066]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 55828 ssh2 [preauth]
Sep 30 20:46:32 compute-1 sshd-session[30066]: Disconnecting invalid user user 185.217.1.246 port 55828: Too many authentication failures [preauth]
Sep 30 20:46:32 compute-1 python3.9[30664]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:46:34 compute-1 sshd-session[30666]: Invalid user user from 185.217.1.246 port 29742
Sep 30 20:46:37 compute-1 sshd-session[30666]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 29742 ssh2 [preauth]
Sep 30 20:46:37 compute-1 sshd-session[30666]: Disconnecting invalid user user 185.217.1.246 port 29742: Too many authentication failures [preauth]
Sep 30 20:46:38 compute-1 sshd-session[30741]: Invalid user user from 185.217.1.246 port 10333
Sep 30 20:46:40 compute-1 sshd-session[30741]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 10333 ssh2 [preauth]
Sep 30 20:46:40 compute-1 sshd-session[30741]: Disconnecting invalid user user 185.217.1.246 port 10333: Too many authentication failures [preauth]
Sep 30 20:46:46 compute-1 sshd-session[30763]: Invalid user user from 185.217.1.246 port 44687
Sep 30 20:46:49 compute-1 sshd-session[30763]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 44687 ssh2 [preauth]
Sep 30 20:46:49 compute-1 sshd-session[30763]: Disconnecting invalid user user 185.217.1.246 port 44687: Too many authentication failures [preauth]
Sep 30 20:46:52 compute-1 sshd-session[30765]: Invalid user user from 185.217.1.246 port 51837
Sep 30 20:46:54 compute-1 sshd-session[30765]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 51837 ssh2 [preauth]
Sep 30 20:46:54 compute-1 sshd-session[30765]: Disconnecting invalid user user 185.217.1.246 port 51837: Too many authentication failures [preauth]
Sep 30 20:46:57 compute-1 sshd-session[30767]: Invalid user user from 185.217.1.246 port 19279
Sep 30 20:47:03 compute-1 sshd-session[30767]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 19279 ssh2 [preauth]
Sep 30 20:47:03 compute-1 sshd-session[30767]: Disconnecting invalid user user 185.217.1.246 port 19279: Too many authentication failures [preauth]
Sep 30 20:47:04 compute-1 sshd-session[30769]: Invalid user user from 185.217.1.246 port 30004
Sep 30 20:47:08 compute-1 sshd-session[30769]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 30004 ssh2 [preauth]
Sep 30 20:47:08 compute-1 sshd-session[30769]: Disconnecting invalid user user 185.217.1.246 port 30004: Too many authentication failures [preauth]
Sep 30 20:47:10 compute-1 sshd-session[30782]: Invalid user user from 185.217.1.246 port 12583
Sep 30 20:47:17 compute-1 sshd-session[30782]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 12583 ssh2 [preauth]
Sep 30 20:47:17 compute-1 sshd-session[30782]: Disconnecting invalid user user 185.217.1.246 port 12583: Too many authentication failures [preauth]
Sep 30 20:47:26 compute-1 sshd-session[30853]: Invalid user user from 185.217.1.246 port 28916
Sep 30 20:47:31 compute-1 sshd-session[30853]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 28916 ssh2 [preauth]
Sep 30 20:47:31 compute-1 sshd-session[30853]: Disconnecting invalid user user 185.217.1.246 port 28916: Too many authentication failures [preauth]
Sep 30 20:47:38 compute-1 sshd-session[30888]: Invalid user user from 185.217.1.246 port 37008
Sep 30 20:47:39 compute-1 sshd-session[30888]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 37008 ssh2 [preauth]
Sep 30 20:47:39 compute-1 sshd-session[30888]: Disconnecting invalid user user 185.217.1.246 port 37008: Too many authentication failures [preauth]
Sep 30 20:47:45 compute-1 sshd-session[30890]: Invalid user user from 185.217.1.246 port 10537
Sep 30 20:47:48 compute-1 systemd[1]: Reloading.
Sep 30 20:47:48 compute-1 systemd-rc-local-generator[30945]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:47:48 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Sep 30 20:47:49 compute-1 systemd[1]: Reloading.
Sep 30 20:47:49 compute-1 systemd-rc-local-generator[30983]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:47:49 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Sep 30 20:47:49 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Sep 30 20:47:49 compute-1 systemd[1]: Reloading.
Sep 30 20:47:49 compute-1 systemd-rc-local-generator[31025]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:47:49 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Sep 30 20:47:49 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 20:47:49 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 20:47:49 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 20:47:51 compute-1 sshd-session[30890]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 10537 ssh2 [preauth]
Sep 30 20:47:51 compute-1 sshd-session[30890]: Disconnecting invalid user user 185.217.1.246 port 10537: Too many authentication failures [preauth]
Sep 30 20:47:56 compute-1 sshd-session[31066]: Invalid user Sujan from 194.0.234.19 port 16866
Sep 30 20:47:57 compute-1 sshd-session[31066]: Connection closed by invalid user Sujan 194.0.234.19 port 16866 [preauth]
Sep 30 20:47:59 compute-1 sshd-session[31053]: Invalid user user from 185.217.1.246 port 57003
Sep 30 20:48:00 compute-1 sshd-session[31053]: Disconnecting invalid user user 185.217.1.246 port 57003: Change of username or service not allowed: (user,ssh-connection) -> (eth,ssh-connection) [preauth]
Sep 30 20:48:04 compute-1 sshd-session[31084]: Invalid user eth from 185.217.1.246 port 62459
Sep 30 20:48:05 compute-1 sshd-session[31084]: Disconnecting invalid user eth 185.217.1.246 port 62459: Change of username or service not allowed: (eth,ssh-connection) -> (cs2sv,ssh-connection) [preauth]
Sep 30 20:48:10 compute-1 sshd-session[31097]: Invalid user cs2sv from 185.217.1.246 port 43792
Sep 30 20:48:11 compute-1 sshd-session[31097]: Disconnecting invalid user cs2sv 185.217.1.246 port 43792: Change of username or service not allowed: (cs2sv,ssh-connection) -> (mysql,ssh-connection) [preauth]
Sep 30 20:48:16 compute-1 sshd-session[31113]: Invalid user mysql from 185.217.1.246 port 33128
Sep 30 20:48:17 compute-1 sshd-session[31113]: Disconnecting invalid user mysql 185.217.1.246 port 33128: Change of username or service not allowed: (mysql,ssh-connection) -> (user13,ssh-connection) [preauth]
Sep 30 20:48:23 compute-1 sshd-session[31132]: Invalid user user13 from 185.217.1.246 port 64809
Sep 30 20:48:26 compute-1 sshd-session[31132]: Disconnecting invalid user user13 185.217.1.246 port 64809: Change of username or service not allowed: (user13,ssh-connection) -> (user123,ssh-connection) [preauth]
Sep 30 20:48:30 compute-1 sshd-session[31165]: Invalid user user123 from 185.217.1.246 port 20109
Sep 30 20:48:31 compute-1 sshd-session[31165]: Disconnecting invalid user user123 185.217.1.246 port 20109: Change of username or service not allowed: (user123,ssh-connection) -> (postgres,ssh-connection) [preauth]
Sep 30 20:48:35 compute-1 sshd-session[31184]: Invalid user postgres from 185.217.1.246 port 60784
Sep 30 20:48:37 compute-1 sshd-session[31184]: Disconnecting invalid user postgres 185.217.1.246 port 60784: Change of username or service not allowed: (postgres,ssh-connection) -> (ethos,ssh-connection) [preauth]
Sep 30 20:48:39 compute-1 sshd-session[31201]: Invalid user ethos from 185.217.1.246 port 35018
Sep 30 20:48:40 compute-1 sshd-session[31201]: Disconnecting invalid user ethos 185.217.1.246 port 35018: Change of username or service not allowed: (ethos,ssh-connection) -> (rob,ssh-connection) [preauth]
Sep 30 20:48:44 compute-1 sshd-session[31211]: Invalid user rob from 185.217.1.246 port 61798
Sep 30 20:48:44 compute-1 sshd-session[31211]: Connection closed by invalid user rob 185.217.1.246 port 61798 [preauth]
Sep 30 20:49:07 compute-1 kernel: SELinux:  Converting 2713 SID table entries...
Sep 30 20:49:07 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:49:07 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 20:49:07 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:49:07 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:49:07 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:49:07 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:49:07 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:49:07 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Sep 30 20:49:07 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:49:07 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:49:07 compute-1 systemd[1]: Reloading.
Sep 30 20:49:07 compute-1 systemd-rc-local-generator[31359]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:49:08 compute-1 systemd[1]: Starting dnf makecache...
Sep 30 20:49:08 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:49:08 compute-1 dnf[31408]: Failed determining last makecache time.
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-barbican-42b4c41831408a8e323 113 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 systemd[1]: Starting PackageKit Daemon...
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 161 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 PackageKit[31666]: daemon start
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-cinder-1c00d6490d88e436f26ef 163 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-python-stevedore-c4acc5639fd2329372142 172 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 systemd[1]: Started PackageKit Daemon.
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-python-cloudkitty-tests-tempest-3961dc 151 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-os-net-config-a7aafa88064e25852eddee77 185 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 141 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-python-designate-tests-tempest-347fdbc 148 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-glance-1fd12c29b339f30fe823e 153 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 149 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-manila-3c01b7181572c95dac462 153 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-python-whitebox-neutron-tests-tempest- 155 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-octavia-ba397f07a7331190208c 160 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 sudo[30662]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-watcher-c014f81a8647287f6dcc 165 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-python-tcib-c895740e59940c0bad2e206b0f 157 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 151 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-swift-dc98a8463506ac520c469a 159 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-python-tempestconf-8515371b7cceebd4282 161 kB/s | 3.0 kB     00:00
Sep 30 20:49:08 compute-1 dnf[31408]: delorean-openstack-heat-ui-013accbfd179753bc3f0 164 kB/s | 3.0 kB     00:00
Sep 30 20:49:09 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:49:09 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:49:09 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.274s CPU time.
Sep 30 20:49:09 compute-1 systemd[1]: run-rf121cb05c2a54631b962d71aaa1489d6.service: Deactivated successfully.
Sep 30 20:49:09 compute-1 dnf[31408]: CentOS Stream 9 - BaseOS                         26 kB/s | 7.0 kB     00:00
Sep 30 20:49:09 compute-1 dnf[31408]: CentOS Stream 9 - AppStream                      68 kB/s | 7.1 kB     00:00
Sep 30 20:49:09 compute-1 sudo[32297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewrtluvzorbnxopuzovfewhfpcfkvmoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265348.870903-432-160901423582405/AnsiballZ_command.py'
Sep 30 20:49:09 compute-1 sudo[32297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:09 compute-1 python3.9[32299]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:49:09 compute-1 dnf[31408]: CentOS Stream 9 - CRB                            25 kB/s | 6.9 kB     00:00
Sep 30 20:49:09 compute-1 dnf[31408]: CentOS Stream 9 - Extras packages                55 kB/s | 8.0 kB     00:00
Sep 30 20:49:09 compute-1 dnf[31408]: dlrn-antelope-testing                           147 kB/s | 3.0 kB     00:00
Sep 30 20:49:09 compute-1 dnf[31408]: dlrn-antelope-build-deps                        144 kB/s | 3.0 kB     00:00
Sep 30 20:49:09 compute-1 dnf[31408]: centos9-rabbitmq                                 80 kB/s | 3.0 kB     00:00
Sep 30 20:49:09 compute-1 dnf[31408]: centos9-storage                                 102 kB/s | 3.0 kB     00:00
Sep 30 20:49:09 compute-1 dnf[31408]: centos9-opstools                                108 kB/s | 3.0 kB     00:00
Sep 30 20:49:10 compute-1 dnf[31408]: NFV SIG OpenvSwitch                              83 kB/s | 3.0 kB     00:00
Sep 30 20:49:10 compute-1 dnf[31408]: repo-setup-centos-appstream                      82 kB/s | 4.4 kB     00:00
Sep 30 20:49:10 compute-1 dnf[31408]: repo-setup-centos-baseos                        151 kB/s | 3.9 kB     00:00
Sep 30 20:49:10 compute-1 dnf[31408]: repo-setup-centos-highavailability              137 kB/s | 3.9 kB     00:00
Sep 30 20:49:10 compute-1 dnf[31408]: repo-setup-centos-powertools                    132 kB/s | 4.3 kB     00:00
Sep 30 20:49:10 compute-1 sudo[32297]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:10 compute-1 dnf[31408]: Extra Packages for Enterprise Linux 9 - x86_64   99 kB/s |  35 kB     00:00
Sep 30 20:49:11 compute-1 dnf[31408]: Metadata cache created.
Sep 30 20:49:11 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Sep 30 20:49:11 compute-1 systemd[1]: Finished dnf makecache.
Sep 30 20:49:11 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.976s CPU time.
Sep 30 20:49:11 compute-1 sudo[32600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhgkvyduclyyjkrmcggglgznvmfhluib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265350.9073539-456-164063702946440/AnsiballZ_selinux.py'
Sep 30 20:49:11 compute-1 sudo[32600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:11 compute-1 python3.9[32602]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Sep 30 20:49:11 compute-1 sudo[32600]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:12 compute-1 sudo[32752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsuiyuszxrpwoxsfulntktmaikgeyjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265352.4907503-489-269699471666416/AnsiballZ_command.py'
Sep 30 20:49:12 compute-1 sudo[32752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:13 compute-1 python3.9[32754]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Sep 30 20:49:14 compute-1 sudo[32752]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:14 compute-1 sudo[32906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idzzvnogxkgfxkpjohmeblytmomykybm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265354.3967762-513-40127153282015/AnsiballZ_file.py'
Sep 30 20:49:14 compute-1 sudo[32906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:15 compute-1 python3.9[32908]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:49:15 compute-1 sudo[32906]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:18 compute-1 sudo[33058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhycsmibswncskfmvjtsdkzakzyesacz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265358.2092676-537-99632465072088/AnsiballZ_mount.py'
Sep 30 20:49:18 compute-1 sudo[33058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:18 compute-1 python3.9[33060]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Sep 30 20:49:19 compute-1 sudo[33058]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:20 compute-1 sudo[33211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxpkbctekkqkavmrayhiblrxsjtiucfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265359.9857574-622-97875778810813/AnsiballZ_file.py'
Sep 30 20:49:20 compute-1 sudo[33211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:22 compute-1 python3.9[33213]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:22 compute-1 sudo[33211]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:23 compute-1 sudo[33363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reaxgfruuvfkhkcqhpehjuywhkowaocc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265362.9491103-646-127511317722451/AnsiballZ_stat.py'
Sep 30 20:49:23 compute-1 sudo[33363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:23 compute-1 python3.9[33365]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:49:23 compute-1 sudo[33363]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:23 compute-1 sudo[33486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehctyaifxoaqhxtowlfmdygxhnbzqtrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265362.9491103-646-127511317722451/AnsiballZ_copy.py'
Sep 30 20:49:23 compute-1 sudo[33486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:24 compute-1 python3.9[33488]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265362.9491103-646-127511317722451/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:49:24 compute-1 sudo[33486]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:25 compute-1 sudo[33638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stycfzzzdjlrbvtmhkohfybljqstbwsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265365.2239592-726-83739939261615/AnsiballZ_getent.py'
Sep 30 20:49:25 compute-1 sudo[33638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:25 compute-1 python3.9[33640]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Sep 30 20:49:25 compute-1 sudo[33638]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:26 compute-1 sudo[33791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkhodgxwfeaorziezztupcvfejbwupeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265366.1174934-750-149076939607690/AnsiballZ_group.py'
Sep 30 20:49:26 compute-1 sudo[33791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:26 compute-1 python3.9[33793]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 20:49:26 compute-1 groupadd[33794]: group added to /etc/group: name=qemu, GID=107
Sep 30 20:49:26 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:49:26 compute-1 groupadd[33794]: group added to /etc/gshadow: name=qemu
Sep 30 20:49:26 compute-1 groupadd[33794]: new group: name=qemu, GID=107
Sep 30 20:49:26 compute-1 sudo[33791]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:28 compute-1 sudo[33950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btumaaseqndvszhukavfqywachbjxqli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265367.6461313-774-184806317010560/AnsiballZ_user.py'
Sep 30 20:49:28 compute-1 sudo[33950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:28 compute-1 python3.9[33952]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 20:49:28 compute-1 useradd[33954]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 20:49:28 compute-1 sudo[33950]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:29 compute-1 sudo[34110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izuapuuvwqfpxqkymisuftgrtdwzdidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265368.7493815-798-233125575225865/AnsiballZ_getent.py'
Sep 30 20:49:29 compute-1 sudo[34110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:29 compute-1 python3.9[34112]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Sep 30 20:49:29 compute-1 sudo[34110]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:29 compute-1 sudo[34263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrunvvdfmaduzcutxyjvllbsidkvjujm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265369.5717268-822-65007422697061/AnsiballZ_group.py'
Sep 30 20:49:29 compute-1 sudo[34263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:30 compute-1 python3.9[34265]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 20:49:30 compute-1 groupadd[34266]: group added to /etc/group: name=hugetlbfs, GID=42477
Sep 30 20:49:30 compute-1 groupadd[34266]: group added to /etc/gshadow: name=hugetlbfs
Sep 30 20:49:30 compute-1 groupadd[34266]: new group: name=hugetlbfs, GID=42477
Sep 30 20:49:30 compute-1 sudo[34263]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:30 compute-1 sudo[34421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwcmksopyicebtcrpcdzjfiazecggkwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265370.4654377-849-103799627583894/AnsiballZ_file.py'
Sep 30 20:49:30 compute-1 sudo[34421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:31 compute-1 python3.9[34423]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Sep 30 20:49:31 compute-1 sudo[34421]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:31 compute-1 sudo[34573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oidavdfbtjhdpstfsjsoakjuoavmnzxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265371.5313761-882-274513095994641/AnsiballZ_dnf.py'
Sep 30 20:49:31 compute-1 sudo[34573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:32 compute-1 python3.9[34575]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:49:33 compute-1 sudo[34573]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:37 compute-1 sudo[34726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcrgeeagjsnyqctgtjzhulksgbdenbqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265377.0307932-906-175780280295003/AnsiballZ_file.py'
Sep 30 20:49:37 compute-1 sudo[34726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:37 compute-1 python3.9[34728]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:37 compute-1 sudo[34726]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:38 compute-1 sudo[34878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjluqhycahmpjgjqpyjzwawxjxtqijji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265378.037124-930-81242584517005/AnsiballZ_stat.py'
Sep 30 20:49:38 compute-1 sudo[34878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:38 compute-1 python3.9[34880]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:49:38 compute-1 sudo[34878]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:39 compute-1 sudo[35001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thqfklzthdgfpsudiodogvdubhocexft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265378.037124-930-81242584517005/AnsiballZ_copy.py'
Sep 30 20:49:39 compute-1 sudo[35001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:39 compute-1 python3.9[35003]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265378.037124-930-81242584517005/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:39 compute-1 sudo[35001]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:40 compute-1 sudo[35153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-facmjjaimbkddyewjriqxsddhywmnonv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265379.5363505-975-55805429626859/AnsiballZ_systemd.py'
Sep 30 20:49:40 compute-1 sudo[35153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:40 compute-1 python3.9[35155]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:49:40 compute-1 systemd[1]: Starting Load Kernel Modules...
Sep 30 20:49:40 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 30 20:49:40 compute-1 kernel: Bridge firewalling registered
Sep 30 20:49:40 compute-1 systemd-modules-load[35159]: Inserted module 'br_netfilter'
Sep 30 20:49:40 compute-1 systemd[1]: Finished Load Kernel Modules.
Sep 30 20:49:40 compute-1 sudo[35153]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:41 compute-1 sudo[35313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyhcoijqwjglopcjkwluyiaqxibbijvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265380.9254398-999-139336229796842/AnsiballZ_stat.py'
Sep 30 20:49:41 compute-1 sudo[35313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:41 compute-1 python3.9[35315]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:49:41 compute-1 sudo[35313]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:42 compute-1 sudo[35436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ninoyeraihkyzbxugwyqzadgwvblawit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265380.9254398-999-139336229796842/AnsiballZ_copy.py'
Sep 30 20:49:42 compute-1 sudo[35436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:42 compute-1 python3.9[35438]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265380.9254398-999-139336229796842/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:42 compute-1 sudo[35436]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:43 compute-1 sudo[35588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpvfzrbnpvbhmhnhlhoudvsnpeozrcdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265382.8091838-1053-30446076933714/AnsiballZ_dnf.py'
Sep 30 20:49:43 compute-1 sudo[35588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:43 compute-1 python3.9[35590]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:49:47 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 20:49:47 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 20:49:48 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:49:48 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:49:48 compute-1 systemd[1]: Reloading.
Sep 30 20:49:48 compute-1 systemd-rc-local-generator[35663]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:49:48 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:49:48 compute-1 sudo[35588]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:51 compute-1 python3.9[38212]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:49:52 compute-1 python3.9[39118]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Sep 30 20:49:52 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:49:52 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:49:52 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.792s CPU time.
Sep 30 20:49:52 compute-1 systemd[1]: run-r2c8d2b2fd3724a54a7a54326b118486c.service: Deactivated successfully.
Sep 30 20:49:53 compute-1 python3.9[39617]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:49:53 compute-1 sudo[39768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgobnfyzsogejvmtidkyopjvzdmfdccr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265393.4009356-1170-212428264641638/AnsiballZ_command.py'
Sep 30 20:49:53 compute-1 sudo[39768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:54 compute-1 python3.9[39770]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:49:54 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 20:49:54 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 20:49:54 compute-1 sudo[39768]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:55 compute-1 sudo[40141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwmgjiokmcbhcywanhzrktwseibionsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265395.261337-1197-130878047959734/AnsiballZ_systemd.py'
Sep 30 20:49:55 compute-1 sudo[40141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:55 compute-1 python3.9[40143]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:49:56 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Sep 30 20:49:56 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Sep 30 20:49:56 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Sep 30 20:49:56 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 20:49:56 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 20:49:56 compute-1 sudo[40141]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:57 compute-1 python3.9[40305]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Sep 30 20:50:00 compute-1 sudo[40455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtrrbtknabrvypmqwulbknmswyrnkamv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265399.8697307-1368-253629879880149/AnsiballZ_systemd.py'
Sep 30 20:50:00 compute-1 sudo[40455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:00 compute-1 python3.9[40457]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:50:00 compute-1 systemd[1]: Reloading.
Sep 30 20:50:00 compute-1 systemd-rc-local-generator[40488]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:50:00 compute-1 sudo[40455]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:01 compute-1 sudo[40644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esoxzixxnhjcbwmdtxxzdwibdksbceqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265401.0847023-1368-136427629423799/AnsiballZ_systemd.py'
Sep 30 20:50:01 compute-1 sudo[40644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:01 compute-1 python3.9[40646]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:50:01 compute-1 systemd[1]: Reloading.
Sep 30 20:50:01 compute-1 systemd-rc-local-generator[40676]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:50:02 compute-1 sudo[40644]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:02 compute-1 sudo[40833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ollirhchbtyntfrdumuhmuavfofrvfxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265402.4149272-1416-121384830534011/AnsiballZ_command.py'
Sep 30 20:50:02 compute-1 sudo[40833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:02 compute-1 python3.9[40835]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:02 compute-1 sudo[40833]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:03 compute-1 sudo[40986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgzsljchccemvcwlojstbwkumksooayf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265403.236909-1440-178750920799233/AnsiballZ_command.py'
Sep 30 20:50:03 compute-1 sudo[40986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:03 compute-1 python3.9[40988]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:03 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Sep 30 20:50:03 compute-1 sudo[40986]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:04 compute-1 sudo[41139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yozgwegdocduxqfexsxcnnrnkctsubll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265404.1000683-1464-139020334717268/AnsiballZ_command.py'
Sep 30 20:50:04 compute-1 sudo[41139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:04 compute-1 python3.9[41141]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:06 compute-1 sudo[41139]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:06 compute-1 sudo[41301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioaryggtrkbfexatcdbhnbbpiidnodfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265406.5786786-1488-157653055702656/AnsiballZ_command.py'
Sep 30 20:50:06 compute-1 sudo[41301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:07 compute-1 python3.9[41303]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:07 compute-1 sudo[41301]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:07 compute-1 sudo[41454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdnywjequyyjfdiarqmgfmtcqxupodcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265407.387425-1512-91854850805739/AnsiballZ_systemd.py'
Sep 30 20:50:07 compute-1 sudo[41454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:08 compute-1 python3.9[41456]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:50:08 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 20:50:08 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Sep 30 20:50:08 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Sep 30 20:50:08 compute-1 systemd[1]: Starting Apply Kernel Variables...
Sep 30 20:50:08 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 20:50:08 compute-1 systemd[1]: Finished Apply Kernel Variables.
Sep 30 20:50:08 compute-1 sudo[41454]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:08 compute-1 sshd-session[28343]: Connection closed by 192.168.122.30 port 32834
Sep 30 20:50:08 compute-1 sshd-session[28340]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:50:08 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Sep 30 20:50:08 compute-1 systemd[1]: session-10.scope: Consumed 2min 22.724s CPU time.
Sep 30 20:50:08 compute-1 systemd-logind[793]: Session 10 logged out. Waiting for processes to exit.
Sep 30 20:50:08 compute-1 systemd-logind[793]: Removed session 10.
Sep 30 20:50:13 compute-1 sshd-session[41487]: Accepted publickey for zuul from 192.168.122.30 port 57238 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:50:13 compute-1 systemd-logind[793]: New session 11 of user zuul.
Sep 30 20:50:13 compute-1 systemd[1]: Started Session 11 of User zuul.
Sep 30 20:50:13 compute-1 sshd-session[41487]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:50:15 compute-1 python3.9[41640]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:16 compute-1 python3.9[41794]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:17 compute-1 sudo[41948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdmpjyhihqeautfoumqadyihpqehzdyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265417.3546183-116-188595213762431/AnsiballZ_command.py'
Sep 30 20:50:17 compute-1 sudo[41948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:18 compute-1 python3.9[41950]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:18 compute-1 sudo[41948]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:19 compute-1 python3.9[42101]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:20 compute-1 sudo[42255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hulpxvpbuawwbfgjpiwxqmztwzmabchq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265419.7213616-176-279978986123896/AnsiballZ_setup.py'
Sep 30 20:50:20 compute-1 sudo[42255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:20 compute-1 python3.9[42257]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:50:20 compute-1 sudo[42255]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:21 compute-1 sudo[42339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahqdhyuxqbqyqlvbwktcbfxrxtqthvlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265419.7213616-176-279978986123896/AnsiballZ_dnf.py'
Sep 30 20:50:21 compute-1 sudo[42339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:21 compute-1 python3.9[42341]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:50:22 compute-1 sudo[42339]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:23 compute-1 sudo[42492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okqvpqkqaopqhsxpfdpawsdozgocpocj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265422.7383442-212-250671041685677/AnsiballZ_setup.py'
Sep 30 20:50:23 compute-1 sudo[42492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:23 compute-1 python3.9[42494]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:50:23 compute-1 sudo[42492]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:24 compute-1 sudo[42663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jncynjwpztwdtdcwtznqtdpfobkqazps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265423.9594586-245-104695300280010/AnsiballZ_file.py'
Sep 30 20:50:24 compute-1 sudo[42663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:24 compute-1 python3.9[42665]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:50:24 compute-1 sudo[42663]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:25 compute-1 sudo[42815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulwqnxcvgmgveuruqvqdxcokhccsifdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265424.88877-269-269790151751035/AnsiballZ_command.py'
Sep 30 20:50:25 compute-1 sudo[42815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:25 compute-1 python3.9[42817]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:25 compute-1 podman[42818]: 2025-09-30 20:50:25.473482182 +0000 UTC m=+0.054371385 system refresh
Sep 30 20:50:25 compute-1 sudo[42815]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:26 compute-1 sudo[42978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teynozorsatdoehqkurpqczbmenatbsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265425.8425207-293-153900659320453/AnsiballZ_stat.py'
Sep 30 20:50:26 compute-1 sudo[42978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:26 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:50:26 compute-1 python3.9[42980]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:50:26 compute-1 sudo[42978]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:27 compute-1 sudo[43101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mipbxjakzxkeijkjlakfqybgoxzkhotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265425.8425207-293-153900659320453/AnsiballZ_copy.py'
Sep 30 20:50:27 compute-1 sudo[43101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:27 compute-1 python3.9[43103]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265425.8425207-293-153900659320453/.source.json follow=False _original_basename=podman_network_config.j2 checksum=45f0c676be7f4939e18ffa39e23dc61e7eb7627a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:50:27 compute-1 sudo[43101]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:28 compute-1 sudo[43253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuwforhrgfpofpdxzfxmpodtcyypzjbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265427.6482337-338-29817403276951/AnsiballZ_stat.py'
Sep 30 20:50:28 compute-1 sudo[43253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:28 compute-1 python3.9[43255]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:50:28 compute-1 sudo[43253]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:28 compute-1 sudo[43376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngqunfxiwntpleotbzldmxwjkfsikfuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265427.6482337-338-29817403276951/AnsiballZ_copy.py'
Sep 30 20:50:28 compute-1 sudo[43376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:28 compute-1 python3.9[43378]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265427.6482337-338-29817403276951/.source.conf follow=False _original_basename=registries.conf.j2 checksum=74ad3fdf1c9c551f4957cab58c04bb2f8b0dc3e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:28 compute-1 sudo[43376]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:29 compute-1 sudo[43528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaflvxyhqgwluccyppokrqafgvhunqud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265429.2839136-386-114007629333669/AnsiballZ_ini_file.py'
Sep 30 20:50:29 compute-1 sudo[43528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:29 compute-1 python3.9[43530]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:30 compute-1 sudo[43528]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:30 compute-1 sudo[43680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypkfrlkjgvxncxdzgpxlsxligomnnfhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265430.2014585-386-23185110368633/AnsiballZ_ini_file.py'
Sep 30 20:50:30 compute-1 sudo[43680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:30 compute-1 python3.9[43682]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:30 compute-1 sudo[43680]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:31 compute-1 sudo[43832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmgkxtnfasotpsagakeustzssmnfdmib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265431.0420463-386-256419066824739/AnsiballZ_ini_file.py'
Sep 30 20:50:31 compute-1 sudo[43832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:31 compute-1 python3.9[43834]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:31 compute-1 sudo[43832]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:32 compute-1 sudo[43984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsftdvhemytrfwwutwkdkdevplardowq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265431.8055131-386-205395552179277/AnsiballZ_ini_file.py'
Sep 30 20:50:32 compute-1 sudo[43984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:32 compute-1 python3.9[43986]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:32 compute-1 sudo[43984]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:33 compute-1 python3.9[44136]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:34 compute-1 sudo[44288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzsqmtjnhfbxczkljgfjixuuxkchwflk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265433.6763413-506-37022953796014/AnsiballZ_dnf.py'
Sep 30 20:50:34 compute-1 sudo[44288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:34 compute-1 python3.9[44290]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:35 compute-1 sudo[44288]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:36 compute-1 sudo[44441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaivhcpnakuclgoucsvvfxabxnllqchi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265435.767548-530-255088500176293/AnsiballZ_dnf.py'
Sep 30 20:50:36 compute-1 sudo[44441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:36 compute-1 python3.9[44443]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:38 compute-1 sudo[44441]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:39 compute-1 sudo[44601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-patjntxbgzzpiteaaeeumfmlcsvaenaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265438.6945674-560-135586170553387/AnsiballZ_dnf.py'
Sep 30 20:50:39 compute-1 sudo[44601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:39 compute-1 python3.9[44603]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:40 compute-1 sudo[44601]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:41 compute-1 sudo[44754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxxtvkovlzudqkjweemgedxitonbefmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265440.9000297-587-78666481679708/AnsiballZ_dnf.py'
Sep 30 20:50:41 compute-1 sudo[44754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:41 compute-1 python3.9[44756]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:42 compute-1 sudo[44754]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:43 compute-1 sudo[44907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pckgozqaijuajmzvzyieirdvmmgbevxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265443.152543-620-117695941422979/AnsiballZ_dnf.py'
Sep 30 20:50:43 compute-1 sudo[44907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:43 compute-1 python3.9[44909]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:45 compute-1 sudo[44907]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:45 compute-1 sudo[45063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxehwiinsxhzwhhbcryuvbvobaslftcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265445.5572703-644-200180272425660/AnsiballZ_dnf.py'
Sep 30 20:50:45 compute-1 sudo[45063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:46 compute-1 python3.9[45065]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:48 compute-1 sudo[45063]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:49 compute-1 sudo[45231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syeggxijydniironpnngyfkqcjybwokv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265449.2906578-671-97463768635717/AnsiballZ_dnf.py'
Sep 30 20:50:49 compute-1 sudo[45231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:49 compute-1 python3.9[45233]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:51 compute-1 sudo[45231]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:51 compute-1 sudo[45384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blucuxopsgxfxicttchthbdmkbcjirvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265451.4462135-698-147067003151024/AnsiballZ_dnf.py'
Sep 30 20:50:51 compute-1 sudo[45384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:52 compute-1 python3.9[45386]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:51:06 compute-1 sudo[45384]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:20 compute-1 sudo[45721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baniovqoohrerzypipeccynsihukbfjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265480.1922297-731-6313313192965/AnsiballZ_file.py'
Sep 30 20:51:20 compute-1 sudo[45721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:20 compute-1 python3.9[45723]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:51:20 compute-1 sudo[45721]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:21 compute-1 sudo[45896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlrxaozbvvunnwbdfimmvduiysaupjqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265480.9448476-755-168515195735257/AnsiballZ_stat.py'
Sep 30 20:51:21 compute-1 sudo[45896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:21 compute-1 python3.9[45898]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:51:21 compute-1 sudo[45896]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:22 compute-1 sudo[46019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opstsjhwejeystytpalzamgnyoxnzizt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265480.9448476-755-168515195735257/AnsiballZ_copy.py'
Sep 30 20:51:22 compute-1 sudo[46019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:22 compute-1 python3.9[46021]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759265480.9448476-755-168515195735257/.source.json _original_basename=.ju66r9hj follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:51:22 compute-1 sudo[46019]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:23 compute-1 sudo[46171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bciwqzebtfbxgdnwjupdvhgqwsbhkqpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265482.612473-809-13406717296548/AnsiballZ_podman_image.py'
Sep 30 20:51:23 compute-1 sudo[46171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:23 compute-1 python3.9[46173]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2055789181-lower\x2dmapped.mount: Deactivated successfully.
Sep 30 20:51:30 compute-1 podman[46185]: 2025-09-30 20:51:30.371161403 +0000 UTC m=+6.827212356 image pull 4c2cf735485aec82560a51e8042a9e65bbe194a07c6812512d6a5e2ed955852b quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 20:51:30 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:30 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:30 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:30 compute-1 sudo[46171]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:31 compute-1 sudo[46483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnudztufuprvdjfsfbvdhhfhztvukuyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265491.0571582-836-233169682256297/AnsiballZ_podman_image.py'
Sep 30 20:51:31 compute-1 sudo[46483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:31 compute-1 python3.9[46485]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:31 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-1 podman[46496]: 2025-09-30 20:51:34.228688536 +0000 UTC m=+2.459631475 image pull 7ffac6b06b247caf26cf673b775a5f070f2fa1a6008cf0b0964af7e905ba86a5 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Sep 30 20:51:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-1 sudo[46483]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:35 compute-1 sudo[46753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scnjzeydscvpvjuasrsjbkcwhzpygsvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265495.034857-875-119587547320663/AnsiballZ_podman_image.py'
Sep 30 20:51:35 compute-1 sudo[46753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:35 compute-1 python3.9[46755]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:36 compute-1 podman[46766]: 2025-09-30 20:51:36.809694521 +0000 UTC m=+1.074536896 image pull 80aeb93432d60c5f52c5325081f51dbf5658fe1615083ed284852e8f6df43250 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Sep 30 20:51:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:37 compute-1 sudo[46753]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:37 compute-1 sudo[47001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owmnuqirhoblrxuurmcdmjavtvfylxjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265497.443975-902-229972801226375/AnsiballZ_podman_image.py'
Sep 30 20:51:37 compute-1 sudo[47001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:38 compute-1 python3.9[47003]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:48 compute-1 podman[47015]: 2025-09-30 20:51:48.710627498 +0000 UTC m=+10.604983221 image pull 613e2b735827096139e990f475c5ac5de0e55d8048941a4521c0c17a4351e975 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Sep 30 20:51:48 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:48 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:48 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:49 compute-1 sudo[47001]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:52 compute-1 sudo[47289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrblvcfmmmemnbppwmgsffvrtjlyjsse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265511.7248533-935-23697270106150/AnsiballZ_podman_image.py'
Sep 30 20:51:52 compute-1 sudo[47289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:52 compute-1 python3.9[47291]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-1 podman[47303]: 2025-09-30 20:51:56.065463932 +0000 UTC m=+3.616755472 image pull c1fbb3a9fe801a81492a24a592ec5927cb36487bb102738c2047084bd3d79886 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Sep 30 20:51:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-1 sudo[47289]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:56 compute-1 sudo[47561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttddaaeffcowtolavkkhenlmahhjdzxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265516.5969377-935-115788109974282/AnsiballZ_podman_image.py'
Sep 30 20:51:56 compute-1 sudo[47561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:57 compute-1 python3.9[47563]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:58 compute-1 podman[47578]: 2025-09-30 20:51:58.659451121 +0000 UTC m=+1.336100713 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Sep 30 20:51:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:58 compute-1 sudo[47561]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:59 compute-1 sshd-session[41490]: Connection closed by 192.168.122.30 port 57238
Sep 30 20:51:59 compute-1 sshd-session[41487]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:51:59 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Sep 30 20:51:59 compute-1 systemd[1]: session-11.scope: Consumed 1min 45.107s CPU time.
Sep 30 20:51:59 compute-1 systemd-logind[793]: Session 11 logged out. Waiting for processes to exit.
Sep 30 20:51:59 compute-1 systemd-logind[793]: Removed session 11.
Sep 30 20:52:04 compute-1 sshd-session[47726]: Accepted publickey for zuul from 192.168.122.30 port 48066 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:52:04 compute-1 systemd-logind[793]: New session 12 of user zuul.
Sep 30 20:52:04 compute-1 systemd[1]: Started Session 12 of User zuul.
Sep 30 20:52:04 compute-1 sshd-session[47726]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:52:06 compute-1 python3.9[47879]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:52:07 compute-1 sudo[48034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-presrrbvqbpnebnuplfupzjfoyhcannw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265526.815127-74-28916956677332/AnsiballZ_getent.py'
Sep 30 20:52:07 compute-1 sudo[48034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:07 compute-1 python3.9[48036]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Sep 30 20:52:07 compute-1 sudo[48034]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:08 compute-1 sudo[48187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wokqrurmgtjogchcmtpvtwupqeebxsqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265527.7593253-98-140500740932590/AnsiballZ_group.py'
Sep 30 20:52:08 compute-1 sudo[48187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:08 compute-1 python3.9[48189]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 20:52:08 compute-1 groupadd[48190]: group added to /etc/group: name=openvswitch, GID=42476
Sep 30 20:52:08 compute-1 groupadd[48190]: group added to /etc/gshadow: name=openvswitch
Sep 30 20:52:08 compute-1 groupadd[48190]: new group: name=openvswitch, GID=42476
Sep 30 20:52:08 compute-1 sudo[48187]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:09 compute-1 sudo[48345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhxhnacgkjsvckfeynskrenasuluzzmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265529.3092053-122-135212909918874/AnsiballZ_user.py'
Sep 30 20:52:09 compute-1 sudo[48345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:10 compute-1 python3.9[48347]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 20:52:10 compute-1 useradd[48349]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 20:52:10 compute-1 useradd[48349]: add 'openvswitch' to group 'hugetlbfs'
Sep 30 20:52:10 compute-1 useradd[48349]: add 'openvswitch' to shadow group 'hugetlbfs'
Sep 30 20:52:10 compute-1 sudo[48345]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:12 compute-1 sudo[48505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgspvruxhpvxhtxforqfcmsdcxjslioz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265531.663687-152-248142757092576/AnsiballZ_setup.py'
Sep 30 20:52:12 compute-1 sudo[48505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:12 compute-1 python3.9[48507]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:52:12 compute-1 sudo[48505]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:13 compute-1 sudo[48589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgqvkibzzyngbcofpiscdmspgrpunmrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265531.663687-152-248142757092576/AnsiballZ_dnf.py'
Sep 30 20:52:13 compute-1 sudo[48589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:13 compute-1 python3.9[48591]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:52:15 compute-1 sudo[48589]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:16 compute-1 sudo[48751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzayunppjiwwnphintfqcukvoofubapq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265536.212422-194-44800502979139/AnsiballZ_dnf.py'
Sep 30 20:52:16 compute-1 sudo[48751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:16 compute-1 python3.9[48753]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:52:28 compute-1 kernel: SELinux:  Converting 2725 SID table entries...
Sep 30 20:52:28 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:52:28 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 20:52:28 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:52:28 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:52:28 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:52:28 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:52:28 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:52:28 compute-1 groupadd[48776]: group added to /etc/group: name=unbound, GID=993
Sep 30 20:52:28 compute-1 groupadd[48776]: group added to /etc/gshadow: name=unbound
Sep 30 20:52:28 compute-1 groupadd[48776]: new group: name=unbound, GID=993
Sep 30 20:52:28 compute-1 useradd[48783]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Sep 30 20:52:28 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Sep 30 20:52:28 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Sep 30 20:52:30 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:52:30 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:52:30 compute-1 systemd[1]: Reloading.
Sep 30 20:52:31 compute-1 systemd-sysv-generator[49285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:52:31 compute-1 systemd-rc-local-generator[49281]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:52:31 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:52:31 compute-1 sudo[48751]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:31 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:52:31 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:52:31 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.091s CPU time.
Sep 30 20:52:31 compute-1 systemd[1]: run-rdcc5872862f143ca878e02c5726d7545.service: Deactivated successfully.
Sep 30 20:52:43 compute-1 sudo[49852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqrhyotaewticjnqynbtetugcgwgzspd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265562.4217086-218-275964709257356/AnsiballZ_systemd.py'
Sep 30 20:52:43 compute-1 sudo[49852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:43 compute-1 python3.9[49854]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 20:52:44 compute-1 systemd[1]: Reloading.
Sep 30 20:52:44 compute-1 systemd-rc-local-generator[49881]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:52:44 compute-1 systemd-sysv-generator[49886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:52:44 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Sep 30 20:52:44 compute-1 chown[49896]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Sep 30 20:52:44 compute-1 ovs-ctl[49901]: /etc/openvswitch/conf.db does not exist ... (warning).
Sep 30 20:52:45 compute-1 ovs-ctl[49901]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Sep 30 20:52:45 compute-1 ovs-ctl[49901]: Starting ovsdb-server [  OK  ]
Sep 30 20:52:45 compute-1 ovs-vsctl[49950]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Sep 30 20:52:45 compute-1 ovs-vsctl[49966]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"78438f8f-1ac2-4393-90b7-0b62e0665947\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Sep 30 20:52:45 compute-1 ovs-ctl[49901]: Configuring Open vSwitch system IDs [  OK  ]
Sep 30 20:52:45 compute-1 ovs-ctl[49901]: Enabling remote OVSDB managers [  OK  ]
Sep 30 20:52:45 compute-1 ovs-vsctl[49976]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Sep 30 20:52:45 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Sep 30 20:52:45 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Sep 30 20:52:45 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Sep 30 20:52:45 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Sep 30 20:52:45 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Sep 30 20:52:45 compute-1 ovs-ctl[50020]: Inserting openvswitch module [  OK  ]
Sep 30 20:52:45 compute-1 ovs-ctl[49989]: Starting ovs-vswitchd [  OK  ]
Sep 30 20:52:45 compute-1 ovs-vsctl[50038]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Sep 30 20:52:45 compute-1 ovs-ctl[49989]: Enabling remote OVSDB managers [  OK  ]
Sep 30 20:52:45 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Sep 30 20:52:45 compute-1 systemd[1]: Starting Open vSwitch...
Sep 30 20:52:45 compute-1 systemd[1]: Finished Open vSwitch.
Sep 30 20:52:45 compute-1 sudo[49852]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:46 compute-1 python3.9[50189]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:52:47 compute-1 sudo[50339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zppexojwdemzynnhjzdqqwwcnldteube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265566.993678-272-67696569947800/AnsiballZ_sefcontext.py'
Sep 30 20:52:47 compute-1 sudo[50339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:47 compute-1 python3.9[50341]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Sep 30 20:52:49 compute-1 kernel: SELinux:  Converting 2739 SID table entries...
Sep 30 20:52:49 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:52:49 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 20:52:49 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:52:49 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:52:49 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:52:49 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:52:49 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:52:49 compute-1 sudo[50339]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:50 compute-1 python3.9[50496]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:52:51 compute-1 sudo[50652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrcyozfgbqoqoylrzszvkqxcymwkujrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265571.2009275-326-271726638813847/AnsiballZ_dnf.py'
Sep 30 20:52:51 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Sep 30 20:52:51 compute-1 sudo[50652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:51 compute-1 python3.9[50654]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:52:53 compute-1 sudo[50652]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:53 compute-1 sudo[50805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmgiytgbukattbkjsknqmppyhgjatcqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265573.3487682-350-227620993407232/AnsiballZ_command.py'
Sep 30 20:52:53 compute-1 sudo[50805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:54 compute-1 python3.9[50807]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:52:54 compute-1 sudo[50805]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:55 compute-1 sudo[51092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqtbsztlcibjjlkooooyoadtqrqzkkdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265575.125192-374-114514263135858/AnsiballZ_file.py'
Sep 30 20:52:55 compute-1 sudo[51092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:55 compute-1 python3.9[51094]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 20:52:55 compute-1 sudo[51092]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:56 compute-1 python3.9[51244]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:52:57 compute-1 sudo[51396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aycmtxunidbvjfdhpxngeiuwldmxnssb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265577.010238-422-271598132196963/AnsiballZ_dnf.py'
Sep 30 20:52:57 compute-1 sudo[51396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:57 compute-1 python3.9[51398]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:52:59 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:52:59 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:52:59 compute-1 systemd[1]: Reloading.
Sep 30 20:52:59 compute-1 systemd-sysv-generator[51442]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:52:59 compute-1 systemd-rc-local-generator[51435]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:52:59 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:53:00 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:53:00 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:53:00 compute-1 systemd[1]: run-rb46e0c281b7442b5b5542b296bebdd0a.service: Deactivated successfully.
Sep 30 20:53:00 compute-1 sudo[51396]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:00 compute-1 sudo[51713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcdnklsgjszokxaflxspooqmmmlqybcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265580.505867-446-16460235440125/AnsiballZ_systemd.py'
Sep 30 20:53:00 compute-1 sudo[51713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:01 compute-1 python3.9[51715]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:53:01 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 20:53:01 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Sep 30 20:53:01 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Sep 30 20:53:01 compute-1 NetworkManager[3950]: <info>  [1759265581.2433] caught SIGTERM, shutting down normally.
Sep 30 20:53:01 compute-1 systemd[1]: Stopping Network Manager...
Sep 30 20:53:01 compute-1 NetworkManager[3950]: <info>  [1759265581.2450] dhcp4 (eth0): canceled DHCP transaction
Sep 30 20:53:01 compute-1 NetworkManager[3950]: <info>  [1759265581.2451] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:53:01 compute-1 NetworkManager[3950]: <info>  [1759265581.2451] dhcp4 (eth0): state changed no lease
Sep 30 20:53:01 compute-1 NetworkManager[3950]: <info>  [1759265581.2453] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:53:01 compute-1 NetworkManager[3950]: <info>  [1759265581.2518] exiting (success)
Sep 30 20:53:01 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:53:01 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:53:01 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 20:53:01 compute-1 systemd[1]: Stopped Network Manager.
Sep 30 20:53:01 compute-1 systemd[1]: NetworkManager.service: Consumed 16.874s CPU time, 4.3M memory peak, read 0B from disk, written 26.5K to disk.
Sep 30 20:53:01 compute-1 systemd[1]: Starting Network Manager...
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3060] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3eba4184-9928-4449-a716-6939f6e53713)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3063] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3128] manager[0x55f8246ea090]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 20:53:01 compute-1 systemd[1]: Starting Hostname Service...
Sep 30 20:53:01 compute-1 systemd[1]: Started Hostname Service.
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3911] hostname: hostname: using hostnamed
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3911] hostname: static hostname changed from (none) to "compute-1"
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3916] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3921] manager[0x55f8246ea090]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3922] manager[0x55f8246ea090]: rfkill: WWAN hardware radio set enabled
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3941] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3950] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3950] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3950] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3951] manager: Networking is enabled by state file
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3952] settings: Loaded settings plugin: keyfile (internal)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3955] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3974] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3981] dhcp: init: Using DHCP client 'internal'
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3984] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3987] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3990] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.3995] device (lo): Activation: starting connection 'lo' (e0c59d25-6b1e-4298-8ada-e0f1bea62f04)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4000] device (eth0): carrier: link connected
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4003] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4006] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4007] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4016] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4021] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4026] device (eth1): carrier: link connected
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4028] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4032] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (e5996ec0-9f62-5678-a1f9-72d8777d08c6) (indicated)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4033] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4037] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4043] device (eth1): Activation: starting connection 'ci-private-network' (e5996ec0-9f62-5678-a1f9-72d8777d08c6)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4048] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 20:53:01 compute-1 systemd[1]: Started Network Manager.
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4055] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4057] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4058] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4060] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4063] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4065] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4067] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4069] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4075] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4077] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4095] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4112] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4121] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4125] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4128] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4134] device (lo): Activation: successful, device activated.
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4144] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 20:53:01 compute-1 systemd[1]: Starting Network Manager Wait Online...
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4220] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4225] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4226] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4229] manager: NetworkManager state is now CONNECTED_LOCAL
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4232] device (eth1): Activation: successful, device activated.
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4253] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4255] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4257] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4259] device (eth0): Activation: successful, device activated.
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4262] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 20:53:01 compute-1 NetworkManager[51724]: <info>  [1759265581.4264] manager: startup complete
Sep 30 20:53:01 compute-1 systemd[1]: Finished Network Manager Wait Online.
Sep 30 20:53:01 compute-1 sudo[51713]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:02 compute-1 sudo[51939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyxwepzfffklhbicrszdnaekdxgcgiyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265581.6943944-470-41220396710342/AnsiballZ_dnf.py'
Sep 30 20:53:02 compute-1 sudo[51939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:02 compute-1 python3.9[51941]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:53:07 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:53:07 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:53:07 compute-1 systemd[1]: Reloading.
Sep 30 20:53:07 compute-1 systemd-rc-local-generator[51992]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:53:07 compute-1 systemd-sysv-generator[51997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:53:07 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:53:08 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:53:08 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:53:08 compute-1 systemd[1]: run-r5ef662fd17be448dbf4cd3c0fa87d20d.service: Deactivated successfully.
Sep 30 20:53:08 compute-1 sudo[51939]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:09 compute-1 sudo[52402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upflujndgpnysvltqzhzvaiimxzwumwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265589.3772225-506-178041786510676/AnsiballZ_stat.py'
Sep 30 20:53:09 compute-1 sudo[52402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:09 compute-1 python3.9[52404]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:53:09 compute-1 sudo[52402]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:10 compute-1 sudo[52554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nccbtcznvktnaozdqsswswsmrccthkza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265590.2839503-533-105189928683937/AnsiballZ_ini_file.py'
Sep 30 20:53:10 compute-1 sudo[52554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:11 compute-1 python3.9[52556]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:11 compute-1 sudo[52554]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:11 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:53:11 compute-1 sudo[52708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwrnzpsaxhnavuadpveeziidztgoypii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265591.482989-563-20874822893943/AnsiballZ_ini_file.py'
Sep 30 20:53:11 compute-1 sudo[52708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:12 compute-1 python3.9[52710]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:12 compute-1 sudo[52708]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:12 compute-1 sudo[52860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udinksbeetscpskhbrbnmtvhlcevmisb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265592.2500303-563-160417149063215/AnsiballZ_ini_file.py'
Sep 30 20:53:12 compute-1 sudo[52860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:12 compute-1 python3.9[52862]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:12 compute-1 sudo[52860]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:13 compute-1 sudo[53012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djstswivzejjqctndbuxqajjpvzofgmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265593.0221767-608-160998156735783/AnsiballZ_ini_file.py'
Sep 30 20:53:13 compute-1 sudo[53012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:13 compute-1 python3.9[53014]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:13 compute-1 sudo[53012]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:14 compute-1 sudo[53164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzwwoxlekejgmvhhvmmvsgxvgrxuaxdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265593.8420315-608-190480947482799/AnsiballZ_ini_file.py'
Sep 30 20:53:14 compute-1 sudo[53164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:14 compute-1 python3.9[53166]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:14 compute-1 sudo[53164]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:14 compute-1 sudo[53316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmbgapzunmnfnvlolhsrkggapbacfaat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265594.5911257-653-14649069344874/AnsiballZ_stat.py'
Sep 30 20:53:14 compute-1 sudo[53316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:15 compute-1 python3.9[53318]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:53:15 compute-1 sudo[53316]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:15 compute-1 sudo[53439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wozxjlwibyifyxxhwprdnkwzcbjwypli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265594.5911257-653-14649069344874/AnsiballZ_copy.py'
Sep 30 20:53:15 compute-1 sudo[53439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:15 compute-1 python3.9[53441]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265594.5911257-653-14649069344874/.source _original_basename=._ozxdjdi follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:15 compute-1 sudo[53439]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:16 compute-1 sudo[53591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tltbjfkoiavtrjinsjokglammeylevfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265596.1058636-698-104273768449008/AnsiballZ_file.py'
Sep 30 20:53:16 compute-1 sudo[53591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:16 compute-1 python3.9[53593]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:16 compute-1 sudo[53591]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:17 compute-1 sudo[53743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnoxgyslqbnsnulyezmwbxrpyargysof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265596.925092-722-212148748463619/AnsiballZ_edpm_os_net_config_mappings.py'
Sep 30 20:53:17 compute-1 sudo[53743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:17 compute-1 python3.9[53745]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Sep 30 20:53:17 compute-1 sudo[53743]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:18 compute-1 sudo[53895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdrxlzguwsjodvzaqyuavdenwfowfdvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265597.95099-749-41446691204046/AnsiballZ_file.py'
Sep 30 20:53:18 compute-1 sudo[53895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:18 compute-1 python3.9[53897]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:18 compute-1 sudo[53895]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:19 compute-1 sudo[54047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaqfyxjtnnawizvomvkoythdopeeemxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265598.9818084-779-37348256907923/AnsiballZ_stat.py'
Sep 30 20:53:19 compute-1 sudo[54047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:19 compute-1 sudo[54047]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:20 compute-1 sudo[54170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmxmbqhbfljooeiddivgisbhdvpcxopa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265598.9818084-779-37348256907923/AnsiballZ_copy.py'
Sep 30 20:53:20 compute-1 sudo[54170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:20 compute-1 sudo[54170]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:21 compute-1 sudo[54322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnbplfsqbcicrcorqkmbwcymnwjdrvzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265600.5036376-824-176014421990077/AnsiballZ_slurp.py'
Sep 30 20:53:21 compute-1 sudo[54322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:21 compute-1 python3.9[54324]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Sep 30 20:53:21 compute-1 sudo[54322]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:22 compute-1 sudo[54497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mejxmrhjqhqnibmbpkqsaxxzhmsmegro ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265601.5721643-851-26830418351940/async_wrapper.py j373454384854 300 /home/zuul/.ansible/tmp/ansible-tmp-1759265601.5721643-851-26830418351940/AnsiballZ_edpm_os_net_config.py _'
Sep 30 20:53:22 compute-1 sudo[54497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:22 compute-1 ansible-async_wrapper.py[54499]: Invoked with j373454384854 300 /home/zuul/.ansible/tmp/ansible-tmp-1759265601.5721643-851-26830418351940/AnsiballZ_edpm_os_net_config.py _
Sep 30 20:53:22 compute-1 ansible-async_wrapper.py[54502]: Starting module and watcher
Sep 30 20:53:22 compute-1 ansible-async_wrapper.py[54502]: Start watching 54503 (300)
Sep 30 20:53:22 compute-1 ansible-async_wrapper.py[54503]: Start module (54503)
Sep 30 20:53:22 compute-1 ansible-async_wrapper.py[54499]: Return async_wrapper task started.
Sep 30 20:53:22 compute-1 sudo[54497]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:22 compute-1 python3.9[54504]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Sep 30 20:53:23 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Sep 30 20:53:23 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Sep 30 20:53:23 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Sep 30 20:53:23 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Sep 30 20:53:23 compute-1 kernel: cfg80211: failed to load regulatory.db
Sep 30 20:53:24 compute-1 NetworkManager[51724]: <info>  [1759265604.9652] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54505 uid=0 result="success"
Sep 30 20:53:24 compute-1 NetworkManager[51724]: <info>  [1759265604.9671] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0543] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0546] audit: op="connection-add" uuid="98bcfd79-0916-44b8-9646-c4a973bd9cca" name="br-ex-br" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0575] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0577] audit: op="connection-add" uuid="5e5a0406-458a-4287-b1df-6aa7d111e9ba" name="br-ex-port" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0599] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0601] audit: op="connection-add" uuid="cf074dfc-55ab-4ccc-8be8-adc6819e7030" name="eth1-port" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0625] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0627] audit: op="connection-add" uuid="03e82521-f9ef-47c5-ba05-eac053d254f0" name="vlan20-port" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0649] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0651] audit: op="connection-add" uuid="69d688fa-2f3d-41eb-b613-f2f36ffddd7e" name="vlan21-port" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0671] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0673] audit: op="connection-add" uuid="cf80bbca-4d14-4651-bacf-07042275a0ab" name="vlan22-port" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0710] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0740] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0742] audit: op="connection-add" uuid="4149abe1-b058-4fda-a265-671e69381ac3" name="br-ex-if" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0814] audit: op="connection-update" uuid="e5996ec0-9f62-5678-a1f9-72d8777d08c6" name="ci-private-network" args="ipv4.dns,ipv4.routes,ipv4.addresses,ipv4.method,ipv4.routing-rules,ipv4.never-default,connection.master,connection.controller,connection.port-type,connection.timestamp,connection.slave-type,ovs-external-ids.data,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ipv6.addresses,ipv6.method,ipv6.routing-rules,ovs-interface.type" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0848] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0850] audit: op="connection-add" uuid="185005a1-9884-4ea3-9c78-fd751d0f984f" name="vlan20-if" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0885] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0888] audit: op="connection-add" uuid="0a677898-7cbf-4805-9089-351082a33443" name="vlan21-if" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0917] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0919] audit: op="connection-add" uuid="a6a7e0e7-d679-4b97-979f-c9184988661d" name="vlan22-if" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0944] audit: op="connection-delete" uuid="4847d465-cf78-3ee4-ab52-42c2398330d2" name="Wired connection 1" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0968] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0987] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0994] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (98bcfd79-0916-44b8-9646-c4a973bd9cca)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.0995] audit: op="connection-activate" uuid="98bcfd79-0916-44b8-9646-c4a973bd9cca" name="br-ex-br" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1000] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1014] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1021] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (5e5a0406-458a-4287-b1df-6aa7d111e9ba)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1026] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1038] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1045] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (cf074dfc-55ab-4ccc-8be8-adc6819e7030)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1049] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1062] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1069] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (03e82521-f9ef-47c5-ba05-eac053d254f0)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1073] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1087] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1095] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (69d688fa-2f3d-41eb-b613-f2f36ffddd7e)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1098] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1107] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1113] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (cf80bbca-4d14-4651-bacf-07042275a0ab)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1114] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1119] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1121] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1131] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1138] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1144] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (4149abe1-b058-4fda-a265-671e69381ac3)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1145] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1149] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1151] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1153] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1155] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1174] device (eth1): disconnecting for new activation request.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1175] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1178] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1181] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1183] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1186] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1190] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1194] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (185005a1-9884-4ea3-9c78-fd751d0f984f)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1195] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1198] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1200] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1201] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1204] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1209] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1213] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (0a677898-7cbf-4805-9089-351082a33443)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1214] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1216] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1219] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1220] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1222] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1227] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1231] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (a6a7e0e7-d679-4b97-979f-c9184988661d)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1231] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1235] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1236] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1238] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1239] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1254] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1256] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1260] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1261] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1269] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1273] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1277] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1280] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1282] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 kernel: ovs-system: entered promiscuous mode
Sep 30 20:53:25 compute-1 kernel: Timeout policy base is empty
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1297] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1303] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1306] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1308] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1315] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1319] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1323] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1325] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 systemd-udevd[54509]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1330] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1336] dhcp4 (eth0): canceled DHCP transaction
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1337] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1337] dhcp4 (eth0): state changed no lease
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1338] dhcp4 (eth0): activation: beginning transaction (no timeout)
Sep 30 20:53:25 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1350] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1355] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54505 uid=0 result="fail" reason="Device is not activated"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1391] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1401] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1407] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1411] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1419] device (eth1): disconnecting for new activation request.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1420] audit: op="connection-activate" uuid="e5996ec0-9f62-5678-a1f9-72d8777d08c6" name="ci-private-network" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.1927] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2006] device (eth1): Activation: starting connection 'ci-private-network' (e5996ec0-9f62-5678-a1f9-72d8777d08c6)
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2040] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 kernel: br-ex: entered promiscuous mode
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2047] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2057] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2060] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2062] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2065] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2067] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2069] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2071] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54505 uid=0 result="success"
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2080] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2091] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2098] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2105] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2111] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2118] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2124] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2129] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2135] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2140] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2147] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2153] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 kernel: vlan22: entered promiscuous mode
Sep 30 20:53:25 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2159] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2169] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2175] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 systemd-udevd[54511]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2185] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2210] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 kernel: vlan21: entered promiscuous mode
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2274] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2277] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2280] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2290] device (eth1): Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2298] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2308] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2329] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2334] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:25 compute-1 kernel: vlan20: entered promiscuous mode
Sep 30 20:53:25 compute-1 systemd-udevd[54510]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2410] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2422] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2479] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2481] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2486] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2496] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2506] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2517] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2567] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2598] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2622] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2625] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:25 compute-1 NetworkManager[51724]: <info>  [1759265605.2635] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:26 compute-1 sudo[54835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyvlentwdgqbkaopdykqhtnbcecdowis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265605.732903-851-20243349138493/AnsiballZ_async_status.py'
Sep 30 20:53:26 compute-1 sudo[54835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:26 compute-1 python3.9[54837]: ansible-ansible.legacy.async_status Invoked with jid=j373454384854.54499 mode=status _async_dir=/root/.ansible_async
Sep 30 20:53:26 compute-1 sudo[54835]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:26 compute-1 NetworkManager[51724]: <info>  [1759265606.4029] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54505 uid=0 result="success"
Sep 30 20:53:26 compute-1 NetworkManager[51724]: <info>  [1759265606.6662] checkpoint[0x55f8246c0950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Sep 30 20:53:26 compute-1 NetworkManager[51724]: <info>  [1759265606.6666] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54505 uid=0 result="success"
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.0144] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54505 uid=0 result="success"
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.0167] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54505 uid=0 result="success"
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.2168] audit: op="networking-control" arg="global-dns-configuration" pid=54505 uid=0 result="success"
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.2200] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.2239] audit: op="networking-control" arg="global-dns-configuration" pid=54505 uid=0 result="success"
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.2268] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54505 uid=0 result="success"
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.3698] checkpoint[0x55f8246c0a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Sep 30 20:53:27 compute-1 NetworkManager[51724]: <info>  [1759265607.3702] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54505 uid=0 result="success"
Sep 30 20:53:27 compute-1 ansible-async_wrapper.py[54503]: Module complete (54503)
Sep 30 20:53:27 compute-1 ansible-async_wrapper.py[54502]: Done in kid B.
Sep 30 20:53:29 compute-1 sudo[54941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnejzsgvxfxxafzgpjhzlqkmogmucxfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265605.732903-851-20243349138493/AnsiballZ_async_status.py'
Sep 30 20:53:29 compute-1 sudo[54941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:29 compute-1 python3.9[54943]: ansible-ansible.legacy.async_status Invoked with jid=j373454384854.54499 mode=status _async_dir=/root/.ansible_async
Sep 30 20:53:29 compute-1 sudo[54941]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:30 compute-1 sudo[55041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txxdmbzcyvtwuatzdeoibbkvtmgqdzmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265605.732903-851-20243349138493/AnsiballZ_async_status.py'
Sep 30 20:53:30 compute-1 sudo[55041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:30 compute-1 python3.9[55043]: ansible-ansible.legacy.async_status Invoked with jid=j373454384854.54499 mode=cleanup _async_dir=/root/.ansible_async
Sep 30 20:53:30 compute-1 sudo[55041]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:31 compute-1 sudo[55193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcwmowrdblxjtlojfwvpqsspzdyywxrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265610.8735976-932-265266232536907/AnsiballZ_stat.py'
Sep 30 20:53:31 compute-1 sudo[55193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:31 compute-1 irqbalance[788]: Cannot change IRQ 26 affinity: Operation not permitted
Sep 30 20:53:31 compute-1 irqbalance[788]: IRQ 26 affinity is now unmanaged
Sep 30 20:53:31 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:53:31 compute-1 python3.9[55195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:53:31 compute-1 sudo[55193]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:31 compute-1 sudo[55318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btswdpfvtxzrsjggirwhgwcbhowhobvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265610.8735976-932-265266232536907/AnsiballZ_copy.py'
Sep 30 20:53:31 compute-1 sudo[55318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:32 compute-1 python3.9[55320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265610.8735976-932-265266232536907/.source.returncode _original_basename=.8eti8frv follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:32 compute-1 sudo[55318]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:32 compute-1 sudo[55470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjxevsoxbejtcjjtwjrcmyoqycoqsirw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265612.4146578-980-155890848433731/AnsiballZ_stat.py'
Sep 30 20:53:32 compute-1 sudo[55470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:32 compute-1 python3.9[55472]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:53:32 compute-1 sudo[55470]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:33 compute-1 sudo[55594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsetrhfzzktrxuphtjbxehehkzkucapt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265612.4146578-980-155890848433731/AnsiballZ_copy.py'
Sep 30 20:53:33 compute-1 sudo[55594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:33 compute-1 python3.9[55596]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265612.4146578-980-155890848433731/.source.cfg _original_basename=.oszyt8tm follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:33 compute-1 sudo[55594]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:34 compute-1 sudo[55746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oivniwubxksbmzkxetolxbkwhhrthzxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265613.9485478-1025-3066069681545/AnsiballZ_systemd.py'
Sep 30 20:53:34 compute-1 sudo[55746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:34 compute-1 python3.9[55748]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:53:35 compute-1 systemd[1]: Reloading Network Manager...
Sep 30 20:53:35 compute-1 NetworkManager[51724]: <info>  [1759265615.7688] audit: op="reload" arg="0" pid=55752 uid=0 result="success"
Sep 30 20:53:35 compute-1 NetworkManager[51724]: <info>  [1759265615.7694] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Sep 30 20:53:35 compute-1 systemd[1]: Reloaded Network Manager.
Sep 30 20:53:35 compute-1 sudo[55746]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:36 compute-1 sshd-session[47729]: Connection closed by 192.168.122.30 port 48066
Sep 30 20:53:36 compute-1 sshd-session[47726]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:53:36 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Sep 30 20:53:36 compute-1 systemd[1]: session-12.scope: Consumed 57.514s CPU time.
Sep 30 20:53:36 compute-1 systemd-logind[793]: Session 12 logged out. Waiting for processes to exit.
Sep 30 20:53:36 compute-1 systemd-logind[793]: Removed session 12.
Sep 30 20:53:41 compute-1 sshd-session[55783]: Accepted publickey for zuul from 192.168.122.30 port 36734 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:53:41 compute-1 systemd-logind[793]: New session 13 of user zuul.
Sep 30 20:53:41 compute-1 systemd[1]: Started Session 13 of User zuul.
Sep 30 20:53:41 compute-1 sshd-session[55783]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:53:42 compute-1 python3.9[55936]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:53:44 compute-1 python3.9[56091]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:53:45 compute-1 python3.9[56280]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:53:45 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:53:45 compute-1 sshd-session[55786]: Connection closed by 192.168.122.30 port 36734
Sep 30 20:53:45 compute-1 sshd-session[55783]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:53:45 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Sep 30 20:53:45 compute-1 systemd[1]: session-13.scope: Consumed 2.846s CPU time.
Sep 30 20:53:45 compute-1 systemd-logind[793]: Session 13 logged out. Waiting for processes to exit.
Sep 30 20:53:45 compute-1 systemd-logind[793]: Removed session 13.
Sep 30 20:53:50 compute-1 sshd-session[56310]: Accepted publickey for zuul from 192.168.122.30 port 60916 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:53:50 compute-1 systemd-logind[793]: New session 14 of user zuul.
Sep 30 20:53:50 compute-1 systemd[1]: Started Session 14 of User zuul.
Sep 30 20:53:50 compute-1 sshd-session[56310]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:53:52 compute-1 python3.9[56463]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:53:53 compute-1 python3.9[56618]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:53:54 compute-1 sudo[56772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npncsurfwhzmyohmxaswedqcxsfffopm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265633.7957325-86-156803289106383/AnsiballZ_setup.py'
Sep 30 20:53:54 compute-1 sudo[56772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:54 compute-1 python3.9[56774]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:53:54 compute-1 sudo[56772]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:55 compute-1 sudo[56856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpafbddcjxrvgmuwhaxtcmxcbztrxrlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265633.7957325-86-156803289106383/AnsiballZ_dnf.py'
Sep 30 20:53:55 compute-1 sudo[56856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:55 compute-1 python3.9[56858]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:53:56 compute-1 sudo[56856]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:57 compute-1 sudo[57010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfyckinnkvneqricqyqtcudhsstxzfxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265636.9218516-122-222085947923116/AnsiballZ_setup.py'
Sep 30 20:53:57 compute-1 sudo[57010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:57 compute-1 python3.9[57012]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:53:57 compute-1 sudo[57010]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:58 compute-1 sudo[57201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yauwizwkbqgpffnqmneezpxoazvdhjda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265638.3381886-155-180098631545533/AnsiballZ_file.py'
Sep 30 20:53:58 compute-1 sudo[57201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:59 compute-1 python3.9[57203]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:59 compute-1 sudo[57201]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:59 compute-1 sudo[57353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cftfhaalbyipqhlulmudiujcdyiqewec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265639.3840327-179-230979168556512/AnsiballZ_command.py'
Sep 30 20:53:59 compute-1 sudo[57353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:00 compute-1 python3.9[57355]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:54:00 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:54:00 compute-1 sudo[57353]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:00 compute-1 sudo[57518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzkcowstgitwrzfcbbzklqlbzrynknul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265640.4010634-203-163628678479672/AnsiballZ_stat.py'
Sep 30 20:54:00 compute-1 sudo[57518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:01 compute-1 python3.9[57520]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:01 compute-1 sudo[57518]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:01 compute-1 sudo[57596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmjrgrtbivpktpbipzrccerviyavkuig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265640.4010634-203-163628678479672/AnsiballZ_file.py'
Sep 30 20:54:01 compute-1 sudo[57596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:01 compute-1 python3.9[57598]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:01 compute-1 sudo[57596]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:02 compute-1 sudo[57748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgrvcdsbvbxvatwyvnqreutpfhhfxgie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265641.9554365-239-237346806745515/AnsiballZ_stat.py'
Sep 30 20:54:02 compute-1 sudo[57748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:02 compute-1 python3.9[57750]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:02 compute-1 sudo[57748]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:02 compute-1 sudo[57826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfqnfegvzmcfqmnzpbsgywnqpromuphy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265641.9554365-239-237346806745515/AnsiballZ_file.py'
Sep 30 20:54:02 compute-1 sudo[57826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:03 compute-1 python3.9[57828]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:03 compute-1 sudo[57826]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:03 compute-1 sudo[57978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnsvwiakmdlnsokcoxbdesvqpwenthbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265643.4084916-278-45691623743630/AnsiballZ_ini_file.py'
Sep 30 20:54:03 compute-1 sudo[57978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:04 compute-1 python3.9[57980]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:04 compute-1 sudo[57978]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:04 compute-1 sudo[58130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzgslguvesymdxchxoqcttxbbkgbxskm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265644.3852584-278-172702740176139/AnsiballZ_ini_file.py'
Sep 30 20:54:04 compute-1 sudo[58130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:04 compute-1 python3.9[58132]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:05 compute-1 sudo[58130]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:05 compute-1 sudo[58282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgomjumvxcrjhuzkjsoobsujergyipyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265645.1814756-278-81812873874165/AnsiballZ_ini_file.py'
Sep 30 20:54:05 compute-1 sudo[58282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:05 compute-1 python3.9[58284]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:05 compute-1 sudo[58282]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:06 compute-1 sudo[58434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnkdmyzdaclxwfmvxlatodzrtuyrbdpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265645.9948545-278-21506061018859/AnsiballZ_ini_file.py'
Sep 30 20:54:06 compute-1 sudo[58434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:06 compute-1 python3.9[58436]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:06 compute-1 sudo[58434]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:07 compute-1 sudo[58586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xenyltdmvxugdcccpsynuerogzswtbsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265646.9097333-371-261837364927414/AnsiballZ_dnf.py'
Sep 30 20:54:07 compute-1 sudo[58586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:07 compute-1 python3.9[58588]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:54:08 compute-1 sudo[58586]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:09 compute-1 sudo[58739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orqdskbbgjnrdqfzjigkvwepqzdxibax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265649.2926152-404-102937805998182/AnsiballZ_setup.py'
Sep 30 20:54:09 compute-1 sudo[58739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:09 compute-1 python3.9[58741]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:54:10 compute-1 sudo[58739]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:10 compute-1 sudo[58893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slbpiitbuyuijwmzyllfvxcbkplxkmfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265650.2417269-428-47462973939552/AnsiballZ_stat.py'
Sep 30 20:54:10 compute-1 sudo[58893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:10 compute-1 python3.9[58895]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:54:10 compute-1 sudo[58893]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:11 compute-1 sudo[59045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdpcgxlicwjrwxkmlfkrtaezjhksztht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265651.1639898-455-172459696572276/AnsiballZ_stat.py'
Sep 30 20:54:11 compute-1 sudo[59045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:11 compute-1 python3.9[59047]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:54:11 compute-1 sudo[59045]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:12 compute-1 sudo[59197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uknqklfhduqrawtmsehfxaynvxhaeplj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265652.0813828-485-176389209183495/AnsiballZ_service_facts.py'
Sep 30 20:54:12 compute-1 sudo[59197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:12 compute-1 python3.9[59199]: ansible-service_facts Invoked
Sep 30 20:54:12 compute-1 network[59216]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 20:54:12 compute-1 network[59217]: 'network-scripts' will be removed from distribution in near future.
Sep 30 20:54:12 compute-1 network[59218]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 20:54:17 compute-1 sudo[59197]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:18 compute-1 sudo[59503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkkqsakojesftlubpjitprkgbjgncnln ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759265658.35827-524-66694820204585/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759265658.35827-524-66694820204585/args'
Sep 30 20:54:18 compute-1 sudo[59503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:18 compute-1 sudo[59503]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:19 compute-1 sudo[59670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyolsnpfnmqzojivblhbspjanzopwwuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265659.1833503-557-116455706369329/AnsiballZ_dnf.py'
Sep 30 20:54:19 compute-1 sudo[59670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:19 compute-1 python3.9[59672]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:54:21 compute-1 sudo[59670]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:22 compute-1 sudo[59823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiykbrkujyuljzodvzkyrlmncmkbfedp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265661.5690317-596-45777305850515/AnsiballZ_package_facts.py'
Sep 30 20:54:22 compute-1 sudo[59823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:22 compute-1 python3.9[59825]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Sep 30 20:54:22 compute-1 sudo[59823]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:23 compute-1 sudo[59975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iocjoeztuepdsvtzetfanbmpbwgtpqou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265663.6308124-627-207863374653850/AnsiballZ_stat.py'
Sep 30 20:54:23 compute-1 sudo[59975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:24 compute-1 python3.9[59977]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:24 compute-1 sudo[59975]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:24 compute-1 sudo[60100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzkthlrolgqrbidbsdjoswmemcyydyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265663.6308124-627-207863374653850/AnsiballZ_copy.py'
Sep 30 20:54:24 compute-1 sudo[60100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:24 compute-1 python3.9[60102]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265663.6308124-627-207863374653850/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:24 compute-1 sudo[60100]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:25 compute-1 sudo[60254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwocgpyljygovpzbkhxptsuldfeainff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265665.2241566-672-138487345418194/AnsiballZ_stat.py'
Sep 30 20:54:25 compute-1 sudo[60254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:25 compute-1 python3.9[60256]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:25 compute-1 sudo[60254]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:26 compute-1 sudo[60379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gehrltlfaceisxvyrflxehicwquvpnit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265665.2241566-672-138487345418194/AnsiballZ_copy.py'
Sep 30 20:54:26 compute-1 sudo[60379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:26 compute-1 python3.9[60381]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265665.2241566-672-138487345418194/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:26 compute-1 sudo[60379]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:27 compute-1 sudo[60533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoyxtjghfiukxdccwcwyuzwkhhvvztdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265667.4837-734-180665866932989/AnsiballZ_lineinfile.py'
Sep 30 20:54:27 compute-1 sudo[60533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:28 compute-1 python3.9[60535]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:28 compute-1 sudo[60533]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:29 compute-1 sudo[60687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glqkxckoxuieyrenlovrlzgwcnhtnwad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265669.2713754-779-136441440889299/AnsiballZ_setup.py'
Sep 30 20:54:29 compute-1 sudo[60687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:29 compute-1 python3.9[60689]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:54:30 compute-1 sudo[60687]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:30 compute-1 sudo[60771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjczcdgixtxmmscykitjztqllfrtbdbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265669.2713754-779-136441440889299/AnsiballZ_systemd.py'
Sep 30 20:54:30 compute-1 sudo[60771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:31 compute-1 python3.9[60773]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:54:31 compute-1 sudo[60771]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:32 compute-1 sudo[60925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbimwirqfmkltthimvsjdkxvgnzkdcra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265672.1877465-827-217759141455448/AnsiballZ_setup.py'
Sep 30 20:54:32 compute-1 sudo[60925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:32 compute-1 python3.9[60927]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:54:33 compute-1 sudo[60925]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:33 compute-1 sudo[61009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzgylsgvvkwbmokobjkfvwjtjedipjqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265672.1877465-827-217759141455448/AnsiballZ_systemd.py'
Sep 30 20:54:33 compute-1 sudo[61009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:33 compute-1 python3.9[61011]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:54:33 compute-1 chronyd[797]: chronyd exiting
Sep 30 20:54:33 compute-1 systemd[1]: Stopping NTP client/server...
Sep 30 20:54:33 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Sep 30 20:54:33 compute-1 systemd[1]: Stopped NTP client/server.
Sep 30 20:54:33 compute-1 systemd[1]: Starting NTP client/server...
Sep 30 20:54:34 compute-1 chronyd[61020]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 20:54:34 compute-1 chronyd[61020]: Frequency -26.497 +/- 0.232 ppm read from /var/lib/chrony/drift
Sep 30 20:54:34 compute-1 chronyd[61020]: Loaded seccomp filter (level 2)
Sep 30 20:54:34 compute-1 systemd[1]: Started NTP client/server.
Sep 30 20:54:34 compute-1 sudo[61009]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:34 compute-1 sshd-session[56313]: Connection closed by 192.168.122.30 port 60916
Sep 30 20:54:34 compute-1 sshd-session[56310]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:54:34 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Sep 30 20:54:34 compute-1 systemd[1]: session-14.scope: Consumed 30.409s CPU time.
Sep 30 20:54:34 compute-1 systemd-logind[793]: Session 14 logged out. Waiting for processes to exit.
Sep 30 20:54:34 compute-1 systemd-logind[793]: Removed session 14.
Sep 30 20:54:39 compute-1 sshd-session[61046]: Accepted publickey for zuul from 192.168.122.30 port 60232 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:54:39 compute-1 systemd-logind[793]: New session 15 of user zuul.
Sep 30 20:54:39 compute-1 systemd[1]: Started Session 15 of User zuul.
Sep 30 20:54:39 compute-1 sshd-session[61046]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:54:40 compute-1 python3.9[61199]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:54:41 compute-1 sudo[61353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgpvuwnpiuddoegkgjskrmdxabivlmcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265680.9060352-65-41559034518775/AnsiballZ_file.py'
Sep 30 20:54:41 compute-1 sudo[61353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:41 compute-1 python3.9[61355]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:41 compute-1 sudo[61353]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:42 compute-1 sudo[61528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcnuvcaseinfssjithbqrosuwroifxqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265681.9054096-89-267961248575217/AnsiballZ_stat.py'
Sep 30 20:54:42 compute-1 sudo[61528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:42 compute-1 python3.9[61530]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:42 compute-1 sudo[61528]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:43 compute-1 sudo[61606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzuemiuotuijoeqcgozjvexkslqafpqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265681.9054096-89-267961248575217/AnsiballZ_file.py'
Sep 30 20:54:43 compute-1 sudo[61606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:43 compute-1 python3.9[61608]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.p6o_l8gv recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:43 compute-1 sudo[61606]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:44 compute-1 sudo[61758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxnlqkgqhybvkcltgqnstffgrlqzkuvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265683.733081-149-258419902893426/AnsiballZ_stat.py'
Sep 30 20:54:44 compute-1 sudo[61758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:44 compute-1 python3.9[61760]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:44 compute-1 sudo[61758]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:45 compute-1 sudo[61881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqwqwttkonncldlcrckjeduwuuojmown ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265683.733081-149-258419902893426/AnsiballZ_copy.py'
Sep 30 20:54:45 compute-1 sudo[61881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:45 compute-1 python3.9[61883]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265683.733081-149-258419902893426/.source _original_basename=.53vrb2qz follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:45 compute-1 sudo[61881]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:46 compute-1 sudo[62033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnhvjlhiadsqxbheiynupjwwsmhlghgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265685.6544232-197-237030479249634/AnsiballZ_file.py'
Sep 30 20:54:46 compute-1 sudo[62033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:46 compute-1 python3.9[62035]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:46 compute-1 sudo[62033]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:46 compute-1 sudo[62185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arvizjvdjimzflzunvwazdveefvlpjqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265686.5532846-221-245779020333794/AnsiballZ_stat.py'
Sep 30 20:54:46 compute-1 sudo[62185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:47 compute-1 python3.9[62187]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:47 compute-1 sudo[62185]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:47 compute-1 sudo[62308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsjliwtgyhblllkpzjsusqvhpxqyolwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265686.5532846-221-245779020333794/AnsiballZ_copy.py'
Sep 30 20:54:47 compute-1 sudo[62308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:47 compute-1 python3.9[62310]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265686.5532846-221-245779020333794/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:47 compute-1 sudo[62308]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:48 compute-1 sudo[62460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fitniaepqxlowcficwyexwxnkqsdpvty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265688.044037-221-25182500224752/AnsiballZ_stat.py'
Sep 30 20:54:48 compute-1 sudo[62460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:48 compute-1 python3.9[62462]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:48 compute-1 sudo[62460]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:49 compute-1 sudo[62583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhatlmgwbkttyhjxietomdzdyoyhauvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265688.044037-221-25182500224752/AnsiballZ_copy.py'
Sep 30 20:54:49 compute-1 sudo[62583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:49 compute-1 python3.9[62585]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265688.044037-221-25182500224752/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:49 compute-1 sudo[62583]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:49 compute-1 sudo[62735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nljkmqbzvfpimnmhzyqwskzfgzdrvsmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265689.497356-308-60958082740973/AnsiballZ_file.py'
Sep 30 20:54:49 compute-1 sudo[62735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:50 compute-1 python3.9[62737]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:50 compute-1 sudo[62735]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:50 compute-1 sudo[62887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrkuckzxbiymdpznmhmucdkphffiohyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265690.432103-332-181550766409553/AnsiballZ_stat.py'
Sep 30 20:54:50 compute-1 sudo[62887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:51 compute-1 python3.9[62889]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:51 compute-1 sudo[62887]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:51 compute-1 sudo[63010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvstwxtctfpdjfajuwemnbprffiidahl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265690.432103-332-181550766409553/AnsiballZ_copy.py'
Sep 30 20:54:51 compute-1 sudo[63010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:51 compute-1 python3.9[63012]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265690.432103-332-181550766409553/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:51 compute-1 sudo[63010]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:52 compute-1 sudo[63162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxrnhmcdhgyruoltgnadcnrcttwlnuqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265691.960284-377-119844793175077/AnsiballZ_stat.py'
Sep 30 20:54:52 compute-1 sudo[63162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:52 compute-1 python3.9[63164]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:52 compute-1 sudo[63162]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:52 compute-1 sudo[63285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgukdhvmvixudcxpsxmqvashzcoihfmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265691.960284-377-119844793175077/AnsiballZ_copy.py'
Sep 30 20:54:52 compute-1 sudo[63285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:53 compute-1 python3.9[63287]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265691.960284-377-119844793175077/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:53 compute-1 sudo[63285]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:54 compute-1 sudo[63437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbywxwfptnfbrwgkhudvukraaoywdde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265693.5412114-422-133846222177853/AnsiballZ_systemd.py'
Sep 30 20:54:54 compute-1 sudo[63437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:54 compute-1 python3.9[63439]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:54:54 compute-1 systemd[1]: Reloading.
Sep 30 20:54:54 compute-1 systemd-sysv-generator[63468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:54 compute-1 systemd-rc-local-generator[63463]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:54 compute-1 systemd[1]: Reloading.
Sep 30 20:54:55 compute-1 systemd-rc-local-generator[63505]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:55 compute-1 systemd-sysv-generator[63508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:55 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Sep 30 20:54:55 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Sep 30 20:54:55 compute-1 sudo[63437]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:55 compute-1 sudo[63665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awkbgsffnrlxrsjwxwladjqalbwubpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265695.456835-446-122889784673203/AnsiballZ_stat.py'
Sep 30 20:54:55 compute-1 sudo[63665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:56 compute-1 python3.9[63667]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:56 compute-1 sudo[63665]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:56 compute-1 sudo[63788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqcdcmczxhbxxwdzpjczbsburywilvnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265695.456835-446-122889784673203/AnsiballZ_copy.py'
Sep 30 20:54:56 compute-1 sudo[63788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:56 compute-1 python3.9[63790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265695.456835-446-122889784673203/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:56 compute-1 sudo[63788]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:57 compute-1 sudo[63940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxomdbqescdinhouqgfdtpgagqlvxiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265696.8967838-491-252102363665848/AnsiballZ_stat.py'
Sep 30 20:54:57 compute-1 sudo[63940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:57 compute-1 python3.9[63942]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:57 compute-1 sudo[63940]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:57 compute-1 sudo[64063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcrojugjqxwdionoftbwzrtzjyjzmrlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265696.8967838-491-252102363665848/AnsiballZ_copy.py'
Sep 30 20:54:57 compute-1 sudo[64063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:57 compute-1 python3.9[64065]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265696.8967838-491-252102363665848/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:57 compute-1 sudo[64063]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:58 compute-1 sudo[64215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhjasbjtcqzhfbfpzoujacsuxipfwahd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265698.3420463-536-194511593015841/AnsiballZ_systemd.py'
Sep 30 20:54:58 compute-1 sudo[64215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:59 compute-1 python3.9[64217]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:54:59 compute-1 systemd[1]: Reloading.
Sep 30 20:54:59 compute-1 systemd-rc-local-generator[64241]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:59 compute-1 systemd-sysv-generator[64247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:59 compute-1 systemd[1]: Reloading.
Sep 30 20:54:59 compute-1 systemd-sysv-generator[64287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:59 compute-1 systemd-rc-local-generator[64284]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:59 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 20:54:59 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 20:54:59 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 20:54:59 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 20:54:59 compute-1 sudo[64215]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:00 compute-1 python3.9[64444]: ansible-ansible.builtin.service_facts Invoked
Sep 30 20:55:00 compute-1 network[64461]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 20:55:00 compute-1 network[64462]: 'network-scripts' will be removed from distribution in near future.
Sep 30 20:55:00 compute-1 network[64463]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 20:55:06 compute-1 sudo[64725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uazfemjeodmhretuevorxwyaindmhtam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265705.685046-584-110661838009202/AnsiballZ_systemd.py'
Sep 30 20:55:06 compute-1 sudo[64725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:06 compute-1 python3.9[64727]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:55:06 compute-1 systemd[1]: Reloading.
Sep 30 20:55:06 compute-1 systemd-sysv-generator[64763]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:55:06 compute-1 systemd-rc-local-generator[64759]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:55:06 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Sep 30 20:55:06 compute-1 iptables.init[64768]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Sep 30 20:55:07 compute-1 iptables.init[64768]: iptables: Flushing firewall rules: [  OK  ]
Sep 30 20:55:07 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Sep 30 20:55:07 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Sep 30 20:55:07 compute-1 sudo[64725]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:07 compute-1 sudo[64963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmnlgwqrrhbmfkgrrbgxdjgbzjodceyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265707.2508788-584-75819076948303/AnsiballZ_systemd.py'
Sep 30 20:55:07 compute-1 sudo[64963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:07 compute-1 python3.9[64965]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:55:07 compute-1 sudo[64963]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:08 compute-1 sudo[65117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmlapkqrslzvautjakezzemngsrrumkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265708.411359-632-243285967285798/AnsiballZ_systemd.py'
Sep 30 20:55:08 compute-1 sudo[65117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:09 compute-1 python3.9[65119]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:55:09 compute-1 systemd[1]: Reloading.
Sep 30 20:55:09 compute-1 systemd-rc-local-generator[65142]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:55:09 compute-1 systemd-sysv-generator[65147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:55:09 compute-1 systemd[1]: Starting Netfilter Tables...
Sep 30 20:55:09 compute-1 systemd[1]: Finished Netfilter Tables.
Sep 30 20:55:09 compute-1 sudo[65117]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:10 compute-1 sudo[65309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcroqjosulaafmkwekdpumuhclohcumj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265709.7990105-656-220678116390521/AnsiballZ_command.py'
Sep 30 20:55:10 compute-1 sudo[65309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:10 compute-1 python3.9[65311]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:55:10 compute-1 sudo[65309]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:11 compute-1 sudo[65462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfuclezedlivrxbfqobqnxdmffakoukm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265711.1568904-698-192279704442555/AnsiballZ_stat.py'
Sep 30 20:55:11 compute-1 sudo[65462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:11 compute-1 python3.9[65464]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:11 compute-1 sudo[65462]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:12 compute-1 sudo[65587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmlrtmxtcaeldlqasbfsxnxvzmwquvlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265711.1568904-698-192279704442555/AnsiballZ_copy.py'
Sep 30 20:55:12 compute-1 sudo[65587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:12 compute-1 python3.9[65589]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265711.1568904-698-192279704442555/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:12 compute-1 sudo[65587]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:13 compute-1 python3.9[65740]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:55:13 compute-1 polkitd[6517]: Registered Authentication Agent for unix-process:65742:240911 (system bus name :1.552 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Sep 30 20:55:38 compute-1 polkitd[6517]: Unregistered Authentication Agent for unix-process:65742:240911 (system bus name :1.552, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Sep 30 20:55:38 compute-1 polkit-agent-helper-1[65754]: pam_unix(polkit-1:auth): conversation failed
Sep 30 20:55:38 compute-1 polkit-agent-helper-1[65754]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Sep 30 20:55:38 compute-1 polkitd[6517]: Operator of unix-process:65742:240911 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.551 [<unknown>] (owned by unix-user:zuul)
Sep 30 20:55:39 compute-1 sshd-session[61049]: Connection closed by 192.168.122.30 port 60232
Sep 30 20:55:39 compute-1 sshd-session[61046]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:55:39 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Sep 30 20:55:39 compute-1 systemd[1]: session-15.scope: Consumed 23.643s CPU time.
Sep 30 20:55:39 compute-1 systemd-logind[793]: Session 15 logged out. Waiting for processes to exit.
Sep 30 20:55:39 compute-1 systemd-logind[793]: Removed session 15.
Sep 30 20:55:51 compute-1 sshd-session[65780]: Accepted publickey for zuul from 192.168.122.30 port 57718 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:55:51 compute-1 systemd-logind[793]: New session 16 of user zuul.
Sep 30 20:55:51 compute-1 systemd[1]: Started Session 16 of User zuul.
Sep 30 20:55:51 compute-1 sshd-session[65780]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:55:53 compute-1 python3.9[65933]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:55:54 compute-1 sudo[66087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgneermumiwpgseqlwbhzbcbplqcklmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265753.5468113-65-35332495411535/AnsiballZ_file.py'
Sep 30 20:55:54 compute-1 sudo[66087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:54 compute-1 python3.9[66089]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:54 compute-1 sudo[66087]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:55 compute-1 sudo[66262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibcjnqoixwomtjjkskgaksjfsrddsjtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265754.6116328-89-66905499297775/AnsiballZ_stat.py'
Sep 30 20:55:55 compute-1 sudo[66262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:55 compute-1 python3.9[66264]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:55 compute-1 sudo[66262]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:55 compute-1 sudo[66340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftlwtmcrjecahfqgnpyaioncoffjejbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265754.6116328-89-66905499297775/AnsiballZ_file.py'
Sep 30 20:55:55 compute-1 sudo[66340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:55 compute-1 python3.9[66342]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.xfg25xdi recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:55 compute-1 sudo[66340]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:56 compute-1 sudo[66492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkxruruvwxcawejwkqjtqampyxqxtwdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265756.382035-149-185387203739676/AnsiballZ_stat.py'
Sep 30 20:55:56 compute-1 sudo[66492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:56 compute-1 python3.9[66494]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:56 compute-1 sudo[66492]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:57 compute-1 sudo[66570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxwnjgbzwdxvginrqxdykplszeifgywb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265756.382035-149-185387203739676/AnsiballZ_file.py'
Sep 30 20:55:57 compute-1 sudo[66570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:57 compute-1 python3.9[66572]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.tzyta5l6 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:57 compute-1 sudo[66570]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:58 compute-1 sudo[66722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glvpajdepleqfcfidsyvdmhfbacqzvkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265757.7960854-188-275294193011656/AnsiballZ_file.py'
Sep 30 20:55:58 compute-1 sudo[66722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:58 compute-1 python3.9[66724]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:55:58 compute-1 sudo[66722]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:59 compute-1 sudo[66874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjluprgvxaggflhakdnckhwzegujeawy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265758.6245468-212-162221247772054/AnsiballZ_stat.py'
Sep 30 20:55:59 compute-1 sudo[66874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:59 compute-1 python3.9[66876]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:59 compute-1 sudo[66874]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:59 compute-1 sudo[66952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayzhlovgzjhfzdyxhwbysammqroxpvfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265758.6245468-212-162221247772054/AnsiballZ_file.py'
Sep 30 20:55:59 compute-1 sudo[66952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:59 compute-1 python3.9[66954]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:55:59 compute-1 sudo[66952]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:00 compute-1 sudo[67104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgmoakjhohcduvwrizyvukndcjwbgrzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265759.9433444-212-157946670559986/AnsiballZ_stat.py'
Sep 30 20:56:00 compute-1 sudo[67104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:00 compute-1 python3.9[67106]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:00 compute-1 sudo[67104]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:00 compute-1 sudo[67182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atyvyavdxorbspcpbgmponklyxvdnktb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265759.9433444-212-157946670559986/AnsiballZ_file.py'
Sep 30 20:56:00 compute-1 sudo[67182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:00 compute-1 python3.9[67184]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:56:00 compute-1 sudo[67182]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:01 compute-1 sudo[67334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnfshpcxoeoelvxqebkzzekmeshwlgnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265761.1903994-281-64255000163357/AnsiballZ_file.py'
Sep 30 20:56:01 compute-1 sudo[67334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:01 compute-1 python3.9[67336]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:01 compute-1 sudo[67334]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:02 compute-1 sudo[67486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gztljvdjoyqazityqpdrootwqgwnvkjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265762.1374512-305-211418550055554/AnsiballZ_stat.py'
Sep 30 20:56:02 compute-1 sudo[67486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:02 compute-1 python3.9[67488]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:02 compute-1 sudo[67486]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:02 compute-1 sudo[67564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmtfvzngmjavuijimupmibjdrwoshtcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265762.1374512-305-211418550055554/AnsiballZ_file.py'
Sep 30 20:56:02 compute-1 sudo[67564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:03 compute-1 python3.9[67566]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:03 compute-1 sudo[67564]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:03 compute-1 sudo[67716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqordienszlxylpeiqwmpbpauerwvfyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265763.4239888-341-162994658814050/AnsiballZ_stat.py'
Sep 30 20:56:03 compute-1 sudo[67716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:03 compute-1 python3.9[67718]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:03 compute-1 sudo[67716]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:04 compute-1 sudo[67794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iagxtbxdlykmrbjnapbshddzqcrebtbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265763.4239888-341-162994658814050/AnsiballZ_file.py'
Sep 30 20:56:04 compute-1 sudo[67794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:04 compute-1 python3.9[67796]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:04 compute-1 sudo[67794]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:05 compute-1 sudo[67946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imckmjqbomttxhmvpfxmjgevopnawhjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265764.684155-377-257992174229925/AnsiballZ_systemd.py'
Sep 30 20:56:05 compute-1 sudo[67946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:05 compute-1 python3.9[67948]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:56:05 compute-1 systemd[1]: Reloading.
Sep 30 20:56:05 compute-1 systemd-rc-local-generator[67973]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:56:05 compute-1 systemd-sysv-generator[67977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:56:06 compute-1 sudo[67946]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:06 compute-1 sudo[68134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbtmqsivamojgcbaiuhodvcxivqhpfyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265766.3548715-401-81910246326875/AnsiballZ_stat.py'
Sep 30 20:56:06 compute-1 sudo[68134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:07 compute-1 python3.9[68136]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:07 compute-1 sudo[68134]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:07 compute-1 sudo[68212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pewogjveqkxvrxflwzdiknncfldzdmcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265766.3548715-401-81910246326875/AnsiballZ_file.py'
Sep 30 20:56:07 compute-1 sudo[68212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:07 compute-1 python3.9[68214]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:07 compute-1 sudo[68212]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:08 compute-1 sudo[68364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdcclbqbsyxdwhhsbrtexqnaiwzfsyuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265767.8665636-437-270568920328867/AnsiballZ_stat.py'
Sep 30 20:56:08 compute-1 sudo[68364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:08 compute-1 python3.9[68366]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:08 compute-1 sudo[68364]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:08 compute-1 sudo[68442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfwcryytebrtdxduabptfofwdhmaoeuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265767.8665636-437-270568920328867/AnsiballZ_file.py'
Sep 30 20:56:08 compute-1 sudo[68442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:08 compute-1 python3.9[68444]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:08 compute-1 sudo[68442]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:09 compute-1 sudo[68594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdnatzsspybxqcitreelcnjmobohjcmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265769.257543-473-47696573871517/AnsiballZ_systemd.py'
Sep 30 20:56:09 compute-1 sudo[68594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:09 compute-1 python3.9[68596]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:56:09 compute-1 systemd[1]: Reloading.
Sep 30 20:56:10 compute-1 systemd-rc-local-generator[68623]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:56:10 compute-1 systemd-sysv-generator[68627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:56:10 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 20:56:10 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 20:56:10 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 20:56:10 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 20:56:10 compute-1 sudo[68594]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:11 compute-1 python3.9[68788]: ansible-ansible.builtin.service_facts Invoked
Sep 30 20:56:11 compute-1 network[68805]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 20:56:11 compute-1 network[68806]: 'network-scripts' will be removed from distribution in near future.
Sep 30 20:56:11 compute-1 network[68807]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 20:56:17 compute-1 sudo[69068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrbllqwrnwqazckzvrnsrqyuazlshkex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265776.7941048-551-184360843861041/AnsiballZ_stat.py'
Sep 30 20:56:17 compute-1 sudo[69068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:17 compute-1 python3.9[69070]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:17 compute-1 sudo[69068]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:17 compute-1 sudo[69146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slylyqtpqjsrtrotdrkbmfjyvrzmkdpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265776.7941048-551-184360843861041/AnsiballZ_file.py'
Sep 30 20:56:17 compute-1 sudo[69146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:17 compute-1 python3.9[69148]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:18 compute-1 sudo[69146]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:18 compute-1 sudo[69298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugkpnnkytcaprldhmjmqckccktzshbnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265778.309087-590-11722312056689/AnsiballZ_file.py'
Sep 30 20:56:18 compute-1 sudo[69298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:18 compute-1 python3.9[69300]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:18 compute-1 sudo[69298]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:19 compute-1 sudo[69450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mshihcsklqynhjzbnbwllmvptvxeudph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265779.0880003-614-212244387544311/AnsiballZ_stat.py'
Sep 30 20:56:19 compute-1 sudo[69450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:19 compute-1 python3.9[69452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:19 compute-1 sudo[69450]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:20 compute-1 sudo[69573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhtbxlfkzpzluqfvntzznsxmawkprzgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265779.0880003-614-212244387544311/AnsiballZ_copy.py'
Sep 30 20:56:20 compute-1 sudo[69573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:20 compute-1 python3.9[69575]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265779.0880003-614-212244387544311/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:20 compute-1 sudo[69573]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:21 compute-1 sudo[69725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndsemzpfwtiquqdymhzexongwycoojuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265780.9924138-668-151704730710912/AnsiballZ_timezone.py'
Sep 30 20:56:21 compute-1 sudo[69725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:21 compute-1 python3.9[69727]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 20:56:21 compute-1 systemd[1]: Starting Time & Date Service...
Sep 30 20:56:21 compute-1 systemd[1]: Started Time & Date Service.
Sep 30 20:56:21 compute-1 sudo[69725]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:22 compute-1 sudo[69881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucpodpzedjcptiszpvpjctedckvzmvhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265782.144235-695-21233856853387/AnsiballZ_file.py'
Sep 30 20:56:22 compute-1 sudo[69881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:22 compute-1 python3.9[69883]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:22 compute-1 sudo[69881]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:23 compute-1 sudo[70033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvdhwjvayfaqldqsysudxgglltitjqbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265783.0274365-719-152749415615278/AnsiballZ_stat.py'
Sep 30 20:56:23 compute-1 sudo[70033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:23 compute-1 python3.9[70035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:23 compute-1 sudo[70033]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:24 compute-1 sudo[70156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgjepvrhugohkuhsvsqkazkojxolrtkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265783.0274365-719-152749415615278/AnsiballZ_copy.py'
Sep 30 20:56:24 compute-1 sudo[70156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:24 compute-1 python3.9[70158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265783.0274365-719-152749415615278/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:24 compute-1 sudo[70156]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:25 compute-1 sudo[70308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixgcorncadayrpydkbntambifwhsaxlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265784.761201-764-133805440896312/AnsiballZ_stat.py'
Sep 30 20:56:25 compute-1 sudo[70308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:25 compute-1 python3.9[70310]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:25 compute-1 sudo[70308]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:25 compute-1 sudo[70431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfrsncverpulnqrhvleoorecruccofoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265784.761201-764-133805440896312/AnsiballZ_copy.py'
Sep 30 20:56:25 compute-1 sudo[70431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:25 compute-1 python3.9[70433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265784.761201-764-133805440896312/.source.yaml _original_basename=.6e6nngbc follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:25 compute-1 sudo[70431]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:26 compute-1 sudo[70583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnowwuytcfxwdcrhweefaiyorkvyoqni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265786.3523088-809-82339254112869/AnsiballZ_stat.py'
Sep 30 20:56:26 compute-1 sudo[70583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:26 compute-1 python3.9[70585]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:26 compute-1 sudo[70583]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:27 compute-1 sudo[70706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwshtfinwriblfromzeeqjwcjwrtylxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265786.3523088-809-82339254112869/AnsiballZ_copy.py'
Sep 30 20:56:27 compute-1 sudo[70706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:27 compute-1 python3.9[70708]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265786.3523088-809-82339254112869/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:27 compute-1 sudo[70706]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:28 compute-1 sudo[70858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueeyaemenxvzdenaugyuduwavyxfxkbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265788.0097454-854-45344709340935/AnsiballZ_command.py'
Sep 30 20:56:28 compute-1 sudo[70858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:28 compute-1 python3.9[70860]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:28 compute-1 sudo[70858]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:29 compute-1 sudo[71011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrqoxtjzqcrsxfhmfmidligbdiplojnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265788.9449973-878-118140111252555/AnsiballZ_command.py'
Sep 30 20:56:29 compute-1 sudo[71011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:29 compute-1 python3.9[71013]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:29 compute-1 sudo[71011]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:30 compute-1 sudo[71164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjzxxqlcywkvrubrvuypooedxbtetzns ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265789.9627612-902-214851990875944/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 20:56:30 compute-1 sudo[71164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:30 compute-1 python3[71166]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 20:56:30 compute-1 sudo[71164]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:31 compute-1 sudo[71316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emeplvuewlsvqzqhxiosxbljvijrnkux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265791.1745293-926-33111241269958/AnsiballZ_stat.py'
Sep 30 20:56:31 compute-1 sudo[71316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:31 compute-1 python3.9[71318]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:31 compute-1 sudo[71316]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:32 compute-1 sudo[71439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akrfrsjrydyuwywghtxdluanuqjmtchu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265791.1745293-926-33111241269958/AnsiballZ_copy.py'
Sep 30 20:56:32 compute-1 sudo[71439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:32 compute-1 python3.9[71441]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265791.1745293-926-33111241269958/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:32 compute-1 sudo[71439]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:33 compute-1 sudo[71591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hszkmeqttjtkqftujggbdcggxdktmugw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265792.708249-971-53688138308618/AnsiballZ_stat.py'
Sep 30 20:56:33 compute-1 sudo[71591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:33 compute-1 python3.9[71593]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:33 compute-1 sudo[71591]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:33 compute-1 sudo[71714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogchuentgwkmioxpyyowojukvzbvhplp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265792.708249-971-53688138308618/AnsiballZ_copy.py'
Sep 30 20:56:33 compute-1 sudo[71714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:33 compute-1 python3.9[71716]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265792.708249-971-53688138308618/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:34 compute-1 sudo[71714]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:34 compute-1 sudo[71866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qezccfgrupxtbatizsdyvpbotwnhuljm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265794.2300155-1016-26629963735181/AnsiballZ_stat.py'
Sep 30 20:56:34 compute-1 sudo[71866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:34 compute-1 python3.9[71868]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:34 compute-1 sudo[71866]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:35 compute-1 sudo[71989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isscvegnlksaiynkzcxrhzdgxjdcukte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265794.2300155-1016-26629963735181/AnsiballZ_copy.py'
Sep 30 20:56:35 compute-1 sudo[71989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:35 compute-1 python3.9[71991]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265794.2300155-1016-26629963735181/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:35 compute-1 sudo[71989]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:36 compute-1 sshd-session[71992]: Invalid user  from 47.86.37.20 port 37570
Sep 30 20:56:36 compute-1 sudo[72143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvzsurzknehwrafmmhkrtmwdwbtzydts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265796.040914-1061-12264215352729/AnsiballZ_stat.py'
Sep 30 20:56:36 compute-1 sudo[72143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:36 compute-1 python3.9[72145]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:36 compute-1 sudo[72143]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:37 compute-1 sudo[72266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcsgriqwsijvxeejnmmmqdhozfrahkdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265796.040914-1061-12264215352729/AnsiballZ_copy.py'
Sep 30 20:56:37 compute-1 sudo[72266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:37 compute-1 python3.9[72268]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265796.040914-1061-12264215352729/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:37 compute-1 sudo[72266]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:38 compute-1 sudo[72418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcwpsoaefhydjdxhudzbomqlqynsoyhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265797.6490362-1106-233006916491277/AnsiballZ_stat.py'
Sep 30 20:56:38 compute-1 sudo[72418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:38 compute-1 python3.9[72420]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:38 compute-1 sudo[72418]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:38 compute-1 sudo[72541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tivqqjpusctcqgcgfuqggsgntzzqavaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265797.6490362-1106-233006916491277/AnsiballZ_copy.py'
Sep 30 20:56:38 compute-1 sudo[72541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:38 compute-1 python3.9[72543]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265797.6490362-1106-233006916491277/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:38 compute-1 sudo[72541]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:39 compute-1 sudo[72693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvbfcefmlodrbbmgoqvqdxotrkvkpwvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265799.219244-1151-162067243366861/AnsiballZ_file.py'
Sep 30 20:56:39 compute-1 sudo[72693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:39 compute-1 python3.9[72695]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:39 compute-1 sudo[72693]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:40 compute-1 sudo[72845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfpidqmqkyqignmtiesusypvyzifjxmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265800.1113873-1175-181976347865707/AnsiballZ_command.py'
Sep 30 20:56:40 compute-1 sudo[72845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:40 compute-1 python3.9[72847]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:40 compute-1 sudo[72845]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:41 compute-1 sudo[73004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilagmwxtsrzpzpkjuiikenxinwulldcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265801.0754466-1199-226838476972230/AnsiballZ_blockinfile.py'
Sep 30 20:56:41 compute-1 sudo[73004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:41 compute-1 python3.9[73006]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:41 compute-1 sudo[73004]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:42 compute-1 sudo[73157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-turzqfjzqslhrjjxmyxqtlenrtvqajte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265802.2333744-1226-15555058436962/AnsiballZ_file.py'
Sep 30 20:56:42 compute-1 sudo[73157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:42 compute-1 python3.9[73159]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:42 compute-1 sudo[73157]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:43 compute-1 sshd-session[71992]: Connection closed by invalid user  47.86.37.20 port 37570 [preauth]
Sep 30 20:56:43 compute-1 sudo[73309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptjuwijrdraubxqnazgluhkbgzcbfpju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265803.0514548-1226-155354236948729/AnsiballZ_file.py'
Sep 30 20:56:43 compute-1 sudo[73309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:43 compute-1 chronyd[61020]: Selected source 162.159.200.1 (pool.ntp.org)
Sep 30 20:56:43 compute-1 python3.9[73311]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:43 compute-1 sudo[73309]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:44 compute-1 sudo[73461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzzwnrcxazwmjdmgmdjqqunulyewjsvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265803.8145764-1271-79613747115395/AnsiballZ_mount.py'
Sep 30 20:56:44 compute-1 sudo[73461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:44 compute-1 python3.9[73463]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 20:56:44 compute-1 sudo[73461]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:44 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:56:44 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:56:45 compute-1 sudo[73615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usqpkzlhfroktcrgkurlqdqhnfafsriy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265804.7264001-1271-277544410231180/AnsiballZ_mount.py'
Sep 30 20:56:45 compute-1 sudo[73615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:45 compute-1 python3.9[73617]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 20:56:45 compute-1 sudo[73615]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:46 compute-1 sshd-session[65783]: Connection closed by 192.168.122.30 port 57718
Sep 30 20:56:46 compute-1 sshd-session[65780]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:56:46 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Sep 30 20:56:46 compute-1 systemd[1]: session-16.scope: Consumed 36.623s CPU time.
Sep 30 20:56:46 compute-1 systemd-logind[793]: Session 16 logged out. Waiting for processes to exit.
Sep 30 20:56:46 compute-1 systemd-logind[793]: Removed session 16.
Sep 30 20:56:51 compute-1 sshd-session[73644]: Accepted publickey for zuul from 192.168.122.30 port 47038 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:56:51 compute-1 systemd-logind[793]: New session 17 of user zuul.
Sep 30 20:56:51 compute-1 systemd[1]: Started Session 17 of User zuul.
Sep 30 20:56:51 compute-1 sshd-session[73644]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:56:51 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 20:56:51 compute-1 sudo[73799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxnjhuohflqhwnegiscqpqqhlhzosvgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265811.3947394-24-241561147958939/AnsiballZ_tempfile.py'
Sep 30 20:56:51 compute-1 sudo[73799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:52 compute-1 python3.9[73801]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Sep 30 20:56:52 compute-1 sudo[73799]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:52 compute-1 sudo[73951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhqvtlnijkizcnznbxexhtgrpuoitvnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265812.4183736-60-241594468958937/AnsiballZ_stat.py'
Sep 30 20:56:52 compute-1 sudo[73951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:53 compute-1 python3.9[73953]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:56:53 compute-1 sudo[73951]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:54 compute-1 sudo[74103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvyrghkibieuxlfujyutpsgklkaaebmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265813.524117-90-248553126611016/AnsiballZ_setup.py'
Sep 30 20:56:54 compute-1 sudo[74103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:54 compute-1 python3.9[74105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:56:54 compute-1 sudo[74103]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:55 compute-1 sudo[74255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suovetvbyntjpxngbnatbdrpfywgddnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265815.2346601-116-256587527217651/AnsiballZ_blockinfile.py'
Sep 30 20:56:55 compute-1 sudo[74255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:55 compute-1 python3.9[74257]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwbguriNnKe1J2ZDq7gor8K7Q+lfGJj0jSgjJsKEAo86u38uv5C6xHzMNeAjrvv5lEpFfzr0c3it3JW+dozBbfDykJ0wkEOeR0LHleEuFWTmPiG4RPVZ1m1J78yNSFWhD5VykkBwAqirHFrpywQIhDpw77JTQCi/xxeNvkj+vXx01l2nVO8cvDgHufzI+lR13XAFYs2zFc66/eOej+HaLiLK9IJQxGbnRa6+QZHQ/W2ou3kUAJYqnXBI8i1n0DyPWcfnDiKxmNlhvOFCVrRYKldToQO2oDNq7UFG3hUDzhT6NKABZTCQ/V/AXTwChfyV+kOR3+MOcdbvsymKiU5eIqWfFljY8pnxR//zYoaSfKNoi6fCUnilUdrEMOEzqDDgVhnS7ml2uz9JTRHuHbN8onUffLvBa7V/nLvlcv6WSbHHiQ/+janZ294/vCQuDTupMIAIvWj8cWHV3qv/Y/quONpe+lDn2/vxourJmj+K/mlEnSWCs49ztenPV15oUk51c=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHhbJv8lx2yVUO6nnQVlESK5ivpJT0r/PkYxvcd6qUmW
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH6lp5638PMrvExSWZKHaxHkYDLYTl4v6jLIL5XRhmXjHAbQo8UkrhUVofrCF0y9aTPCQ4m6QhJ9ntO/Pb9rWgE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCdFkN8U9xxznTs4l1dWKSgqtKSYoqjAq8N76IdRfa7xWuGXIDRX+Z/XS7MRDpugTVs4N689TRx9Piour7pbLLbVTdHtKAVY1ZFbO52TOv/Ya6j0Royn16s8ReajdozIptjKmHy9G2FpdOX7C4Y20cMciRCgKF+Uk1cb0iX7vYYWIprMI2dgvdoP3rAj0PkVPXji+oCGf4tEApwuWhji+GnWIVl/vFVGzg0S/OILQhkMHPBMMFdTA5/Xg4Z/liXQoQ4zDyYzjvYLXLSky6ySf/RJ58ny5ps0tRkbkvC0rwt37FhZbmlGFIZg33S799Zjt5rvLF5JSwuXGUsXu8EJQBHo5+QJwXxOcVlBJtbEkk/A5ashHxFcql24pTE/TJfJyvpvM3rRhZR8I/8DmYiyhblB27IxHhXGGeoQ17NZFuGnTwfGeyShJ642Bm+i/bAWgoanfixF1edEObUsck8KEpBwun3G/SLIba+hLXnGRGtjbEnsn7rANIXVoHUeM4xMEE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEj+XpmwzMgMg+VNuVHNqnvVSOmbrJ0iinPB93cL5gcO
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGEM3NBdfKRXE3PLB677JgGmO3w3HhbkbYxeBS7PkJCAqAqklzpLc5E0r4ovcfzPiQaR/ONvG1z+RYgVwf+jfWU=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCWm45T0rx8eEGSOQf1IMie1aUJ+iwjTlwiWrmoGXyOX+gHYFcAwEzYr7tsCLrK7pQ2HBdBK6ZtTkFBMiRtZrZIzZ9bSytf3lXlayZia2khpG4ghK/3E9JZ4ThQmEAGxzPaFT+MCqKmmeWpsp5RQ8atdu+RkIEt95H8nJyBQYl2J9/PauZdleaGqWV7ah8ftqHvSfMtzljAlJqsazcPIq+1WteG18MMZGKoaGbNluITShBILFneVDfzdgT+BfoMOI3UgEO5EEOsf0+VW6Hd9nL3myviOEWD6FPOGUD0eofeXmCI94vefLVl7jMnyd2iEeNjhIE1lB70kyyQCvqvRgfmWJ7fEyeebIJ0y4YrfVLENILH2N1Q7OFvYZHxGEZFAtyPkWKsaHdPUlFCD7VsEK1NQMkLsmmRzp8umfqerUlpAerD1JkAv4kjRita1IOu1/qy562Ohzwhlfdc4zUprZwVOWtsETDD7Wvu4H5YKJQnxzFlWVeemsl8popivMpCaX0=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIM8lvg/9vtwYg/yl4PrNQ2IuRQZaYQt0XQPWZcbyMzH9
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD+C/rnBvt2r65Gl0M5vYcnCxgN/0t0Q/XcUf4UnaG+S2BBadWzDkctg8AqKsRNiacbXLRPVhzNMBUwp9JQsW5s=
                                             create=True mode=0644 path=/tmp/ansible.dw4_rb4z state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:55 compute-1 sudo[74255]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:56 compute-1 sudo[74407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psjhtdlggoizinpotnwawqkabjzdszct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265816.332733-139-144791782613584/AnsiballZ_command.py'
Sep 30 20:56:56 compute-1 sudo[74407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:57 compute-1 python3.9[74409]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.dw4_rb4z' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:57 compute-1 sudo[74407]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:57 compute-1 sudo[74561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrkrfrzqsfkokeuodcsjikrgjezkweiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265817.4160624-163-257080679548004/AnsiballZ_file.py'
Sep 30 20:56:57 compute-1 sudo[74561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:58 compute-1 python3.9[74563]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.dw4_rb4z state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:58 compute-1 sudo[74561]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:58 compute-1 sshd-session[73647]: Connection closed by 192.168.122.30 port 47038
Sep 30 20:56:58 compute-1 sshd-session[73644]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:56:58 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Sep 30 20:56:58 compute-1 systemd[1]: session-17.scope: Consumed 4.268s CPU time.
Sep 30 20:56:58 compute-1 systemd-logind[793]: Session 17 logged out. Waiting for processes to exit.
Sep 30 20:56:58 compute-1 systemd-logind[793]: Removed session 17.
Sep 30 20:57:03 compute-1 sshd-session[74588]: Accepted publickey for zuul from 192.168.122.30 port 37044 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:57:03 compute-1 systemd-logind[793]: New session 18 of user zuul.
Sep 30 20:57:03 compute-1 systemd[1]: Started Session 18 of User zuul.
Sep 30 20:57:03 compute-1 sshd-session[74588]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:57:04 compute-1 python3.9[74741]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:57:06 compute-1 sudo[74895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llmgdugqxjrolrmcnidmmuvlifsegavz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265825.305425-62-202247018297737/AnsiballZ_systemd.py'
Sep 30 20:57:06 compute-1 sudo[74895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:06 compute-1 python3.9[74897]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 20:57:06 compute-1 sudo[74895]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:06 compute-1 sudo[75049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruipvkvbuliovpautrptioyoacttenjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265826.6089525-86-246964879445687/AnsiballZ_systemd.py'
Sep 30 20:57:06 compute-1 sudo[75049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:07 compute-1 python3.9[75051]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:57:07 compute-1 sudo[75049]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:08 compute-1 sudo[75202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuzablptjxaogtnwqoobhleibwcsghjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265827.6643517-113-116078797494111/AnsiballZ_command.py'
Sep 30 20:57:08 compute-1 sudo[75202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:08 compute-1 python3.9[75204]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:57:08 compute-1 sudo[75202]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:09 compute-1 sudo[75355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vramhhlqxhntdguquvdlvzstixmvylqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265828.6091547-137-216925260148020/AnsiballZ_stat.py'
Sep 30 20:57:09 compute-1 sudo[75355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:09 compute-1 python3.9[75357]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:57:09 compute-1 sudo[75355]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:09 compute-1 sudo[75509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddfjochydefiwwzynftcgyclbdaqujwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265829.6330564-161-213254463269886/AnsiballZ_command.py'
Sep 30 20:57:09 compute-1 sudo[75509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:10 compute-1 python3.9[75511]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:57:10 compute-1 sudo[75509]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:11 compute-1 sudo[75664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akfsubvbfsderekisvkymdqxurjjcoiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265830.484431-185-244788876963929/AnsiballZ_file.py'
Sep 30 20:57:11 compute-1 sudo[75664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:11 compute-1 python3.9[75666]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:11 compute-1 sudo[75664]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:11 compute-1 sshd-session[74591]: Connection closed by 192.168.122.30 port 37044
Sep 30 20:57:11 compute-1 sshd-session[74588]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:57:11 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Sep 30 20:57:11 compute-1 systemd[1]: session-18.scope: Consumed 5.385s CPU time.
Sep 30 20:57:11 compute-1 systemd-logind[793]: Session 18 logged out. Waiting for processes to exit.
Sep 30 20:57:11 compute-1 systemd-logind[793]: Removed session 18.
Sep 30 20:57:16 compute-1 sshd-session[75691]: Accepted publickey for zuul from 192.168.122.30 port 46190 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:57:16 compute-1 systemd-logind[793]: New session 19 of user zuul.
Sep 30 20:57:16 compute-1 systemd[1]: Started Session 19 of User zuul.
Sep 30 20:57:16 compute-1 sshd-session[75691]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:57:17 compute-1 python3.9[75844]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:57:18 compute-1 sudo[75998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agzipetoityzklpwtvqfnanytreoticz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265838.317304-68-51037563577599/AnsiballZ_setup.py'
Sep 30 20:57:18 compute-1 sudo[75998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:18 compute-1 python3.9[76000]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:57:19 compute-1 sudo[75998]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:19 compute-1 sudo[76082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emfknwvdhkqwepxmztmvptbfmnieaemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265838.317304-68-51037563577599/AnsiballZ_dnf.py'
Sep 30 20:57:19 compute-1 sudo[76082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:19 compute-1 python3.9[76084]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:57:20 compute-1 sudo[76082]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:21 compute-1 python3.9[76235]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:57:23 compute-1 python3.9[76386]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 20:57:24 compute-1 python3.9[76536]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:57:25 compute-1 python3.9[76686]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:57:25 compute-1 sshd-session[75694]: Connection closed by 192.168.122.30 port 46190
Sep 30 20:57:25 compute-1 sshd-session[75691]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:57:25 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Sep 30 20:57:25 compute-1 systemd[1]: session-19.scope: Consumed 6.662s CPU time.
Sep 30 20:57:25 compute-1 systemd-logind[793]: Session 19 logged out. Waiting for processes to exit.
Sep 30 20:57:25 compute-1 systemd-logind[793]: Removed session 19.
Sep 30 20:57:28 compute-1 sshd-session[76711]: Connection closed by authenticating user sshd 194.0.234.19 port 33626 [preauth]
Sep 30 20:57:30 compute-1 sshd-session[76713]: Accepted publickey for zuul from 192.168.122.30 port 58460 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:57:30 compute-1 systemd-logind[793]: New session 20 of user zuul.
Sep 30 20:57:30 compute-1 systemd[1]: Started Session 20 of User zuul.
Sep 30 20:57:30 compute-1 sshd-session[76713]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:57:31 compute-1 python3.9[76866]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:57:33 compute-1 sudo[77020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqpusomlqgylrnbvslpbswubphxujdne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265853.0014613-117-74599601733391/AnsiballZ_file.py'
Sep 30 20:57:33 compute-1 sudo[77020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:33 compute-1 python3.9[77022]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:33 compute-1 sudo[77020]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:34 compute-1 sudo[77172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdpegtyuxetrslvgqtzfbpvfqwsfyspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265853.8888876-117-36984007251047/AnsiballZ_file.py'
Sep 30 20:57:34 compute-1 sudo[77172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:34 compute-1 python3.9[77174]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:34 compute-1 sudo[77172]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:35 compute-1 sudo[77324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtyuomgvgeymdvridqmavappwvliqbjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265854.6697059-161-220515982210783/AnsiballZ_stat.py'
Sep 30 20:57:35 compute-1 sudo[77324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:35 compute-1 python3.9[77326]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:35 compute-1 sudo[77324]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:35 compute-1 sudo[77447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmmetdlvzdkxzqqvpexjztrztokzneuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265854.6697059-161-220515982210783/AnsiballZ_copy.py'
Sep 30 20:57:35 compute-1 sudo[77447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:36 compute-1 python3.9[77449]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265854.6697059-161-220515982210783/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=cbeda61f5170bbf7616e0783486c3dd9e308e127 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:36 compute-1 sudo[77447]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:36 compute-1 sudo[77599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgolqcpliphosoxytahainjokfnswytb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265856.2077067-161-214390177072696/AnsiballZ_stat.py'
Sep 30 20:57:36 compute-1 sudo[77599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:36 compute-1 python3.9[77601]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:36 compute-1 sudo[77599]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:37 compute-1 sudo[77722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtayfcxgjuwuefvjfeicmewxfkunsdty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265856.2077067-161-214390177072696/AnsiballZ_copy.py'
Sep 30 20:57:37 compute-1 sudo[77722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:37 compute-1 python3.9[77724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265856.2077067-161-214390177072696/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=32428e3eac4ecab15356e47557337caf0347c55c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:37 compute-1 sudo[77722]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:37 compute-1 sudo[77874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrvgupsqiisauwpizdpfojpnhupjakjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265857.4752727-161-108720797455139/AnsiballZ_stat.py'
Sep 30 20:57:37 compute-1 sudo[77874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:37 compute-1 python3.9[77876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:37 compute-1 sudo[77874]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:38 compute-1 sudo[77997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reyypowlhgonuddyglbcuabsmqxzlmua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265857.4752727-161-108720797455139/AnsiballZ_copy.py'
Sep 30 20:57:38 compute-1 sudo[77997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:38 compute-1 python3.9[77999]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265857.4752727-161-108720797455139/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=24e3f1ca6e13f25897bcfcf9f3457513062f39df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:38 compute-1 sudo[77997]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:39 compute-1 sudo[78149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-socmhfgulbznqnipegktwnvntpvhkkqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265858.777197-288-245499761734140/AnsiballZ_file.py'
Sep 30 20:57:39 compute-1 sudo[78149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:39 compute-1 python3.9[78151]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:39 compute-1 sudo[78149]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:39 compute-1 sudo[78301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xplsbtycujshxzgaamzugnyqliraqder ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265859.4660547-288-248015615455687/AnsiballZ_file.py'
Sep 30 20:57:39 compute-1 sudo[78301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:40 compute-1 python3.9[78303]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:40 compute-1 sudo[78301]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:40 compute-1 sudo[78453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvblpnwuzofwkilivmvbescwmytxesoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265860.2292914-335-105501904328185/AnsiballZ_stat.py'
Sep 30 20:57:40 compute-1 sudo[78453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:40 compute-1 python3.9[78455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:40 compute-1 sudo[78453]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:41 compute-1 sudo[78576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgjkvzljwkzgesbofvvinejqdxvuowgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265860.2292914-335-105501904328185/AnsiballZ_copy.py'
Sep 30 20:57:41 compute-1 sudo[78576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:41 compute-1 python3.9[78578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265860.2292914-335-105501904328185/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=1988a49c36ab0563bc8ee68ea79c6979b5cba4fb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:41 compute-1 sudo[78576]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:41 compute-1 sudo[78728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hplwdqfenebizlcbpfvghvxsknvgvxaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265861.592341-335-256405259240907/AnsiballZ_stat.py'
Sep 30 20:57:41 compute-1 sudo[78728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:42 compute-1 python3.9[78730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:42 compute-1 sudo[78728]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:42 compute-1 sudo[78851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clnbfqfokifkifkzuevfeqewzntlheze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265861.592341-335-256405259240907/AnsiballZ_copy.py'
Sep 30 20:57:42 compute-1 sudo[78851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:42 compute-1 python3.9[78853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265861.592341-335-256405259240907/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=c427c888ca1014d618bead98d3cdf4a14d714172 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:42 compute-1 sudo[78851]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:43 compute-1 sudo[79003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfnqqfodfsgleniuulugeagawmjgdscx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265862.8369439-335-253056116414345/AnsiballZ_stat.py'
Sep 30 20:57:43 compute-1 sudo[79003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:43 compute-1 python3.9[79005]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:43 compute-1 sudo[79003]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:43 compute-1 sudo[79126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eguhicgdycxkrimdnosdxjzfppersjqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265862.8369439-335-253056116414345/AnsiballZ_copy.py'
Sep 30 20:57:43 compute-1 sudo[79126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:43 compute-1 python3.9[79128]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265862.8369439-335-253056116414345/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=11d36286378153bcc71c1ab24a24bdb478cffe62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:43 compute-1 sudo[79126]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:44 compute-1 sudo[79278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plvajbmyruzskydrdgalneoniittzjew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265864.080739-460-266806677896764/AnsiballZ_file.py'
Sep 30 20:57:44 compute-1 sudo[79278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:44 compute-1 python3.9[79280]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:44 compute-1 sudo[79278]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:45 compute-1 sudo[79430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymlrdwhlbaydfucrdibaxtjuecljxwgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265864.811946-460-138132635161471/AnsiballZ_file.py'
Sep 30 20:57:45 compute-1 sudo[79430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:45 compute-1 python3.9[79432]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:45 compute-1 sudo[79430]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:45 compute-1 sudo[79582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocaifrlcttcgscataptgbkepzdugplki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265865.5986278-507-97390332877879/AnsiballZ_stat.py'
Sep 30 20:57:45 compute-1 sudo[79582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:46 compute-1 python3.9[79584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:46 compute-1 sudo[79582]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:46 compute-1 sudo[79705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctskqiktaepqnvyhvwhabgjeinbptacr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265865.5986278-507-97390332877879/AnsiballZ_copy.py'
Sep 30 20:57:46 compute-1 sudo[79705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:46 compute-1 python3.9[79707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265865.5986278-507-97390332877879/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=ca31085461bc577d9f3629a25c74bb41989a29be backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:46 compute-1 sudo[79705]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:47 compute-1 sudo[79857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsochaxdmphczkyhtfdplqcqppkcpjvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265867.03111-507-174511191021252/AnsiballZ_stat.py'
Sep 30 20:57:47 compute-1 sudo[79857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:47 compute-1 python3.9[79859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:47 compute-1 sudo[79857]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:48 compute-1 sudo[79980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-makhmxwbksqpouphjvfszvpjxkhyfnwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265867.03111-507-174511191021252/AnsiballZ_copy.py'
Sep 30 20:57:48 compute-1 sudo[79980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:48 compute-1 python3.9[79982]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265867.03111-507-174511191021252/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=19543d2d8d9468ce3b30d32ed411afaff23b13eb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:48 compute-1 sudo[79980]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:48 compute-1 sudo[80132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzqfkmotlzzsrjsimmjznkrbmnyozvmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265868.3558779-507-172131782598413/AnsiballZ_stat.py'
Sep 30 20:57:48 compute-1 sudo[80132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:48 compute-1 python3.9[80134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:48 compute-1 sudo[80132]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:49 compute-1 sudo[80255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaivjmubssyaqnytjdrffvpvqycubbdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265868.3558779-507-172131782598413/AnsiballZ_copy.py'
Sep 30 20:57:49 compute-1 sudo[80255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:49 compute-1 python3.9[80257]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265868.3558779-507-172131782598413/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9696b679eb033e91617141fd0924b37cf7637bda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:49 compute-1 sudo[80255]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:49 compute-1 sshd-session[80334]: banner exchange: Connection from 88.214.25.123 port 65419: invalid format
Sep 30 20:57:50 compute-1 sudo[80408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdwspcutbjicbmztjfukbdihmsssmklg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265869.7270482-641-193105928169244/AnsiballZ_file.py'
Sep 30 20:57:50 compute-1 sudo[80408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:50 compute-1 python3.9[80410]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:50 compute-1 sudo[80408]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:50 compute-1 sudo[80560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzpdxkrsvzdhumfuzoqanzirewyucjst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265870.4290535-641-135252670813731/AnsiballZ_file.py'
Sep 30 20:57:50 compute-1 sudo[80560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:50 compute-1 python3.9[80562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:50 compute-1 sudo[80560]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:51 compute-1 sudo[80712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urhtphjkkgmhhufhbisjiimeybsnrttg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265871.1962404-681-58601500947772/AnsiballZ_stat.py'
Sep 30 20:57:51 compute-1 sudo[80712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:51 compute-1 python3.9[80714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:51 compute-1 sudo[80712]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:52 compute-1 sudo[80835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldvzlrhhemtbkipbdnkumqqnwslefoki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265871.1962404-681-58601500947772/AnsiballZ_copy.py'
Sep 30 20:57:52 compute-1 sudo[80835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:52 compute-1 python3.9[80837]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265871.1962404-681-58601500947772/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=a96e4268050ffac96592b2c71b656edf5af46464 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:52 compute-1 sudo[80835]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:52 compute-1 sudo[80987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udzjxjorxyityljzcoextupdbongmzpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265872.4853354-681-153562343418422/AnsiballZ_stat.py'
Sep 30 20:57:52 compute-1 sudo[80987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:52 compute-1 python3.9[80989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:53 compute-1 sudo[80987]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:53 compute-1 sudo[81110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbkkvbenukpfsivyreezrsrkbcztkdmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265872.4853354-681-153562343418422/AnsiballZ_copy.py'
Sep 30 20:57:53 compute-1 sudo[81110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:53 compute-1 python3.9[81112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265872.4853354-681-153562343418422/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=19543d2d8d9468ce3b30d32ed411afaff23b13eb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:53 compute-1 sudo[81110]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:54 compute-1 sudo[81262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaigqdfsfvopluktehkgyrokhpofhjgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265873.7833586-681-182442877250059/AnsiballZ_stat.py'
Sep 30 20:57:54 compute-1 sudo[81262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:54 compute-1 python3.9[81264]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:54 compute-1 sudo[81262]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:54 compute-1 sudo[81385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wetavgslijraihbsfbxmvlekpkhmxloo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265873.7833586-681-182442877250059/AnsiballZ_copy.py'
Sep 30 20:57:54 compute-1 sudo[81385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:55 compute-1 python3.9[81387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265873.7833586-681-182442877250059/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=331f520cf31dd139966a0c2305afcff39ba71f3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:55 compute-1 sudo[81385]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:56 compute-1 sudo[81537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvpliiuvtkitxvgbfolkcdyceytwlmxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265875.925236-838-136185587631709/AnsiballZ_file.py'
Sep 30 20:57:56 compute-1 sudo[81537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:56 compute-1 python3.9[81539]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:56 compute-1 sudo[81537]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:57 compute-1 sudo[81689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiqlpcvohmaeifthundjdnomzzxvjjpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265876.7240229-889-240887499105420/AnsiballZ_stat.py'
Sep 30 20:57:57 compute-1 sudo[81689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:57 compute-1 python3.9[81691]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:57 compute-1 sudo[81689]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:57 compute-1 sudo[81812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbvqdqsoghsgwjcnggihvpffvrfzgttz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265876.7240229-889-240887499105420/AnsiballZ_copy.py'
Sep 30 20:57:57 compute-1 sudo[81812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:57 compute-1 python3.9[81814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265876.7240229-889-240887499105420/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:57 compute-1 sudo[81812]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:58 compute-1 sudo[81964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meqjgiwoiqikfnzbheidnymfmokkdcan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265878.1272771-938-231595499103195/AnsiballZ_file.py'
Sep 30 20:57:58 compute-1 sudo[81964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:58 compute-1 python3.9[81966]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:58 compute-1 sudo[81964]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:59 compute-1 sudo[82116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uawwjkaxwtrnblqausarseqjqzuxbtnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265878.8849604-964-162507885253010/AnsiballZ_stat.py'
Sep 30 20:57:59 compute-1 sudo[82116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:59 compute-1 python3.9[82118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:59 compute-1 sudo[82116]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:59 compute-1 sudo[82239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmszmifhwkcyjlsyyhwmqnkaexeveskt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265878.8849604-964-162507885253010/AnsiballZ_copy.py'
Sep 30 20:57:59 compute-1 sudo[82239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:00 compute-1 python3.9[82241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265878.8849604-964-162507885253010/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:00 compute-1 sudo[82239]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:00 compute-1 sudo[82391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xytseanbmjdvtwgmbqdmsdkdbppcwepv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265880.386728-1012-147382578809084/AnsiballZ_file.py'
Sep 30 20:58:00 compute-1 sudo[82391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:00 compute-1 python3.9[82393]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:00 compute-1 sudo[82391]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:01 compute-1 sudo[82543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tenoqwmdsbujiqfperaqapeeazllicep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265881.1829095-1037-198644173135962/AnsiballZ_stat.py'
Sep 30 20:58:01 compute-1 sudo[82543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:01 compute-1 python3.9[82545]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:01 compute-1 sudo[82543]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:02 compute-1 sudo[82666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfrngptfjbocvjtiljlnexpdyrmgcdbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265881.1829095-1037-198644173135962/AnsiballZ_copy.py'
Sep 30 20:58:02 compute-1 sudo[82666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:02 compute-1 python3.9[82668]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265881.1829095-1037-198644173135962/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:02 compute-1 sudo[82666]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:03 compute-1 sudo[82818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtrlfpqsmcgolyyzhcmtjbsjqwqwevdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265882.9392157-1087-232097559327872/AnsiballZ_file.py'
Sep 30 20:58:03 compute-1 sudo[82818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:03 compute-1 python3.9[82820]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:03 compute-1 sudo[82818]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:04 compute-1 sudo[82970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzjdsqbyzenckvkykiycfczpljpsmveb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265883.814638-1113-253347382038002/AnsiballZ_stat.py'
Sep 30 20:58:04 compute-1 sudo[82970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:04 compute-1 python3.9[82972]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:04 compute-1 sudo[82970]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:04 compute-1 sudo[83093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwoottnogxhlcvcwhmogckzqcafvbfsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265883.814638-1113-253347382038002/AnsiballZ_copy.py'
Sep 30 20:58:04 compute-1 sudo[83093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:04 compute-1 python3.9[83095]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265883.814638-1113-253347382038002/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:04 compute-1 sudo[83093]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:05 compute-1 sudo[83245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mysrtuucxkcqjavketnwdtdqvdpszlbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265885.1991599-1160-262413944123601/AnsiballZ_file.py'
Sep 30 20:58:05 compute-1 sudo[83245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:05 compute-1 python3.9[83247]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:05 compute-1 sudo[83245]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:06 compute-1 sudo[83397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anycgckpwepmdvvyzqizddmjruwhutpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265885.890298-1183-75127145811980/AnsiballZ_stat.py'
Sep 30 20:58:06 compute-1 sudo[83397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:06 compute-1 python3.9[83399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:06 compute-1 sudo[83397]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:06 compute-1 sudo[83520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eafphgoxgltjxvejcbbfbrqtxkemxuxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265885.890298-1183-75127145811980/AnsiballZ_copy.py'
Sep 30 20:58:06 compute-1 sudo[83520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:07 compute-1 python3.9[83522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265885.890298-1183-75127145811980/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:07 compute-1 sudo[83520]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:07 compute-1 sudo[83672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hulxiznjqiokyuodhykfbqqkzydracnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265887.390894-1234-265059356656835/AnsiballZ_file.py'
Sep 30 20:58:07 compute-1 sudo[83672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:08 compute-1 python3.9[83674]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:08 compute-1 sudo[83672]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:08 compute-1 sudo[83824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfmbsnyftqriogzamhurahuygvqujen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265888.2528596-1261-182745063462836/AnsiballZ_stat.py'
Sep 30 20:58:08 compute-1 sudo[83824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:08 compute-1 python3.9[83826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:08 compute-1 sudo[83824]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:09 compute-1 sudo[83947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqoxfldqwmedowsmsmrhfldrfygkoaid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265888.2528596-1261-182745063462836/AnsiballZ_copy.py'
Sep 30 20:58:09 compute-1 sudo[83947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:09 compute-1 python3.9[83949]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265888.2528596-1261-182745063462836/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:09 compute-1 sudo[83947]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:10 compute-1 sudo[84099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bekczlnngxhkxchuvgisdmqhyvdggyns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265889.745457-1308-227785048954663/AnsiballZ_file.py'
Sep 30 20:58:10 compute-1 sudo[84099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:10 compute-1 python3.9[84101]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:10 compute-1 sudo[84099]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:10 compute-1 sudo[84251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eokuabxcdhiwbuavqjpbsgxrcifiayti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265890.530038-1336-22754529843826/AnsiballZ_stat.py'
Sep 30 20:58:10 compute-1 sudo[84251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:11 compute-1 python3.9[84253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:11 compute-1 sudo[84251]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:11 compute-1 sudo[84374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xclzowqufwtmlcbtgxftjfrgeknzxobb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265890.530038-1336-22754529843826/AnsiballZ_copy.py'
Sep 30 20:58:11 compute-1 sudo[84374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:11 compute-1 python3.9[84376]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265890.530038-1336-22754529843826/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:11 compute-1 sudo[84374]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:12 compute-1 sshd-session[76716]: Connection closed by 192.168.122.30 port 58460
Sep 30 20:58:12 compute-1 sshd-session[76713]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:58:12 compute-1 systemd-logind[793]: Session 20 logged out. Waiting for processes to exit.
Sep 30 20:58:12 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Sep 30 20:58:12 compute-1 systemd[1]: session-20.scope: Consumed 32.358s CPU time.
Sep 30 20:58:12 compute-1 systemd-logind[793]: Removed session 20.
Sep 30 20:58:13 compute-1 PackageKit[31666]: daemon quit
Sep 30 20:58:13 compute-1 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 20:58:17 compute-1 sshd-session[84402]: Accepted publickey for zuul from 192.168.122.30 port 33284 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:58:17 compute-1 systemd-logind[793]: New session 21 of user zuul.
Sep 30 20:58:17 compute-1 systemd[1]: Started Session 21 of User zuul.
Sep 30 20:58:17 compute-1 sshd-session[84402]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:58:18 compute-1 python3.9[84555]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:58:19 compute-1 sudo[84709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psagxuvrbfzwbkbwxeumowhzaymepksx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265898.9662821-68-45920218047517/AnsiballZ_file.py'
Sep 30 20:58:19 compute-1 sudo[84709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:19 compute-1 python3.9[84711]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:19 compute-1 sudo[84709]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:20 compute-1 sudo[84861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqrwogqqoieezozagyjsbuopbzbmrvka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265899.8408976-68-272375992201864/AnsiballZ_file.py'
Sep 30 20:58:20 compute-1 sudo[84861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:20 compute-1 python3.9[84863]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:20 compute-1 sudo[84861]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:21 compute-1 python3.9[85013]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:58:21 compute-1 sudo[85163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buqmiejbxkapbxuphurthtpfmlbdmjkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265901.4449205-137-254092584389557/AnsiballZ_seboolean.py'
Sep 30 20:58:21 compute-1 sudo[85163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:22 compute-1 python3.9[85165]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 20:58:23 compute-1 sudo[85163]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:23 compute-1 sudo[85319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxqtonyofplzsllvcldplvefkodlsbps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265903.6332667-167-112900888545079/AnsiballZ_setup.py'
Sep 30 20:58:23 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Sep 30 20:58:23 compute-1 sudo[85319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:24 compute-1 python3.9[85321]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:58:24 compute-1 sudo[85319]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:24 compute-1 sudo[85403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvnwnlnakajdzszgteurjzohylujlhrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265903.6332667-167-112900888545079/AnsiballZ_dnf.py'
Sep 30 20:58:24 compute-1 sudo[85403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:25 compute-1 python3.9[85405]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:58:26 compute-1 sudo[85403]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:27 compute-1 sudo[85556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hognlnvprlajxoiwiqqejcpburlowmay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265906.6021008-203-111366648339772/AnsiballZ_systemd.py'
Sep 30 20:58:27 compute-1 sudo[85556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:27 compute-1 python3.9[85558]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 20:58:27 compute-1 sudo[85556]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:28 compute-1 sudo[85711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyfjkmkajdcrwykqwzzveftjyaxccvxn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265907.8647199-227-136974645941841/AnsiballZ_edpm_nftables_snippet.py'
Sep 30 20:58:28 compute-1 sudo[85711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:28 compute-1 python3[85713]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Sep 30 20:58:28 compute-1 sudo[85711]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:29 compute-1 sudo[85863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krwvjmmosuwrmirrersxqnurrqvgqaog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265908.9043348-254-118175416727121/AnsiballZ_file.py'
Sep 30 20:58:29 compute-1 sudo[85863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:29 compute-1 python3.9[85865]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:29 compute-1 sudo[85863]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:30 compute-1 sudo[86015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyshcocngyveinfgeaqxbygeszxqyjuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265909.6624415-278-186267949979990/AnsiballZ_stat.py'
Sep 30 20:58:30 compute-1 sudo[86015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:30 compute-1 python3.9[86017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:30 compute-1 sudo[86015]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:30 compute-1 sudo[86093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbfoqvcwzmhpzpqbhgmitucphxbmvanl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265909.6624415-278-186267949979990/AnsiballZ_file.py'
Sep 30 20:58:30 compute-1 sudo[86093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:30 compute-1 python3.9[86095]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:30 compute-1 sudo[86093]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:31 compute-1 sudo[86245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrakbhhfjjrquigrrplogdaamlifqsmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265911.063001-314-146022461163711/AnsiballZ_stat.py'
Sep 30 20:58:31 compute-1 sudo[86245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:31 compute-1 python3.9[86247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:31 compute-1 sudo[86245]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:31 compute-1 sudo[86323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgbhdaxqthioqqqzwybatayaosvjazyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265911.063001-314-146022461163711/AnsiballZ_file.py'
Sep 30 20:58:31 compute-1 sudo[86323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:32 compute-1 python3.9[86325]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qzg0wo0a recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:32 compute-1 sudo[86323]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:32 compute-1 sudo[86475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evyugsoajfrkvqikpofhqcxyaugburfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265912.279856-350-69236222730363/AnsiballZ_stat.py'
Sep 30 20:58:32 compute-1 sudo[86475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:32 compute-1 python3.9[86477]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:32 compute-1 sudo[86475]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:33 compute-1 sudo[86553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rijhuhsjxirlxlwvkfjkknmzrspnbozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265912.279856-350-69236222730363/AnsiballZ_file.py'
Sep 30 20:58:33 compute-1 sudo[86553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:33 compute-1 python3.9[86555]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:33 compute-1 sudo[86553]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:34 compute-1 sudo[86705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trymftmjujkqylgtbbycqlghwlriinja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265913.5846088-389-263047428188691/AnsiballZ_command.py'
Sep 30 20:58:34 compute-1 sudo[86705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:34 compute-1 python3.9[86707]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:34 compute-1 sudo[86705]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:35 compute-1 sudo[86858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arpistoelvodyxgrphieudgniwypewgb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265914.6292927-413-248214620983225/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 20:58:35 compute-1 sudo[86858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:35 compute-1 python3[86860]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 20:58:35 compute-1 sudo[86858]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:36 compute-1 sudo[87010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avmeidfmtbfzzryvkzeodrwndkhctznw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265915.637817-437-105984742337773/AnsiballZ_stat.py'
Sep 30 20:58:36 compute-1 sudo[87010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:36 compute-1 python3.9[87012]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:36 compute-1 sudo[87010]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:36 compute-1 sudo[87135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-forgdbxmfmqkzvrqdkecxstzingkwznx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265915.637817-437-105984742337773/AnsiballZ_copy.py'
Sep 30 20:58:36 compute-1 sudo[87135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:37 compute-1 python3.9[87137]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265915.637817-437-105984742337773/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:37 compute-1 sudo[87135]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:37 compute-1 sudo[87287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umqccleshyosexcrjmnbnlxdgxgqdrea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265917.2770612-482-270359040088077/AnsiballZ_stat.py'
Sep 30 20:58:37 compute-1 sudo[87287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:37 compute-1 python3.9[87289]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:37 compute-1 sudo[87287]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:38 compute-1 sudo[87412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkydsocgmrishdearnfkhqiapxcgqenf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265917.2770612-482-270359040088077/AnsiballZ_copy.py'
Sep 30 20:58:38 compute-1 sudo[87412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:38 compute-1 python3.9[87414]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265917.2770612-482-270359040088077/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:38 compute-1 sudo[87412]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:39 compute-1 sudo[87564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byqtxxhdosujcjnbdwknohikxnylbqyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265918.9279869-527-61499637794183/AnsiballZ_stat.py'
Sep 30 20:58:39 compute-1 sudo[87564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:39 compute-1 python3.9[87566]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:39 compute-1 sudo[87564]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:40 compute-1 sudo[87689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zysblhidtnjcxgslrtzmzrmkoixtpcnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265918.9279869-527-61499637794183/AnsiballZ_copy.py'
Sep 30 20:58:40 compute-1 sudo[87689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:40 compute-1 python3.9[87691]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265918.9279869-527-61499637794183/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:40 compute-1 sudo[87689]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:40 compute-1 sudo[87841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqtqttujulsekgvqoodahdfonyyqldpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265920.4277456-572-122891769675319/AnsiballZ_stat.py'
Sep 30 20:58:40 compute-1 sudo[87841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:41 compute-1 python3.9[87843]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:41 compute-1 sudo[87841]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:41 compute-1 sudo[87966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjuhdijrdixpynohijrhqgbydrparpsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265920.4277456-572-122891769675319/AnsiballZ_copy.py'
Sep 30 20:58:41 compute-1 sudo[87966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:41 compute-1 python3.9[87968]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265920.4277456-572-122891769675319/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:41 compute-1 sudo[87966]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:42 compute-1 sudo[88118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bonyanjjdchkdluxjmtphymzvzlapclq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265921.8997796-617-4885775081787/AnsiballZ_stat.py'
Sep 30 20:58:42 compute-1 sudo[88118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:42 compute-1 python3.9[88120]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:42 compute-1 sudo[88118]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:43 compute-1 sudo[88243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gclrgrfcjayuqxidptsgxmndskkkvrur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265921.8997796-617-4885775081787/AnsiballZ_copy.py'
Sep 30 20:58:43 compute-1 sudo[88243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:43 compute-1 python3.9[88245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265921.8997796-617-4885775081787/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:43 compute-1 sudo[88243]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:43 compute-1 sudo[88395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbnjqayaxkauuoskrnoaecqvojpjuvnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265923.4885094-662-116492897715479/AnsiballZ_file.py'
Sep 30 20:58:43 compute-1 sudo[88395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:43 compute-1 python3.9[88397]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:44 compute-1 sudo[88395]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:44 compute-1 sudo[88547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcndroxvptsufzhfyxoeknsmipdujphy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265924.209059-686-222471556456314/AnsiballZ_command.py'
Sep 30 20:58:44 compute-1 sudo[88547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:44 compute-1 python3.9[88549]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:44 compute-1 sudo[88547]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:45 compute-1 sudo[88702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oazkboubyzcphdlptokqavtikvfnqabf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265925.078712-710-193185880136619/AnsiballZ_blockinfile.py'
Sep 30 20:58:45 compute-1 sudo[88702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:45 compute-1 python3.9[88704]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:45 compute-1 sudo[88702]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:46 compute-1 sudo[88854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqodesutdxotvygstmcmnfxckypvesuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265926.2476053-737-58293721961309/AnsiballZ_command.py'
Sep 30 20:58:46 compute-1 sudo[88854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:46 compute-1 python3.9[88856]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:46 compute-1 sudo[88854]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:47 compute-1 sudo[89007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldcpdjgaajqearyvkvxcrzwmwgyuxlba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265927.060831-761-6249141830415/AnsiballZ_stat.py'
Sep 30 20:58:47 compute-1 sudo[89007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:47 compute-1 python3.9[89009]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:58:47 compute-1 sudo[89007]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:48 compute-1 sudo[89161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahfecrztjdogwfdgytoiwpwlkczxbpet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265927.7930923-785-268517694649739/AnsiballZ_command.py'
Sep 30 20:58:48 compute-1 sudo[89161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:48 compute-1 python3.9[89163]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:48 compute-1 sudo[89161]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:49 compute-1 sudo[89316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjsyruciypxxjrfsujezjiichnxxjank ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265929.037061-809-123643661665985/AnsiballZ_file.py'
Sep 30 20:58:49 compute-1 sudo[89316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:49 compute-1 python3.9[89318]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:49 compute-1 sudo[89316]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:50 compute-1 python3.9[89468]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:58:51 compute-1 sudo[89619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibrhczhhyimyfdghrtoxsyahyqoqoopl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265931.6669002-929-109624255095370/AnsiballZ_command.py'
Sep 30 20:58:51 compute-1 sudo[89619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:52 compute-1 python3.9[89621]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:74:f6:ca:ec" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:52 compute-1 ovs-vsctl[89622]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:74:f6:ca:ec external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Sep 30 20:58:52 compute-1 sudo[89619]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:52 compute-1 sudo[89772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acrbdouwdrucjnoevrguzwycchfhklus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265932.5349069-956-168304131082054/AnsiballZ_command.py'
Sep 30 20:58:52 compute-1 sudo[89772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:53 compute-1 python3.9[89774]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:53 compute-1 sudo[89772]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:53 compute-1 sudo[89927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmouyeouktuevmlumrrwmuyfoxckbltw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265933.4180546-980-244198938825827/AnsiballZ_command.py'
Sep 30 20:58:53 compute-1 sudo[89927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:53 compute-1 python3.9[89929]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:53 compute-1 ovs-vsctl[89930]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Sep 30 20:58:54 compute-1 sudo[89927]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:54 compute-1 python3.9[90080]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:58:55 compute-1 sudo[90232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kocnlcqdrwsfuqxrdemfpvonoetfqscc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265935.0967767-1031-151005866637159/AnsiballZ_file.py'
Sep 30 20:58:55 compute-1 sudo[90232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:55 compute-1 python3.9[90234]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:55 compute-1 sudo[90232]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:56 compute-1 sudo[90384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxeupsyxxefhopomjzrtdamglemuphco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265935.961007-1055-167075111030439/AnsiballZ_stat.py'
Sep 30 20:58:56 compute-1 sudo[90384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:56 compute-1 python3.9[90386]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:56 compute-1 sudo[90384]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:56 compute-1 sudo[90462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iswawofrhxjfmgeyhixkzvguktacllrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265935.961007-1055-167075111030439/AnsiballZ_file.py'
Sep 30 20:58:56 compute-1 sudo[90462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:57 compute-1 python3.9[90464]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:57 compute-1 sudo[90462]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:57 compute-1 sudo[90614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmphlzjgloyaxlhjcknocywyitdqwdlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265937.1890247-1055-138335096785810/AnsiballZ_stat.py'
Sep 30 20:58:57 compute-1 sudo[90614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:57 compute-1 python3.9[90616]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:57 compute-1 sudo[90614]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:57 compute-1 sudo[90692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glsadlkqahmdcukqgstcachcqxbgttak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265937.1890247-1055-138335096785810/AnsiballZ_file.py'
Sep 30 20:58:57 compute-1 sudo[90692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:58 compute-1 python3.9[90694]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:58 compute-1 sudo[90692]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:58 compute-1 sudo[90844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxcekgbqmccjzhxoyhucdzsbapvzhtzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265938.6397545-1124-34000080694086/AnsiballZ_file.py'
Sep 30 20:58:58 compute-1 sudo[90844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:59 compute-1 python3.9[90846]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:59 compute-1 sudo[90844]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:59 compute-1 sudo[90996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noqrjwawbtxkzbfrqktzgagavbbndbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265939.5452135-1148-196139043185152/AnsiballZ_stat.py'
Sep 30 20:58:59 compute-1 sudo[90996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:00 compute-1 python3.9[90998]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:00 compute-1 sudo[90996]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:00 compute-1 sudo[91074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhgtqcjbpuwjaajogidxmbabwtkekoae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265939.5452135-1148-196139043185152/AnsiballZ_file.py'
Sep 30 20:59:00 compute-1 sudo[91074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:00 compute-1 python3.9[91076]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:00 compute-1 sudo[91074]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:01 compute-1 sudo[91226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhtxuqorexfllvzwzdvzhsmdivupowzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265941.31037-1184-88418180048460/AnsiballZ_stat.py'
Sep 30 20:59:01 compute-1 sudo[91226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:01 compute-1 python3.9[91228]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:01 compute-1 sudo[91226]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:02 compute-1 sudo[91304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtomwesjveodfsaqzyacsbckxymxrqrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265941.31037-1184-88418180048460/AnsiballZ_file.py'
Sep 30 20:59:02 compute-1 sudo[91304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:02 compute-1 python3.9[91306]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:02 compute-1 sudo[91304]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:02 compute-1 sudo[91456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbuwnqnxzkcccskjalbtpsplvtsgngzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265942.6421285-1220-232107595972058/AnsiballZ_systemd.py'
Sep 30 20:59:02 compute-1 sudo[91456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:03 compute-1 python3.9[91458]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:59:03 compute-1 systemd[1]: Reloading.
Sep 30 20:59:03 compute-1 systemd-sysv-generator[91486]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:03 compute-1 systemd-rc-local-generator[91481]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:03 compute-1 sudo[91456]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:04 compute-1 sudo[91645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruivaxihxrnldnixbhzlodxllskujbxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265943.8760047-1244-28469360276408/AnsiballZ_stat.py'
Sep 30 20:59:04 compute-1 sudo[91645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:04 compute-1 python3.9[91647]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:04 compute-1 sudo[91645]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:04 compute-1 sudo[91723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwqqyfvxirvdlhuoyxiokvhzaadcbujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265943.8760047-1244-28469360276408/AnsiballZ_file.py'
Sep 30 20:59:04 compute-1 sudo[91723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:04 compute-1 python3.9[91725]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:04 compute-1 sudo[91723]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:05 compute-1 sudo[91875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shaumkspuqxecqrgzjlhzjschmsjvexe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265945.314753-1280-162101379531414/AnsiballZ_stat.py'
Sep 30 20:59:05 compute-1 sudo[91875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:05 compute-1 python3.9[91877]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:05 compute-1 sudo[91875]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:06 compute-1 sudo[91953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvwnsncryyrfdpevrcpaqibmrikzmfsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265945.314753-1280-162101379531414/AnsiballZ_file.py'
Sep 30 20:59:06 compute-1 sudo[91953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:06 compute-1 python3.9[91955]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:06 compute-1 sudo[91953]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:07 compute-1 sudo[92105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toughvxxfljgafbketvysinqwoorcyzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265946.671353-1316-15658887272315/AnsiballZ_systemd.py'
Sep 30 20:59:07 compute-1 sudo[92105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:07 compute-1 python3.9[92107]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:59:07 compute-1 systemd[1]: Reloading.
Sep 30 20:59:07 compute-1 systemd-sysv-generator[92136]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:07 compute-1 systemd-rc-local-generator[92133]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:08 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 20:59:08 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 20:59:08 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 20:59:08 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 20:59:08 compute-1 sudo[92105]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:09 compute-1 sudo[92297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thruhfzivrrmlxdkxverjkvkwxtbpqyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265948.987898-1346-219012437032794/AnsiballZ_file.py'
Sep 30 20:59:09 compute-1 sudo[92297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:09 compute-1 python3.9[92299]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:09 compute-1 sudo[92297]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:10 compute-1 sudo[92449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icsghnjmnbghqdbezdwxmdfptjgytygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265949.899573-1370-244432426172415/AnsiballZ_stat.py'
Sep 30 20:59:10 compute-1 sudo[92449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:10 compute-1 python3.9[92451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:10 compute-1 sudo[92449]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:10 compute-1 sudo[92572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voeiwlyypvkxbgllxweyyqnxliifsvgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265949.899573-1370-244432426172415/AnsiballZ_copy.py'
Sep 30 20:59:10 compute-1 sudo[92572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:11 compute-1 python3.9[92574]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265949.899573-1370-244432426172415/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:11 compute-1 sudo[92572]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:11 compute-1 sudo[92724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvjvwzxqrmegdwaynevbbskopukaqwfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265951.5415719-1421-245295150966085/AnsiballZ_file.py'
Sep 30 20:59:11 compute-1 sudo[92724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:12 compute-1 python3.9[92726]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:12 compute-1 sudo[92724]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:12 compute-1 sudo[92876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppkczpdajqeoukbwirramagmctfuzwik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265952.4668322-1445-62251091310252/AnsiballZ_stat.py'
Sep 30 20:59:12 compute-1 sudo[92876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:12 compute-1 python3.9[92878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:12 compute-1 sudo[92876]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:13 compute-1 sudo[92999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcbazllcailqzdloncqfkjqsmaupzqut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265952.4668322-1445-62251091310252/AnsiballZ_copy.py'
Sep 30 20:59:13 compute-1 sudo[92999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:13 compute-1 python3.9[93001]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265952.4668322-1445-62251091310252/.source.json _original_basename=.fe5fvgrq follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:13 compute-1 sudo[92999]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:14 compute-1 sudo[93151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beoviiorbflwgivgxlfmwrkvwlxpysdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265954.0631735-1490-227209409246343/AnsiballZ_file.py'
Sep 30 20:59:14 compute-1 sudo[93151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:14 compute-1 python3.9[93153]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:14 compute-1 sudo[93151]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:15 compute-1 sudo[93303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gexvipbteyufqtntmjdfqpqemiboceis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265955.0084372-1514-133191142887211/AnsiballZ_stat.py'
Sep 30 20:59:15 compute-1 sudo[93303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:15 compute-1 sudo[93303]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:16 compute-1 sudo[93426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcmjybkqiiarxpvdouejpmciywhtjqtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265955.0084372-1514-133191142887211/AnsiballZ_copy.py'
Sep 30 20:59:16 compute-1 sudo[93426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:16 compute-1 sudo[93426]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:17 compute-1 sudo[93578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncihasubpgfoveanyysktczlbfplwkgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265956.580182-1565-272189931250348/AnsiballZ_container_config_data.py'
Sep 30 20:59:17 compute-1 sudo[93578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:17 compute-1 python3.9[93580]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Sep 30 20:59:17 compute-1 sudo[93578]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:18 compute-1 sudo[93730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qycueiinvsfrvkyvavfnjjnxyydmullv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265957.584335-1592-64857737467832/AnsiballZ_container_config_hash.py'
Sep 30 20:59:18 compute-1 sudo[93730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:18 compute-1 python3.9[93732]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 20:59:18 compute-1 sudo[93730]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:19 compute-1 sudo[93882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjscckejagfueuirbxityjkahveqpiyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265958.6430058-1619-228307435752782/AnsiballZ_podman_container_info.py'
Sep 30 20:59:19 compute-1 sudo[93882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:19 compute-1 python3.9[93884]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 20:59:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:19 compute-1 sudo[93882]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:20 compute-1 sudo[94046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcurziumbsqbrbhmpheeazzeqzspnjep ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265960.0563896-1658-74115782525355/AnsiballZ_edpm_container_manage.py'
Sep 30 20:59:20 compute-1 sudo[94046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:20 compute-1 python3[94048]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 20:59:20 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:20 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:21 compute-1 podman[94085]: 2025-09-30 20:59:21.017812053 +0000 UTC m=+0.069691170 container create 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 20:59:21 compute-1 podman[94085]: 2025-09-30 20:59:20.979338202 +0000 UTC m=+0.031217359 image pull 7ffac6b06b247caf26cf673b775a5f070f2fa1a6008cf0b0964af7e905ba86a5 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Sep 30 20:59:21 compute-1 python3[94048]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Sep 30 20:59:21 compute-1 sudo[94046]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:21 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:22 compute-1 sudo[94273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edcqajzfcenekfwkuxwmlqddfpimnryp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265962.0067036-1682-154763613078817/AnsiballZ_stat.py'
Sep 30 20:59:22 compute-1 sudo[94273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:22 compute-1 python3.9[94275]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:59:22 compute-1 sudo[94273]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:23 compute-1 sudo[94427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtmvjzzxyjgzaazdcrkbgnpfneqmqmfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265962.8480837-1709-116531301516220/AnsiballZ_file.py'
Sep 30 20:59:23 compute-1 sudo[94427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:23 compute-1 python3.9[94429]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:23 compute-1 sudo[94427]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:23 compute-1 sudo[94503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgnqszwxjawuakixmunetktqumrmmqou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265962.8480837-1709-116531301516220/AnsiballZ_stat.py'
Sep 30 20:59:23 compute-1 sudo[94503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:23 compute-1 python3.9[94505]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:59:23 compute-1 sudo[94503]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:24 compute-1 sudo[94654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhjimlminyehrtjliccmitbcwwgzstuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265963.9026654-1709-251541955752479/AnsiballZ_copy.py'
Sep 30 20:59:24 compute-1 sudo[94654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:24 compute-1 python3.9[94656]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759265963.9026654-1709-251541955752479/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:24 compute-1 sudo[94654]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:24 compute-1 sudo[94730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okfalpfhohvoftyzhqiwhsdgqzbnnpnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265963.9026654-1709-251541955752479/AnsiballZ_systemd.py'
Sep 30 20:59:24 compute-1 sudo[94730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:25 compute-1 python3.9[94732]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 20:59:25 compute-1 systemd[1]: Reloading.
Sep 30 20:59:25 compute-1 systemd-rc-local-generator[94760]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:25 compute-1 systemd-sysv-generator[94763]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:25 compute-1 sudo[94730]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:25 compute-1 sudo[94841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojjkefantyrzdzixahyifrupixnqiucg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265963.9026654-1709-251541955752479/AnsiballZ_systemd.py'
Sep 30 20:59:25 compute-1 sudo[94841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:26 compute-1 python3.9[94843]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:59:26 compute-1 systemd[1]: Reloading.
Sep 30 20:59:26 compute-1 systemd-rc-local-generator[94875]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:26 compute-1 systemd-sysv-generator[94879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:26 compute-1 systemd[1]: Starting ovn_controller container...
Sep 30 20:59:26 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Sep 30 20:59:26 compute-1 systemd[1]: Started libcrun container.
Sep 30 20:59:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9a8dcdc1e3d051f2223a8aa444031f8a594582d934ff95ec1eb3bfbf26943fa/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 20:59:26 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489.
Sep 30 20:59:26 compute-1 podman[94886]: 2025-09-30 20:59:26.662483737 +0000 UTC m=+0.217498716 container init 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 20:59:26 compute-1 ovn_controller[94902]: + sudo -E kolla_set_configs
Sep 30 20:59:26 compute-1 podman[94886]: 2025-09-30 20:59:26.697487886 +0000 UTC m=+0.252502785 container start 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923)
Sep 30 20:59:26 compute-1 systemd[1]: Created slice User Slice of UID 0.
Sep 30 20:59:26 compute-1 edpm-start-podman-container[94886]: ovn_controller
Sep 30 20:59:26 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 20:59:26 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 20:59:26 compute-1 systemd[1]: Starting User Manager for UID 0...
Sep 30 20:59:26 compute-1 systemd[94934]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 20:59:26 compute-1 edpm-start-podman-container[94885]: Creating additional drop-in dependency for "ovn_controller" (4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489)
Sep 30 20:59:26 compute-1 systemd[1]: Reloading.
Sep 30 20:59:26 compute-1 podman[94908]: 2025-09-30 20:59:26.820942054 +0000 UTC m=+0.101314841 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 20:59:26 compute-1 systemd-rc-local-generator[94989]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:26 compute-1 systemd-sysv-generator[94992]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:26 compute-1 systemd[94934]: Queued start job for default target Main User Target.
Sep 30 20:59:26 compute-1 systemd[94934]: Created slice User Application Slice.
Sep 30 20:59:26 compute-1 systemd[94934]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 20:59:26 compute-1 systemd[94934]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 20:59:26 compute-1 systemd[94934]: Reached target Paths.
Sep 30 20:59:26 compute-1 systemd[94934]: Reached target Timers.
Sep 30 20:59:26 compute-1 systemd[94934]: Starting D-Bus User Message Bus Socket...
Sep 30 20:59:26 compute-1 systemd[94934]: Starting Create User's Volatile Files and Directories...
Sep 30 20:59:26 compute-1 systemd[94934]: Finished Create User's Volatile Files and Directories.
Sep 30 20:59:26 compute-1 systemd[94934]: Listening on D-Bus User Message Bus Socket.
Sep 30 20:59:26 compute-1 systemd[94934]: Reached target Sockets.
Sep 30 20:59:26 compute-1 systemd[94934]: Reached target Basic System.
Sep 30 20:59:26 compute-1 systemd[94934]: Reached target Main User Target.
Sep 30 20:59:26 compute-1 systemd[94934]: Startup finished in 168ms.
Sep 30 20:59:27 compute-1 systemd[1]: Started User Manager for UID 0.
Sep 30 20:59:27 compute-1 systemd[1]: Started ovn_controller container.
Sep 30 20:59:27 compute-1 systemd[1]: 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489-10c1a326462e994.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 20:59:27 compute-1 systemd[1]: 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489-10c1a326462e994.service: Failed with result 'exit-code'.
Sep 30 20:59:27 compute-1 systemd[1]: Started Session c1 of User root.
Sep 30 20:59:27 compute-1 sudo[94841]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:27 compute-1 ovn_controller[94902]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 20:59:27 compute-1 ovn_controller[94902]: INFO:__main__:Validating config file
Sep 30 20:59:27 compute-1 ovn_controller[94902]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 20:59:27 compute-1 ovn_controller[94902]: INFO:__main__:Writing out command to execute
Sep 30 20:59:27 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Sep 30 20:59:27 compute-1 ovn_controller[94902]: ++ cat /run_command
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + ARGS=
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + sudo kolla_copy_cacerts
Sep 30 20:59:27 compute-1 systemd[1]: Started Session c2 of User root.
Sep 30 20:59:27 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + [[ ! -n '' ]]
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + . kolla_extend_start
Sep 30 20:59:27 compute-1 ovn_controller[94902]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + umask 0022
Sep 30 20:59:27 compute-1 ovn_controller[94902]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.1771] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.1779] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.1789] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.1795] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.1799] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 20:59:27 compute-1 kernel: br-int: entered promiscuous mode
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00014|main|INFO|OVS feature set changed, force recompute.
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00022|main|INFO|OVS feature set changed, force recompute.
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Sep 30 20:59:27 compute-1 systemd-udevd[95035]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-1 ovn_controller[94902]: 2025-09-30T20:59:27Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.2608] manager: (ovn-3b817c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.2615] manager: (ovn-7db753-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Sep 30 20:59:27 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.2810] device (genev_sys_6081): carrier: link connected
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.2813] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Sep 30 20:59:27 compute-1 NetworkManager[51724]: <info>  [1759265967.9166] manager: (ovn-1c612d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Sep 30 20:59:28 compute-1 sudo[95166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amwtwgaxqgbavufapwqevxvtltbxujpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265967.911126-1793-116951691638495/AnsiballZ_command.py'
Sep 30 20:59:28 compute-1 sudo[95166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:28 compute-1 python3.9[95168]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:59:28 compute-1 ovs-vsctl[95169]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Sep 30 20:59:28 compute-1 sudo[95166]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:30 compute-1 sudo[95319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arqavwgdfgkltmwszwurogtutsxtzgvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265968.7126768-1817-271260308715400/AnsiballZ_command.py'
Sep 30 20:59:30 compute-1 sudo[95319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:30 compute-1 python3.9[95321]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:59:30 compute-1 ovs-vsctl[95323]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Sep 30 20:59:30 compute-1 sudo[95319]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:31 compute-1 sudo[95474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdrrkqwxguavouwdhlxuxtoutssfczmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265970.850888-1859-155663439880852/AnsiballZ_command.py'
Sep 30 20:59:31 compute-1 sudo[95474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:31 compute-1 python3.9[95476]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:59:31 compute-1 ovs-vsctl[95477]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Sep 30 20:59:31 compute-1 sudo[95474]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:31 compute-1 sshd-session[84405]: Connection closed by 192.168.122.30 port 33284
Sep 30 20:59:31 compute-1 sshd-session[84402]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:59:31 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Sep 30 20:59:31 compute-1 systemd[1]: session-21.scope: Consumed 50.108s CPU time.
Sep 30 20:59:31 compute-1 systemd-logind[793]: Session 21 logged out. Waiting for processes to exit.
Sep 30 20:59:31 compute-1 systemd-logind[793]: Removed session 21.
Sep 30 20:59:36 compute-1 sshd-session[95502]: Accepted publickey for zuul from 192.168.122.30 port 45222 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:59:36 compute-1 systemd-logind[793]: New session 23 of user zuul.
Sep 30 20:59:36 compute-1 systemd[1]: Started Session 23 of User zuul.
Sep 30 20:59:36 compute-1 sshd-session[95502]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:59:37 compute-1 systemd[1]: Stopping User Manager for UID 0...
Sep 30 20:59:37 compute-1 systemd[94934]: Activating special unit Exit the Session...
Sep 30 20:59:37 compute-1 systemd[94934]: Stopped target Main User Target.
Sep 30 20:59:37 compute-1 systemd[94934]: Stopped target Basic System.
Sep 30 20:59:37 compute-1 systemd[94934]: Stopped target Paths.
Sep 30 20:59:37 compute-1 systemd[94934]: Stopped target Sockets.
Sep 30 20:59:37 compute-1 systemd[94934]: Stopped target Timers.
Sep 30 20:59:37 compute-1 systemd[94934]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 20:59:37 compute-1 systemd[94934]: Closed D-Bus User Message Bus Socket.
Sep 30 20:59:37 compute-1 systemd[94934]: Stopped Create User's Volatile Files and Directories.
Sep 30 20:59:37 compute-1 systemd[94934]: Removed slice User Application Slice.
Sep 30 20:59:37 compute-1 systemd[94934]: Reached target Shutdown.
Sep 30 20:59:37 compute-1 systemd[94934]: Finished Exit the Session.
Sep 30 20:59:37 compute-1 systemd[94934]: Reached target Exit the Session.
Sep 30 20:59:37 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 20:59:37 compute-1 systemd[1]: Stopped User Manager for UID 0.
Sep 30 20:59:37 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 20:59:37 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 20:59:37 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 20:59:37 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 20:59:37 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 20:59:37 compute-1 python3.9[95657]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:59:39 compute-1 sudo[95811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttxrrbcebfbddoggbymrztugwjkkeqhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265978.545826-68-275228582220706/AnsiballZ_file.py'
Sep 30 20:59:39 compute-1 sudo[95811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:39 compute-1 python3.9[95813]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:39 compute-1 sudo[95811]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:39 compute-1 sudo[95963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvqsygxomebjmviittszbdxjovlcsyjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265979.4142473-68-45609661324447/AnsiballZ_file.py'
Sep 30 20:59:39 compute-1 sudo[95963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:39 compute-1 python3.9[95965]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:39 compute-1 sudo[95963]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:40 compute-1 sudo[96115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnpptvpcydyrnxasvikummlihtngtdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265980.121398-68-152518801584897/AnsiballZ_file.py'
Sep 30 20:59:40 compute-1 sudo[96115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:40 compute-1 python3.9[96117]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:40 compute-1 sudo[96115]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:41 compute-1 sudo[96267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psintglhbdttbcfedbolqkoqdoikabup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265980.8948956-68-230195754931037/AnsiballZ_file.py'
Sep 30 20:59:41 compute-1 sudo[96267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:41 compute-1 python3.9[96269]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:41 compute-1 sudo[96267]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:41 compute-1 sudo[96419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emzrplzocjlrgkdzltezpjtqcswimliw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265981.5240707-68-196485248255122/AnsiballZ_file.py'
Sep 30 20:59:41 compute-1 sudo[96419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:42 compute-1 python3.9[96421]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:42 compute-1 sudo[96419]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:43 compute-1 python3.9[96571]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:59:43 compute-1 sudo[96721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvqkpkxestghzrnqbubnqqqqhjmlxuby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265983.5230312-200-187504292178124/AnsiballZ_seboolean.py'
Sep 30 20:59:44 compute-1 sudo[96721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:44 compute-1 python3.9[96723]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 20:59:44 compute-1 sudo[96721]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:45 compute-1 python3.9[96874]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:46 compute-1 python3.9[96995]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265985.1207397-224-180329256232186/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:47 compute-1 python3.9[97145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:47 compute-1 python3.9[97266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265986.7234116-269-278884662779944/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:48 compute-1 sudo[97416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gatwhvzupieejjeiiwbhxksshsxdkbyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265988.2122443-320-251935945479457/AnsiballZ_setup.py'
Sep 30 20:59:48 compute-1 sudo[97416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:48 compute-1 python3.9[97418]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:59:49 compute-1 sudo[97416]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:49 compute-1 sudo[97500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cveetvhwcutiwnejadkluegzbmfoqdmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265988.2122443-320-251935945479457/AnsiballZ_dnf.py'
Sep 30 20:59:49 compute-1 sudo[97500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:49 compute-1 python3.9[97502]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:59:50 compute-1 sudo[97500]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:51 compute-1 sudo[97653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pblivrqklqpmhoggvuesynpewepcovvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265991.2442298-356-206635804696966/AnsiballZ_systemd.py'
Sep 30 20:59:51 compute-1 sudo[97653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:52 compute-1 python3.9[97655]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 20:59:52 compute-1 sudo[97653]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:53 compute-1 python3.9[97808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:53 compute-1 python3.9[97929]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265992.5575492-380-125195000832422/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:54 compute-1 python3.9[98079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:54 compute-1 python3.9[98200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265993.7542949-380-203785791047459/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:56 compute-1 python3.9[98350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:57 compute-1 python3.9[98471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265996.106101-512-170809357853568/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:57 compute-1 ovn_controller[94902]: 2025-09-30T20:59:57Z|00025|memory|INFO|16128 kB peak resident set size after 30.0 seconds
Sep 30 20:59:57 compute-1 ovn_controller[94902]: 2025-09-30T20:59:57Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Sep 30 20:59:57 compute-1 podman[98472]: 2025-09-30 20:59:57.169178495 +0000 UTC m=+0.085358041 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Sep 30 20:59:57 compute-1 python3.9[98647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:58 compute-1 python3.9[98768]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265997.2055438-512-21856757303608/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:59 compute-1 python3.9[98918]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:59:59 compute-1 sudo[99070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upqcjkzsjwtoxadpkxnyhezdqkdqaccz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265999.6647785-626-189432863771245/AnsiballZ_file.py'
Sep 30 20:59:59 compute-1 sudo[99070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:00 compute-1 python3.9[99072]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:00 compute-1 sudo[99070]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:00 compute-1 sudo[99222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppfrgcmzoytfmqktnshvkwdvedyqdxuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266000.4981892-650-10575630975995/AnsiballZ_stat.py'
Sep 30 21:00:00 compute-1 sudo[99222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:00 compute-1 python3.9[99224]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:01 compute-1 sudo[99222]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:01 compute-1 sudo[99300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wybqtktmsoclwxzcgihrmgrudsmoqfoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266000.4981892-650-10575630975995/AnsiballZ_file.py'
Sep 30 21:00:01 compute-1 sudo[99300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:01 compute-1 python3.9[99302]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:01 compute-1 sudo[99300]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:01 compute-1 sudo[99452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzmhfolnhplfdqialllzlfuofzaruxrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266001.5764406-650-177124041060945/AnsiballZ_stat.py'
Sep 30 21:00:01 compute-1 sudo[99452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:02 compute-1 python3.9[99454]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:02 compute-1 sudo[99452]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:02 compute-1 sudo[99530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzyiialthetsyupwfqyufkvqsbhdkczr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266001.5764406-650-177124041060945/AnsiballZ_file.py'
Sep 30 21:00:02 compute-1 sudo[99530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:02 compute-1 python3.9[99532]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:02 compute-1 sudo[99530]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:03 compute-1 sudo[99682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aboukxccvwxruriypbetoipybizraemf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266003.158079-719-208730718000797/AnsiballZ_file.py'
Sep 30 21:00:03 compute-1 sudo[99682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:03 compute-1 python3.9[99684]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:03 compute-1 sudo[99682]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:04 compute-1 sudo[99834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oukjaumtwyhagmkvlbbigkipeqewzdha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266003.8669417-743-110324823911612/AnsiballZ_stat.py'
Sep 30 21:00:04 compute-1 sudo[99834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:04 compute-1 python3.9[99836]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:04 compute-1 sudo[99834]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:04 compute-1 sudo[99912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icoayhinjxnpaqnvnuhgzdwvinfyabhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266003.8669417-743-110324823911612/AnsiballZ_file.py'
Sep 30 21:00:04 compute-1 sudo[99912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:04 compute-1 python3.9[99914]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:04 compute-1 sudo[99912]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:05 compute-1 sudo[100064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahqahydodnugviwmxzrjghjkooegmlfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266005.1223-779-230458349043797/AnsiballZ_stat.py'
Sep 30 21:00:05 compute-1 sudo[100064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:05 compute-1 python3.9[100066]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:05 compute-1 sudo[100064]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:05 compute-1 sudo[100142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bryywaoalfvwjjyxxuhlwwjwyaqxiuba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266005.1223-779-230458349043797/AnsiballZ_file.py'
Sep 30 21:00:05 compute-1 sudo[100142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:06 compute-1 python3.9[100144]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:06 compute-1 sudo[100142]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:06 compute-1 sudo[100294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsdhqhtxvoibgedqibpygazxttlxxume ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266006.297157-815-129218744225377/AnsiballZ_systemd.py'
Sep 30 21:00:06 compute-1 sudo[100294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:06 compute-1 python3.9[100296]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:06 compute-1 systemd[1]: Reloading.
Sep 30 21:00:06 compute-1 systemd-rc-local-generator[100319]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:06 compute-1 systemd-sysv-generator[100323]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:07 compute-1 sudo[100294]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:07 compute-1 sudo[100483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nghijftmpzpcxetmjqgchbtsubapzccy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266007.7283955-839-194080563084113/AnsiballZ_stat.py'
Sep 30 21:00:07 compute-1 sudo[100483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:08 compute-1 python3.9[100485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:08 compute-1 sudo[100483]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:08 compute-1 sudo[100561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xolyfbnhmgesvptjxqahxkircgkwgojg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266007.7283955-839-194080563084113/AnsiballZ_file.py'
Sep 30 21:00:08 compute-1 sudo[100561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:08 compute-1 python3.9[100563]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:08 compute-1 sudo[100561]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:09 compute-1 sudo[100714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kziltfnvtlyswymeuxzaudrjiquaphmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266009.0573702-875-234709894353905/AnsiballZ_stat.py'
Sep 30 21:00:09 compute-1 sudo[100714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:09 compute-1 python3.9[100716]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:09 compute-1 sudo[100714]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:09 compute-1 sudo[100792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqwzdqduvhlujawoudzzlipngnrgtnvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266009.0573702-875-234709894353905/AnsiballZ_file.py'
Sep 30 21:00:09 compute-1 sudo[100792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:10 compute-1 python3.9[100794]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:10 compute-1 sudo[100792]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:10 compute-1 sudo[100944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghmarxzyfdlxxqquxddskcsjuuldzvlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266010.372821-911-142314828461013/AnsiballZ_systemd.py'
Sep 30 21:00:10 compute-1 sudo[100944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:11 compute-1 python3.9[100946]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:11 compute-1 systemd[1]: Reloading.
Sep 30 21:00:11 compute-1 systemd-sysv-generator[100973]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:11 compute-1 systemd-rc-local-generator[100968]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:12 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 21:00:12 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 21:00:12 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 21:00:12 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 21:00:12 compute-1 sudo[100944]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:13 compute-1 sudo[101137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prsjzirkhbgjguiricumijmnafwldqxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266012.812501-941-92474368011143/AnsiballZ_file.py'
Sep 30 21:00:13 compute-1 sudo[101137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:13 compute-1 python3.9[101139]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:13 compute-1 sudo[101137]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:13 compute-1 sudo[101289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmcxgbgwyzuvoeyselyoqwgnhzqaenco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266013.6091726-965-19111295886992/AnsiballZ_stat.py'
Sep 30 21:00:13 compute-1 sudo[101289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:14 compute-1 python3.9[101291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:14 compute-1 sudo[101289]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:14 compute-1 sudo[101412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smzsjuxzoemosucpaqlbghraredpedaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266013.6091726-965-19111295886992/AnsiballZ_copy.py'
Sep 30 21:00:14 compute-1 sudo[101412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:14 compute-1 python3.9[101414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266013.6091726-965-19111295886992/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:14 compute-1 sudo[101412]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:15 compute-1 sudo[101564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdwrmqydxdfptqatpfujkofodzpqxisw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266015.2381027-1016-271521365612964/AnsiballZ_file.py'
Sep 30 21:00:15 compute-1 sudo[101564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:15 compute-1 python3.9[101566]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:15 compute-1 sudo[101564]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:16 compute-1 sudo[101716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfyhdmnofyihfxmawkydlqjqsmmvkxqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266016.0293674-1040-78228472516460/AnsiballZ_stat.py'
Sep 30 21:00:16 compute-1 sudo[101716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:16 compute-1 python3.9[101718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:16 compute-1 sudo[101716]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:16 compute-1 sudo[101839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkbsvmyokiiokshggycejmoxxopwnxjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266016.0293674-1040-78228472516460/AnsiballZ_copy.py'
Sep 30 21:00:16 compute-1 sudo[101839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:17 compute-1 python3.9[101841]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266016.0293674-1040-78228472516460/.source.json _original_basename=.9fnj3gbq follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:17 compute-1 sudo[101839]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:17 compute-1 sudo[101991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjbsifgqpftvkfyzkewxifcwdpueyyif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266017.6248271-1085-96704288017009/AnsiballZ_file.py'
Sep 30 21:00:17 compute-1 sudo[101991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:18 compute-1 python3.9[101993]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:18 compute-1 sudo[101991]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:18 compute-1 sudo[102143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acuainscnlzksculhcwqfgpqqmzomsxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266018.4593678-1109-104750386147901/AnsiballZ_stat.py'
Sep 30 21:00:18 compute-1 sudo[102143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:19 compute-1 sudo[102143]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:19 compute-1 sudo[102266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roqvmvqxdkxzrrqhyodsgobttixzkeam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266018.4593678-1109-104750386147901/AnsiballZ_copy.py'
Sep 30 21:00:19 compute-1 sudo[102266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:19 compute-1 sudo[102266]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:20 compute-1 sudo[102418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvuegbfyapgnxmrtjiucgzabqulmxnar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266020.1902936-1160-147233144791759/AnsiballZ_container_config_data.py'
Sep 30 21:00:20 compute-1 sudo[102418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:20 compute-1 python3.9[102420]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Sep 30 21:00:20 compute-1 sudo[102418]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:21 compute-1 sudo[102570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqxddahviaipfbmvqxpcgqxdzgyozurq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266021.1651046-1187-10681850068186/AnsiballZ_container_config_hash.py'
Sep 30 21:00:21 compute-1 sudo[102570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:21 compute-1 python3.9[102572]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:00:21 compute-1 sudo[102570]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:22 compute-1 sudo[102722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uumzhzetmwtpfnnnxreucnpsvvfyadex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266022.1426861-1214-84893769100316/AnsiballZ_podman_container_info.py'
Sep 30 21:00:22 compute-1 sudo[102722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:22 compute-1 python3.9[102724]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 21:00:23 compute-1 sudo[102722]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:24 compute-1 sudo[102901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzykeaeisbuifhhmiqwjxbttozyymuig ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266023.796802-1253-70445306886865/AnsiballZ_edpm_container_manage.py'
Sep 30 21:00:24 compute-1 sudo[102901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:24 compute-1 python3[102903]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:00:29 compute-1 podman[102961]: 2025-09-30 21:00:29.961727854 +0000 UTC m=+1.800199681 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:00:30 compute-1 podman[102917]: 2025-09-30 21:00:30.992356223 +0000 UTC m=+6.239567509 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:00:31 compute-1 podman[103042]: 2025-09-30 21:00:31.16987131 +0000 UTC m=+0.045138894 container create ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible)
Sep 30 21:00:31 compute-1 podman[103042]: 2025-09-30 21:00:31.144353804 +0000 UTC m=+0.019621408 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:00:31 compute-1 python3[102903]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:00:31 compute-1 sudo[102901]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:32 compute-1 sudo[103230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcqohtcnursipaypvekxrdibycacvial ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266032.5414329-1277-78937809903763/AnsiballZ_stat.py'
Sep 30 21:00:32 compute-1 sudo[103230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:33 compute-1 python3.9[103232]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:00:33 compute-1 sudo[103230]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:33 compute-1 sudo[103384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofmbklyowbqzqunzfeueoogppaeepaci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266033.356798-1304-189906865038323/AnsiballZ_file.py'
Sep 30 21:00:33 compute-1 sudo[103384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:33 compute-1 python3.9[103386]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:33 compute-1 sudo[103384]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:34 compute-1 sudo[103460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbagivvecjcovurexdwcebhrmutomxpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266033.356798-1304-189906865038323/AnsiballZ_stat.py'
Sep 30 21:00:34 compute-1 sudo[103460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:34 compute-1 python3.9[103462]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:00:34 compute-1 sudo[103460]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:34 compute-1 sudo[103611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzmpifvcchbpamhywkmoomoleqgndiua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266034.263308-1304-250543576879352/AnsiballZ_copy.py'
Sep 30 21:00:34 compute-1 sudo[103611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:34 compute-1 python3.9[103613]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266034.263308-1304-250543576879352/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:34 compute-1 sudo[103611]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:35 compute-1 sudo[103687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqbdlxyirmjxnvxrbbjbfecdnfuzteds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266034.263308-1304-250543576879352/AnsiballZ_systemd.py'
Sep 30 21:00:35 compute-1 sudo[103687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:35 compute-1 python3.9[103689]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:00:35 compute-1 systemd[1]: Reloading.
Sep 30 21:00:35 compute-1 systemd-rc-local-generator[103716]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:35 compute-1 systemd-sysv-generator[103721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:35 compute-1 sudo[103687]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:36 compute-1 sudo[103798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaatqbaqdlnppwdfqwrnbgllwxbdqpnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266034.263308-1304-250543576879352/AnsiballZ_systemd.py'
Sep 30 21:00:36 compute-1 sudo[103798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:36 compute-1 python3.9[103800]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:36 compute-1 systemd[1]: Reloading.
Sep 30 21:00:36 compute-1 systemd-rc-local-generator[103826]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:36 compute-1 systemd-sysv-generator[103833]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:36 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Sep 30 21:00:36 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:00:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f85be88ce11ac137d5cd13a4d29a548fd3b6e547e114cf873523122027a59d2d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Sep 30 21:00:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f85be88ce11ac137d5cd13a4d29a548fd3b6e547e114cf873523122027a59d2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:00:36 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89.
Sep 30 21:00:36 compute-1 podman[103841]: 2025-09-30 21:00:36.789663354 +0000 UTC m=+0.134052008 container init ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + sudo -E kolla_set_configs
Sep 30 21:00:36 compute-1 podman[103841]: 2025-09-30 21:00:36.820806952 +0000 UTC m=+0.165195616 container start ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:00:36 compute-1 edpm-start-podman-container[103841]: ovn_metadata_agent
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Validating config file
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Copying service configuration files
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Writing out command to execute
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /var/lib/neutron
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /var/lib/neutron/external
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: ++ cat /run_command
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + CMD=neutron-ovn-metadata-agent
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + ARGS=
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + sudo kolla_copy_cacerts
Sep 30 21:00:36 compute-1 edpm-start-podman-container[103840]: Creating additional drop-in dependency for "ovn_metadata_agent" (ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89)
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + [[ ! -n '' ]]
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + . kolla_extend_start
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: Running command: 'neutron-ovn-metadata-agent'
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + umask 0022
Sep 30 21:00:36 compute-1 ovn_metadata_agent[103856]: + exec neutron-ovn-metadata-agent
Sep 30 21:00:36 compute-1 systemd[1]: Reloading.
Sep 30 21:00:36 compute-1 podman[103863]: 2025-09-30 21:00:36.92548938 +0000 UTC m=+0.094653306 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:00:37 compute-1 systemd-rc-local-generator[103933]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:37 compute-1 systemd-sysv-generator[103936]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:37 compute-1 systemd[1]: Started ovn_metadata_agent container.
Sep 30 21:00:37 compute-1 sudo[103798]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:38 compute-1 sshd-session[95505]: Connection closed by 192.168.122.30 port 45222
Sep 30 21:00:38 compute-1 sshd-session[95502]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:00:38 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Sep 30 21:00:38 compute-1 systemd[1]: session-23.scope: Consumed 48.634s CPU time.
Sep 30 21:00:38 compute-1 systemd-logind[793]: Session 23 logged out. Waiting for processes to exit.
Sep 30 21:00:38 compute-1 systemd-logind[793]: Removed session 23.
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.623 103861 INFO neutron.common.config [-] Logging enabled!
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.623 103861 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.623 103861 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.624 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.624 103861 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.624 103861 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.624 103861 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.624 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.624 103861 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.625 103861 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.626 103861 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.627 103861 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.628 103861 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.629 103861 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.630 103861 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.631 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.632 103861 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.633 103861 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.634 103861 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.635 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.636 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.637 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.638 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.639 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.640 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.641 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.642 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.643 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.644 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.645 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.646 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.647 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.648 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.649 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.650 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.651 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.652 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.652 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.652 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.652 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.652 103861 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.652 103861 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.652 103861 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.653 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.654 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.655 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.656 103861 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.667 103861 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.667 103861 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.667 103861 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.667 103861 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.668 103861 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.682 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 78438f8f-1ac2-4393-90b7-0b62e0665947 (UUID: 78438f8f-1ac2-4393-90b7-0b62e0665947) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.707 103861 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.707 103861 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.708 103861 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.708 103861 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.711 103861 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.718 103861 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.729 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '78438f8f-1ac2-4393-90b7-0b62e0665947'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], external_ids={}, name=78438f8f-1ac2-4393-90b7-0b62e0665947, nb_cfg_timestamp=1759265975198, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.730 103861 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f966f0e1bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.731 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.731 103861 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.731 103861 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.731 103861 INFO oslo_service.service [-] Starting 1 workers
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.735 103861 DEBUG oslo_service.service [-] Started child 103970 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.738 103861 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpp97rw1eu/privsep.sock']
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.740 103970 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-242093'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.783 103970 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.784 103970 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.784 103970 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.788 103970 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.800 103970 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 21:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:38.812 103970 INFO eventlet.wsgi.server [-] (103970) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Sep 30 21:00:39 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.383 103861 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.384 103861 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpp97rw1eu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.278 103975 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.285 103975 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.289 103975 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.289 103975 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103975
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.388 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[f042469b-a8e6-41a6-a37f-39d00655d85c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.925 103975 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.925 103975 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:00:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:39.925 103975 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.462 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[55627327-2383-4813-ab5c-0740e49f32c7]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.464 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, column=external_ids, values=({'neutron:ovn-metadata-id': 'd7be5214-a0a5-5c53-996e-23931480b225'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.473 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.479 103861 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.480 103861 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.481 103861 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.482 103861 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.483 103861 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.484 103861 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.485 103861 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.486 103861 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.487 103861 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.488 103861 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.489 103861 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.490 103861 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.491 103861 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.492 103861 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.493 103861 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.494 103861 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.495 103861 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.496 103861 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.497 103861 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.498 103861 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.499 103861 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.500 103861 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.501 103861 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.502 103861 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.503 103861 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.504 103861 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.505 103861 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.506 103861 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.507 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.508 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.509 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.510 103861 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.511 103861 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.511 103861 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:00:40.511 103861 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:00:43 compute-1 sshd-session[103980]: Accepted publickey for zuul from 192.168.122.30 port 55814 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:00:43 compute-1 systemd-logind[793]: New session 24 of user zuul.
Sep 30 21:00:43 compute-1 systemd[1]: Started Session 24 of User zuul.
Sep 30 21:00:43 compute-1 sshd-session[103980]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:00:44 compute-1 python3.9[104133]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:00:45 compute-1 sudo[104287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkryerzbmbqiukipqrlxcjfxwtiiylki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266045.410143-68-259486748835394/AnsiballZ_command.py'
Sep 30 21:00:45 compute-1 sudo[104287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:46 compute-1 python3.9[104289]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:00:46 compute-1 sudo[104287]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:47 compute-1 sudo[104452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oubndhnlepwnhqpocltenyxdtakkptux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266046.723207-101-165631181989594/AnsiballZ_systemd_service.py'
Sep 30 21:00:47 compute-1 sudo[104452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:47 compute-1 python3.9[104454]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:00:47 compute-1 systemd[1]: Reloading.
Sep 30 21:00:47 compute-1 systemd-rc-local-generator[104477]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:47 compute-1 systemd-sysv-generator[104483]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:47 compute-1 sudo[104452]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:48 compute-1 python3.9[104639]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:00:49 compute-1 network[104656]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:00:49 compute-1 network[104657]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:00:49 compute-1 network[104658]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:00:57 compute-1 sudo[104920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aemupkzidkujxzakxfbbpdrrzwmcphyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266057.51976-158-159881063381366/AnsiballZ_systemd_service.py'
Sep 30 21:00:57 compute-1 sudo[104920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:58 compute-1 python3.9[104922]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:58 compute-1 sudo[104920]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:58 compute-1 sudo[105073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cntgjljpwgnpkmwsqfaydpopcbfzllkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266058.41538-158-242708261198981/AnsiballZ_systemd_service.py'
Sep 30 21:00:58 compute-1 sudo[105073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:59 compute-1 python3.9[105075]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:59 compute-1 sudo[105073]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:59 compute-1 sudo[105226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfdctskknknytcvqhzzexmayzymejill ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266059.2920427-158-13200911986110/AnsiballZ_systemd_service.py'
Sep 30 21:00:59 compute-1 sudo[105226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:59 compute-1 python3.9[105228]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:59 compute-1 sudo[105226]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:00 compute-1 sudo[105379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aigfygnqlboxsiuijjwzsmvvvdsmnyem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266060.104158-158-19187507212207/AnsiballZ_systemd_service.py'
Sep 30 21:01:00 compute-1 sudo[105379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:00 compute-1 python3.9[105381]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:01 compute-1 CROND[105410]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 21:01:01 compute-1 run-parts[105413]: (/etc/cron.hourly) starting 0anacron
Sep 30 21:01:01 compute-1 podman[105383]: 2025-09-30 21:01:01.032782565 +0000 UTC m=+0.176648868 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2)
Sep 30 21:01:01 compute-1 anacron[105422]: Anacron started on 2025-09-30
Sep 30 21:01:01 compute-1 anacron[105422]: Will run job `cron.daily' in 24 min.
Sep 30 21:01:01 compute-1 anacron[105422]: Will run job `cron.weekly' in 44 min.
Sep 30 21:01:01 compute-1 anacron[105422]: Will run job `cron.monthly' in 64 min.
Sep 30 21:01:01 compute-1 anacron[105422]: Jobs will be executed sequentially
Sep 30 21:01:01 compute-1 run-parts[105424]: (/etc/cron.hourly) finished 0anacron
Sep 30 21:01:01 compute-1 CROND[105406]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 21:01:01 compute-1 sudo[105379]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:02 compute-1 sudo[105575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syjwgohpynuavjxoyciimmvqpkhchfmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266062.0202065-158-240909573061674/AnsiballZ_systemd_service.py'
Sep 30 21:01:02 compute-1 sudo[105575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:02 compute-1 python3.9[105577]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:02 compute-1 sudo[105575]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:03 compute-1 sudo[105728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhrdnwgbfvzucrpdpreoalrwhgmbxgtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266062.9927616-158-5831124361258/AnsiballZ_systemd_service.py'
Sep 30 21:01:03 compute-1 sudo[105728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:03 compute-1 python3.9[105730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:03 compute-1 sudo[105728]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:04 compute-1 sudo[105881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quysqgmienlxdevzftnunlhgzwlgqenx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266063.8632355-158-38705066575897/AnsiballZ_systemd_service.py'
Sep 30 21:01:04 compute-1 sudo[105881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:04 compute-1 python3.9[105883]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:04 compute-1 sudo[105881]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:05 compute-1 sudo[106034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhcysakcohseebelzbjsuahzyxhhcrdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266065.0150352-314-229844042979414/AnsiballZ_file.py'
Sep 30 21:01:05 compute-1 sudo[106034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:05 compute-1 python3.9[106036]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:05 compute-1 sudo[106034]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:06 compute-1 sudo[106186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clyyfubpugssfbfexjxxblokiujmobsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266065.7866778-314-59723634677336/AnsiballZ_file.py'
Sep 30 21:01:06 compute-1 sudo[106186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:06 compute-1 python3.9[106188]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:06 compute-1 sudo[106186]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:06 compute-1 sudo[106338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zopxpfxfupwsorlupvoryyctnlsxikiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266066.5692809-314-31793687196140/AnsiballZ_file.py'
Sep 30 21:01:06 compute-1 sudo[106338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:07 compute-1 python3.9[106340]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:07 compute-1 sudo[106338]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:07 compute-1 podman[106464]: 2025-09-30 21:01:07.421032831 +0000 UTC m=+0.046313163 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:01:07 compute-1 sudo[106507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxstpriezjzylijmcifglddvtvdabmyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266067.1864662-314-203029507963696/AnsiballZ_file.py'
Sep 30 21:01:07 compute-1 sudo[106507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:07 compute-1 python3.9[106511]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:07 compute-1 sudo[106507]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:08 compute-1 sudo[106662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxndkaouxgdxrzgdevvhzioqlzznvaaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266067.8040137-314-164562137907976/AnsiballZ_file.py'
Sep 30 21:01:08 compute-1 sudo[106662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:08 compute-1 python3.9[106664]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:08 compute-1 sudo[106662]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:08 compute-1 sudo[106814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nevolqhlfleztqidzykwogogtqqgdadd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266068.3965511-314-254643817053597/AnsiballZ_file.py'
Sep 30 21:01:08 compute-1 sudo[106814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:08 compute-1 python3.9[106816]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:08 compute-1 sudo[106814]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:09 compute-1 sudo[106966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-himgrkzeqjfqghgjljiyrfivpevdkthh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266068.9471188-314-89742261753563/AnsiballZ_file.py'
Sep 30 21:01:09 compute-1 sudo[106966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:09 compute-1 python3.9[106968]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:09 compute-1 sudo[106966]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:10 compute-1 sudo[107118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joujluvbiereydydxnktxhtmuezmjdms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266070.233093-464-54026860673217/AnsiballZ_file.py'
Sep 30 21:01:10 compute-1 sudo[107118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:10 compute-1 python3.9[107120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:10 compute-1 sudo[107118]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:11 compute-1 sudo[107270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcvradzhmzwvxqkwqfdffcnqgngqqjnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266070.8055265-464-119658319066021/AnsiballZ_file.py'
Sep 30 21:01:11 compute-1 sudo[107270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:11 compute-1 python3.9[107272]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:11 compute-1 sudo[107270]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:11 compute-1 sudo[107422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nypdbafcithomoweassvcgfupraxebwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266071.4412982-464-189072606048311/AnsiballZ_file.py'
Sep 30 21:01:11 compute-1 sudo[107422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:11 compute-1 python3.9[107424]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:11 compute-1 sudo[107422]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:12 compute-1 sudo[107574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jguebseiichmjoedtrwlgbvelehuxwya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266072.1056948-464-144146205109519/AnsiballZ_file.py'
Sep 30 21:01:12 compute-1 sudo[107574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:12 compute-1 python3.9[107576]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:12 compute-1 sudo[107574]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:13 compute-1 sudo[107726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsrrtrhesuabmebtpdfhgczmnbqxmdoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266072.787714-464-190317653000764/AnsiballZ_file.py'
Sep 30 21:01:13 compute-1 sudo[107726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:13 compute-1 python3.9[107728]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:13 compute-1 sudo[107726]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:13 compute-1 sudo[107878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpdrtsgdbzedhkfcmwnlsvsmsrcvjxkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266073.464561-464-45819248870851/AnsiballZ_file.py'
Sep 30 21:01:13 compute-1 sudo[107878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:13 compute-1 python3.9[107880]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:14 compute-1 sudo[107878]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:14 compute-1 sudo[108030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccilyznhylbbxaiiapzzhxjyaezvclay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266074.173186-464-157892303568606/AnsiballZ_file.py'
Sep 30 21:01:14 compute-1 sudo[108030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:14 compute-1 python3.9[108032]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:14 compute-1 sudo[108030]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:15 compute-1 sudo[108182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uacgzlszlhkevtsjfmpdxefcrbzgwica ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266075.4618895-617-64166207461440/AnsiballZ_command.py'
Sep 30 21:01:15 compute-1 sudo[108182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:15 compute-1 python3.9[108184]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:15 compute-1 sudo[108182]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:16 compute-1 python3.9[108336]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:01:17 compute-1 sudo[108486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctleamxeiqkhoslsrffklmmscrxdpayy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266077.2747242-671-211034990140078/AnsiballZ_systemd_service.py'
Sep 30 21:01:17 compute-1 sudo[108486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:17 compute-1 python3.9[108488]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:01:17 compute-1 systemd[1]: Reloading.
Sep 30 21:01:17 compute-1 systemd-sysv-generator[108520]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:01:17 compute-1 systemd-rc-local-generator[108517]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:01:18 compute-1 sudo[108486]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:18 compute-1 sudo[108674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncyqvskltqsnynipvvacasaxrsgjendd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266078.3620281-695-270166557225/AnsiballZ_command.py'
Sep 30 21:01:18 compute-1 sudo[108674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:18 compute-1 python3.9[108676]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:18 compute-1 sudo[108674]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:19 compute-1 sudo[108827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iywjptrigiptqwnowtqklkptmgunxbzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266079.0782523-695-190980712631709/AnsiballZ_command.py'
Sep 30 21:01:19 compute-1 sudo[108827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:19 compute-1 python3.9[108829]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:19 compute-1 sudo[108827]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:20 compute-1 sudo[108980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ackcijplmgrhluibhmvgevkxegoliwst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266079.8165746-695-164354265321558/AnsiballZ_command.py'
Sep 30 21:01:20 compute-1 sudo[108980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:20 compute-1 python3.9[108982]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:20 compute-1 sudo[108980]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:20 compute-1 sudo[109133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjtshxefyhgucrzxajbkdmjxhimuevnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266080.518084-695-141343919271561/AnsiballZ_command.py'
Sep 30 21:01:20 compute-1 sudo[109133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:21 compute-1 python3.9[109135]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:21 compute-1 sudo[109133]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:21 compute-1 sudo[109286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woeuwjrplbkiyyeggzhgtfqyudwbtogr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266081.3054662-695-205110966572213/AnsiballZ_command.py'
Sep 30 21:01:21 compute-1 sudo[109286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:21 compute-1 python3.9[109288]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:21 compute-1 sudo[109286]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:22 compute-1 sudo[109439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yogadupojrikvonvsgypukeqphhpcykv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266081.906217-695-259580491885298/AnsiballZ_command.py'
Sep 30 21:01:22 compute-1 sudo[109439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:22 compute-1 python3.9[109441]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:22 compute-1 sudo[109439]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:23 compute-1 sudo[109592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjsgjsvkrcrgomhevlyoxhbizziqhpom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266082.6325738-695-20435818646777/AnsiballZ_command.py'
Sep 30 21:01:23 compute-1 sudo[109592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:23 compute-1 python3.9[109594]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:23 compute-1 sudo[109592]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:25 compute-1 sudo[109745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azjnmnscrfrzxbkmenwieurqqlbzdprt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266084.7363274-857-8531444391668/AnsiballZ_getent.py'
Sep 30 21:01:25 compute-1 sudo[109745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:25 compute-1 python3.9[109747]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Sep 30 21:01:25 compute-1 sudo[109745]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:26 compute-1 sudo[109898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfvvrtiajyswsmfprgyivusapuhdxgnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266085.6556423-881-238548016423596/AnsiballZ_group.py'
Sep 30 21:01:26 compute-1 sudo[109898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:26 compute-1 python3.9[109900]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 21:01:26 compute-1 groupadd[109901]: group added to /etc/group: name=libvirt, GID=42473
Sep 30 21:01:26 compute-1 groupadd[109901]: group added to /etc/gshadow: name=libvirt
Sep 30 21:01:26 compute-1 groupadd[109901]: new group: name=libvirt, GID=42473
Sep 30 21:01:26 compute-1 sudo[109898]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:27 compute-1 sudo[110056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uetzlnzanicpmulxwzjugatdcnklkzyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266086.7996516-905-200030399815097/AnsiballZ_user.py'
Sep 30 21:01:27 compute-1 sudo[110056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:27 compute-1 python3.9[110058]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 21:01:27 compute-1 useradd[110060]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 21:01:27 compute-1 sudo[110056]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:28 compute-1 sudo[110216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fspvqhyxgmhcuilvkraqhqyzvzfjzmwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266088.091292-938-46715724327626/AnsiballZ_setup.py'
Sep 30 21:01:28 compute-1 sudo[110216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:28 compute-1 python3.9[110218]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 21:01:29 compute-1 sudo[110216]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:29 compute-1 sudo[110300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atcekhdbbcwwuponrishhvkoduoppial ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266088.091292-938-46715724327626/AnsiballZ_dnf.py'
Sep 30 21:01:29 compute-1 sudo[110300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:29 compute-1 python3.9[110302]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 21:01:31 compute-1 podman[110306]: 2025-09-30 21:01:31.282478843 +0000 UTC m=+0.112380505 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:01:38 compute-1 podman[110383]: 2025-09-30 21:01:38.221108717 +0000 UTC m=+0.062678426 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:01:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:01:38.658 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:01:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:01:38.659 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:01:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:01:38.659 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:01:55 compute-1 kernel: SELinux:  Converting 2752 SID table entries...
Sep 30 21:01:55 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 21:01:55 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 21:01:55 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 21:01:55 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 21:01:55 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 21:01:55 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 21:01:55 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 21:02:02 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Sep 30 21:02:02 compute-1 podman[110546]: 2025-09-30 21:02:02.3211553 +0000 UTC m=+0.136203165 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:02:05 compute-1 kernel: SELinux:  Converting 2752 SID table entries...
Sep 30 21:02:05 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 21:02:05 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 21:02:05 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 21:02:05 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 21:02:05 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 21:02:05 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 21:02:05 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 21:02:09 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Sep 30 21:02:09 compute-1 podman[110580]: 2025-09-30 21:02:09.243730368 +0000 UTC m=+0.075925947 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 21:02:33 compute-1 podman[120934]: 2025-09-30 21:02:33.232900755 +0000 UTC m=+0.080939845 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 21:02:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:02:38.659 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:02:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:02:38.661 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:02:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:02:38.662 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:02:40 compute-1 podman[124873]: 2025-09-30 21:02:40.263055132 +0000 UTC m=+0.098843798 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, 
org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:02:58 compute-1 kernel: SELinux:  Converting 2753 SID table entries...
Sep 30 21:02:58 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 21:02:58 compute-1 kernel: SELinux:  policy capability open_perms=1
Sep 30 21:02:58 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 21:02:58 compute-1 kernel: SELinux:  policy capability always_check_network=0
Sep 30 21:02:58 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 21:02:58 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 21:02:58 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 21:03:01 compute-1 groupadd[127407]: group added to /etc/group: name=dnsmasq, GID=992
Sep 30 21:03:01 compute-1 groupadd[127407]: group added to /etc/gshadow: name=dnsmasq
Sep 30 21:03:01 compute-1 groupadd[127407]: new group: name=dnsmasq, GID=992
Sep 30 21:03:01 compute-1 useradd[127414]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Sep 30 21:03:01 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 21:03:01 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Sep 30 21:03:01 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 21:03:01 compute-1 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Sep 30 21:03:02 compute-1 groupadd[127427]: group added to /etc/group: name=clevis, GID=991
Sep 30 21:03:02 compute-1 groupadd[127427]: group added to /etc/gshadow: name=clevis
Sep 30 21:03:02 compute-1 groupadd[127427]: new group: name=clevis, GID=991
Sep 30 21:03:02 compute-1 useradd[127434]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Sep 30 21:03:02 compute-1 usermod[127444]: add 'clevis' to group 'tss'
Sep 30 21:03:02 compute-1 usermod[127444]: add 'clevis' to shadow group 'tss'
Sep 30 21:03:03 compute-1 podman[127454]: 2025-09-30 21:03:03.436098554 +0000 UTC m=+0.116030459 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Sep 30 21:03:04 compute-1 polkitd[6517]: Reloading rules
Sep 30 21:03:04 compute-1 polkitd[6517]: Collecting garbage unconditionally...
Sep 30 21:03:04 compute-1 polkitd[6517]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 21:03:04 compute-1 polkitd[6517]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 21:03:04 compute-1 polkitd[6517]: Finished loading, compiling and executing 4 rules
Sep 30 21:03:04 compute-1 polkitd[6517]: Reloading rules
Sep 30 21:03:04 compute-1 polkitd[6517]: Collecting garbage unconditionally...
Sep 30 21:03:04 compute-1 polkitd[6517]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 21:03:04 compute-1 polkitd[6517]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 21:03:04 compute-1 polkitd[6517]: Finished loading, compiling and executing 4 rules
Sep 30 21:03:06 compute-1 groupadd[127657]: group added to /etc/group: name=ceph, GID=167
Sep 30 21:03:06 compute-1 groupadd[127657]: group added to /etc/gshadow: name=ceph
Sep 30 21:03:06 compute-1 groupadd[127657]: new group: name=ceph, GID=167
Sep 30 21:03:06 compute-1 useradd[127663]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Sep 30 21:03:09 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Sep 30 21:03:09 compute-1 sshd[1006]: Received signal 15; terminating.
Sep 30 21:03:09 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Sep 30 21:03:09 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Sep 30 21:03:09 compute-1 systemd[1]: sshd.service: Consumed 10.978s CPU time, read 532.0K from disk, written 200.0K to disk.
Sep 30 21:03:09 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Sep 30 21:03:09 compute-1 systemd[1]: Stopping sshd-keygen.target...
Sep 30 21:03:09 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 21:03:09 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 21:03:09 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 21:03:09 compute-1 systemd[1]: Reached target sshd-keygen.target.
Sep 30 21:03:09 compute-1 systemd[1]: Starting OpenSSH server daemon...
Sep 30 21:03:09 compute-1 sshd[128182]: Server listening on 0.0.0.0 port 22.
Sep 30 21:03:09 compute-1 sshd[128182]: Server listening on :: port 22.
Sep 30 21:03:09 compute-1 systemd[1]: Started OpenSSH server daemon.
Sep 30 21:03:10 compute-1 podman[128287]: 2025-09-30 21:03:10.405056663 +0000 UTC m=+0.077546150 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 21:03:11 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 21:03:11 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 21:03:11 compute-1 systemd[1]: Reloading.
Sep 30 21:03:11 compute-1 systemd-rc-local-generator[128456]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:11 compute-1 systemd-sysv-generator[128459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:11 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 21:03:13 compute-1 systemd[1]: Starting PackageKit Daemon...
Sep 30 21:03:13 compute-1 PackageKit[130706]: daemon start
Sep 30 21:03:13 compute-1 systemd[1]: Started PackageKit Daemon.
Sep 30 21:03:13 compute-1 sudo[110300]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:20 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 21:03:20 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 21:03:20 compute-1 systemd[1]: man-db-cache-update.service: Consumed 11.439s CPU time.
Sep 30 21:03:20 compute-1 systemd[1]: run-r7d5cf91c3cc54fedaa87fb6358766a10.service: Deactivated successfully.
Sep 30 21:03:21 compute-1 sshd-session[134783]: banner exchange: Connection from 123.56.157.254 port 54049: invalid format
Sep 30 21:03:23 compute-1 sudo[136854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeladakjamgympgvjudjlbeiqbbpzbbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266202.5689023-974-226826646740481/AnsiballZ_systemd.py'
Sep 30 21:03:23 compute-1 sudo[136854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:23 compute-1 python3.9[136856]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:24 compute-1 systemd[1]: Reloading.
Sep 30 21:03:24 compute-1 systemd-rc-local-generator[136884]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:24 compute-1 systemd-sysv-generator[136888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:24 compute-1 sudo[136854]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:25 compute-1 sudo[137044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjvcfasmiavpwzkrccpelsayffzbrnyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266204.9060843-974-120237012544690/AnsiballZ_systemd.py'
Sep 30 21:03:25 compute-1 sudo[137044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:25 compute-1 python3.9[137046]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:26 compute-1 systemd[1]: Reloading.
Sep 30 21:03:26 compute-1 systemd-rc-local-generator[137077]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:26 compute-1 systemd-sysv-generator[137082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:27 compute-1 sudo[137044]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:27 compute-1 sudo[137234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsydxzvkwexgfuoifwiggeekfwkcjaxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266207.1596549-974-94362171273058/AnsiballZ_systemd.py'
Sep 30 21:03:27 compute-1 sudo[137234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:27 compute-1 python3.9[137236]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:27 compute-1 systemd[1]: Reloading.
Sep 30 21:03:28 compute-1 systemd-sysv-generator[137270]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:28 compute-1 systemd-rc-local-generator[137263]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:28 compute-1 sudo[137234]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:28 compute-1 sudo[137425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvnwfupdtbxwfllabvqnwjowdozdkoqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266208.369197-974-243959440629294/AnsiballZ_systemd.py'
Sep 30 21:03:28 compute-1 sudo[137425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:29 compute-1 python3.9[137427]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:29 compute-1 systemd[1]: Reloading.
Sep 30 21:03:29 compute-1 systemd-rc-local-generator[137457]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:29 compute-1 systemd-sysv-generator[137460]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:29 compute-1 sudo[137425]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:30 compute-1 sudo[137615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rduzcgdalxwizbygcnqhtzsypinivall ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266209.7362716-1061-52529406758968/AnsiballZ_systemd.py'
Sep 30 21:03:30 compute-1 sudo[137615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:30 compute-1 python3.9[137617]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:30 compute-1 systemd[1]: Reloading.
Sep 30 21:03:30 compute-1 systemd-sysv-generator[137651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:30 compute-1 systemd-rc-local-generator[137647]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:30 compute-1 sudo[137615]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:31 compute-1 sudo[137805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytpiicxanxxjpdwursjpudtwhzrqdxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266210.9776833-1061-2465335115546/AnsiballZ_systemd.py'
Sep 30 21:03:31 compute-1 sudo[137805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:31 compute-1 python3.9[137807]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:31 compute-1 systemd[1]: Reloading.
Sep 30 21:03:31 compute-1 systemd-rc-local-generator[137836]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:31 compute-1 systemd-sysv-generator[137840]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:31 compute-1 sudo[137805]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:32 compute-1 sudo[137995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szeivblkrrtcsdozecrcstedmdgfycjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266212.025835-1061-130443144380722/AnsiballZ_systemd.py'
Sep 30 21:03:32 compute-1 sudo[137995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:32 compute-1 python3.9[137997]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:33 compute-1 systemd[1]: Reloading.
Sep 30 21:03:33 compute-1 podman[138000]: 2025-09-30 21:03:33.835459457 +0000 UTC m=+0.106889599 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:03:33 compute-1 systemd-rc-local-generator[138057]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:33 compute-1 systemd-sysv-generator[138062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:34 compute-1 sudo[137995]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:34 compute-1 sudo[138213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxvwljeurfeltmzmwonkoeizdbtorbme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266214.2412963-1061-217238333190584/AnsiballZ_systemd.py'
Sep 30 21:03:34 compute-1 sudo[138213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:34 compute-1 python3.9[138215]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:34 compute-1 sudo[138213]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:35 compute-1 sudo[138368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phetashzpctgxihyrbrbjepphublpqrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266215.1198256-1061-108238562115063/AnsiballZ_systemd.py'
Sep 30 21:03:35 compute-1 sudo[138368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:35 compute-1 python3.9[138370]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:35 compute-1 systemd[1]: Reloading.
Sep 30 21:03:35 compute-1 systemd-rc-local-generator[138397]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:35 compute-1 systemd-sysv-generator[138402]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:36 compute-1 sudo[138368]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:36 compute-1 sudo[138559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnkhalhfgsfyqirbtqxigerlhoxwwlgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266216.2823195-1169-14755941615230/AnsiballZ_systemd.py'
Sep 30 21:03:36 compute-1 sudo[138559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:36 compute-1 python3.9[138561]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:36 compute-1 systemd[1]: Reloading.
Sep 30 21:03:36 compute-1 systemd-sysv-generator[138593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:36 compute-1 systemd-rc-local-generator[138589]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:37 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Sep 30 21:03:37 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Sep 30 21:03:37 compute-1 sudo[138559]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:37 compute-1 sudo[138751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypvxdrtipmrbkmfwetkvqwiauqwzgaru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266217.616903-1193-98247134715064/AnsiballZ_systemd.py'
Sep 30 21:03:37 compute-1 sudo[138751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:38 compute-1 python3.9[138753]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:38 compute-1 sudo[138751]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:03:38.660 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:03:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:03:38.661 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:03:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:03:38.661 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:03:38 compute-1 sudo[138906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iazuycodbxsvjwwjjddooixvnqrfukqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266218.5272386-1193-85023268855135/AnsiballZ_systemd.py'
Sep 30 21:03:38 compute-1 sudo[138906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:39 compute-1 python3.9[138908]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:39 compute-1 sudo[138906]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:39 compute-1 sudo[139061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhpgkzkiwgeootvsnibkarqzfofhhsuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266219.4072816-1193-115261751557433/AnsiballZ_systemd.py'
Sep 30 21:03:39 compute-1 sudo[139061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:39 compute-1 python3.9[139063]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:40 compute-1 sudo[139061]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:40 compute-1 sudo[139227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxadqkjimxcxgmvnmjhkxqyoicocflhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266220.2306976-1193-268200309331027/AnsiballZ_systemd.py'
Sep 30 21:03:40 compute-1 sudo[139227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:40 compute-1 podman[139190]: 2025-09-30 21:03:40.614954142 +0000 UTC m=+0.072053201 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:03:40 compute-1 python3.9[139235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:40 compute-1 sudo[139227]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:41 compute-1 sudo[139389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwozosbqgefbloffgpmtcddhkobphghu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266221.148247-1193-98348648904299/AnsiballZ_systemd.py'
Sep 30 21:03:41 compute-1 sudo[139389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:41 compute-1 python3.9[139391]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:41 compute-1 sudo[139389]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:42 compute-1 sudo[139544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihqsaztewaxtfvgfslytrzssqopmpdbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266221.963981-1193-26709735185015/AnsiballZ_systemd.py'
Sep 30 21:03:42 compute-1 sudo[139544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:42 compute-1 python3.9[139546]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:42 compute-1 sudo[139544]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:43 compute-1 sudo[139699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irvyrulickjdbtgeqrrkqputhozwmqup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266222.7343457-1193-214985289089374/AnsiballZ_systemd.py'
Sep 30 21:03:43 compute-1 sudo[139699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:43 compute-1 python3.9[139701]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:43 compute-1 sudo[139699]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:43 compute-1 sudo[139854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbfnytfhweobwjcndbaaivpksqexhtgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266223.5427184-1193-69357606443736/AnsiballZ_systemd.py'
Sep 30 21:03:43 compute-1 sudo[139854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:44 compute-1 python3.9[139856]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:44 compute-1 sudo[139854]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:44 compute-1 sudo[140009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azrxygbhikjzmxuhameaqkjxyyxljwfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266224.2887099-1193-202246027864687/AnsiballZ_systemd.py'
Sep 30 21:03:44 compute-1 sudo[140009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:44 compute-1 python3.9[140011]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:46 compute-1 sudo[140009]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:46 compute-1 sudo[140164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skyardumdrfltjeuyjmmvwhtqnkuqbdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266226.166118-1193-35447732171268/AnsiballZ_systemd.py'
Sep 30 21:03:46 compute-1 sudo[140164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:46 compute-1 python3.9[140166]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:46 compute-1 sudo[140164]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:47 compute-1 sudo[140319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpnhxjqxqgwtxfhktuaqydoppzequbvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266227.0622752-1193-47295998295069/AnsiballZ_systemd.py'
Sep 30 21:03:47 compute-1 sudo[140319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:47 compute-1 python3.9[140321]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:47 compute-1 sudo[140319]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:48 compute-1 sudo[140474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxgbbruiavshuhhpdaemuggnmyxephqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266227.9146173-1193-76141868288674/AnsiballZ_systemd.py'
Sep 30 21:03:48 compute-1 sudo[140474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:48 compute-1 python3.9[140476]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:48 compute-1 sudo[140474]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:49 compute-1 sudo[140629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxtiflslvlvdilggzqsbwfipooisyhek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266228.7673666-1193-275901538095604/AnsiballZ_systemd.py'
Sep 30 21:03:49 compute-1 sudo[140629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:49 compute-1 python3.9[140631]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:49 compute-1 sudo[140629]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:50 compute-1 sudo[140784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbtllqdfxruhoaksxtjxdqtyrssvpbba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266229.6845675-1193-95850412466841/AnsiballZ_systemd.py'
Sep 30 21:03:50 compute-1 sudo[140784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:50 compute-1 python3.9[140786]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:50 compute-1 sudo[140784]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:51 compute-1 sudo[140939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urcbuefljpkjlhaldxqlkvuonhvqqhcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266230.9927282-1499-184501346990571/AnsiballZ_file.py'
Sep 30 21:03:51 compute-1 sudo[140939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:51 compute-1 python3.9[140941]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:51 compute-1 sudo[140939]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:52 compute-1 sudo[141091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hemhavhpmzxduptfupnefontnihladsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266231.7753377-1499-14978192461861/AnsiballZ_file.py'
Sep 30 21:03:52 compute-1 sudo[141091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:52 compute-1 python3.9[141093]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:52 compute-1 sudo[141091]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:52 compute-1 sudo[141243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezkegtabhpbvqnxwzipkgiexnrnckohn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266232.4155073-1499-197262427933035/AnsiballZ_file.py'
Sep 30 21:03:52 compute-1 sudo[141243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:52 compute-1 python3.9[141245]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:52 compute-1 sudo[141243]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:53 compute-1 sudo[141395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pprmhjwnvbdtfealqaggputinjeuobkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266233.1416345-1499-8718033872154/AnsiballZ_file.py'
Sep 30 21:03:53 compute-1 sudo[141395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:53 compute-1 python3.9[141397]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:53 compute-1 sudo[141395]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:54 compute-1 sudo[141547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riuqxhxdsokkzmgwnzdrihdhixqvjxkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266233.8190458-1499-90343974935283/AnsiballZ_file.py'
Sep 30 21:03:54 compute-1 sudo[141547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:54 compute-1 python3.9[141549]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:54 compute-1 sudo[141547]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:54 compute-1 sudo[141699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxtcerkleggmvdivlpluxgofxksoigcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266234.619373-1499-193738693404836/AnsiballZ_file.py'
Sep 30 21:03:54 compute-1 sudo[141699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:55 compute-1 python3.9[141701]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:55 compute-1 sudo[141699]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:56 compute-1 sudo[141851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lewrncyryxjrfloenerlvmghyfmnasry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266235.8728929-1628-44853276038385/AnsiballZ_stat.py'
Sep 30 21:03:56 compute-1 sudo[141851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:56 compute-1 python3.9[141853]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:03:56 compute-1 sudo[141851]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:57 compute-1 sudo[141976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vskzftjjgtbnvgqsfmkzbvjqrnvamban ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266235.8728929-1628-44853276038385/AnsiballZ_copy.py'
Sep 30 21:03:57 compute-1 sudo[141976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:57 compute-1 python3.9[141978]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266235.8728929-1628-44853276038385/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:03:57 compute-1 sudo[141976]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:57 compute-1 sudo[142128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkwrovychmuwvvwgfdqdwwrwmbcxgvov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266237.5180933-1628-42527434831235/AnsiballZ_stat.py'
Sep 30 21:03:57 compute-1 sudo[142128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:57 compute-1 python3.9[142130]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:03:57 compute-1 sudo[142128]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:58 compute-1 sudo[142253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjxeeumtbykfhpfoqrvftrqavggjpbll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266237.5180933-1628-42527434831235/AnsiballZ_copy.py'
Sep 30 21:03:58 compute-1 sudo[142253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:58 compute-1 python3.9[142255]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266237.5180933-1628-42527434831235/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:03:58 compute-1 sudo[142253]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:58 compute-1 sudo[142405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptitjdrfkdspxgjfkdvcpnicndpwuieh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266238.6026502-1628-35789185335840/AnsiballZ_stat.py'
Sep 30 21:03:58 compute-1 sudo[142405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:59 compute-1 python3.9[142407]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:03:59 compute-1 sudo[142405]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:59 compute-1 sudo[142530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pisvjjpgosoycwwggjsxckzyduianunh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266238.6026502-1628-35789185335840/AnsiballZ_copy.py'
Sep 30 21:03:59 compute-1 sudo[142530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:59 compute-1 python3.9[142532]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266238.6026502-1628-35789185335840/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:03:59 compute-1 sudo[142530]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:00 compute-1 sudo[142682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibuexmrggsrcpgamblgzsqjgffecvwpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266239.8305748-1628-235206940471613/AnsiballZ_stat.py'
Sep 30 21:04:00 compute-1 sudo[142682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:00 compute-1 python3.9[142684]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:00 compute-1 sudo[142682]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:00 compute-1 sudo[142807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hixviokivtjgrrpazjlzsqzbtkrdpsgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266239.8305748-1628-235206940471613/AnsiballZ_copy.py'
Sep 30 21:04:00 compute-1 sudo[142807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:00 compute-1 python3.9[142809]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266239.8305748-1628-235206940471613/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:00 compute-1 sudo[142807]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:01 compute-1 sudo[142959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-netjxpuvkgdwqbecuhfvaxwoynjlouzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266241.0888278-1628-218361573208125/AnsiballZ_stat.py'
Sep 30 21:04:01 compute-1 sudo[142959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:01 compute-1 python3.9[142961]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:01 compute-1 sudo[142959]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:02 compute-1 sudo[143084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmsazkqujzowgdcatffoeuxhcbtyrqwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266241.0888278-1628-218361573208125/AnsiballZ_copy.py'
Sep 30 21:04:02 compute-1 sudo[143084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:02 compute-1 python3.9[143086]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266241.0888278-1628-218361573208125/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:02 compute-1 sudo[143084]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:02 compute-1 sudo[143236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyysdmtfizomrzdfvpijsjttqukoqhin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266242.437612-1628-236585014929940/AnsiballZ_stat.py'
Sep 30 21:04:02 compute-1 sudo[143236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:02 compute-1 python3.9[143238]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:02 compute-1 sudo[143236]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:03 compute-1 sudo[143361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwfvuovbnldniygmwgedbvmzhcimviuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266242.437612-1628-236585014929940/AnsiballZ_copy.py'
Sep 30 21:04:03 compute-1 sudo[143361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:03 compute-1 python3.9[143363]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266242.437612-1628-236585014929940/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:03 compute-1 sudo[143361]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:04 compute-1 sudo[143513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsmwwubfaivqiiefemhzhdsofdgdhdvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266243.732791-1628-131600343737272/AnsiballZ_stat.py'
Sep 30 21:04:04 compute-1 sudo[143513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:04 compute-1 podman[143515]: 2025-09-30 21:04:04.225119597 +0000 UTC m=+0.152062598 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:04:04 compute-1 python3.9[143516]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:04 compute-1 sudo[143513]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:04 compute-1 sudo[143662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkyrnvqzujpwalktjrlsecnejvfyccld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266243.732791-1628-131600343737272/AnsiballZ_copy.py'
Sep 30 21:04:04 compute-1 sudo[143662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:04 compute-1 python3.9[143664]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266243.732791-1628-131600343737272/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:04 compute-1 sudo[143662]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:05 compute-1 sudo[143814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drouwrvuidbvsjqoomkjcqqaqbhpntvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266245.028205-1628-26732265884980/AnsiballZ_stat.py'
Sep 30 21:04:05 compute-1 sudo[143814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:05 compute-1 python3.9[143816]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:05 compute-1 sudo[143814]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:05 compute-1 sudo[143939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvkqfftgozvepcachjndpthwuzyxciqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266245.028205-1628-26732265884980/AnsiballZ_copy.py'
Sep 30 21:04:05 compute-1 sudo[143939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:06 compute-1 python3.9[143941]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266245.028205-1628-26732265884980/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:06 compute-1 sudo[143939]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:08 compute-1 sudo[144091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfsxnhltifqpslupgpwyotwnzikrtrcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266248.0098405-1967-221303665270903/AnsiballZ_command.py'
Sep 30 21:04:08 compute-1 sudo[144091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:08 compute-1 python3.9[144093]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Sep 30 21:04:08 compute-1 sudo[144091]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:09 compute-1 sudo[144244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkqkugmefsqiryzhrdhimmalkbarfgms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266248.8439665-1994-82421372966641/AnsiballZ_file.py'
Sep 30 21:04:09 compute-1 sudo[144244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:09 compute-1 python3.9[144246]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:09 compute-1 sudo[144244]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:09 compute-1 sudo[144396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqzvucfbwgcsakmuvounjjmbmglqpiam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266249.6029897-1994-20746793266747/AnsiballZ_file.py'
Sep 30 21:04:09 compute-1 sudo[144396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:10 compute-1 python3.9[144398]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:10 compute-1 sudo[144396]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:10 compute-1 sudo[144548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evcckocprffktxljvbelpyvwtxrxssvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266250.2241833-1994-209655163826983/AnsiballZ_file.py'
Sep 30 21:04:10 compute-1 sudo[144548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:10 compute-1 python3.9[144550]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:10 compute-1 sudo[144548]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:11 compute-1 podman[144650]: 2025-09-30 21:04:11.248437047 +0000 UTC m=+0.074750195 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent)
Sep 30 21:04:11 compute-1 sudo[144718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvxjvdhirnvdejrechmyywrekdvqspsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266250.9806342-1994-142386878916494/AnsiballZ_file.py'
Sep 30 21:04:11 compute-1 sudo[144718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:11 compute-1 python3.9[144720]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:11 compute-1 sudo[144718]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:12 compute-1 sudo[144870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxooozgabrhqjztgbndkqlbbiqjsygum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266251.7752788-1994-166806637248514/AnsiballZ_file.py'
Sep 30 21:04:12 compute-1 sudo[144870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:12 compute-1 python3.9[144872]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:12 compute-1 sudo[144870]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:12 compute-1 sudo[145022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nahslxlkwemdeqgrelwefujkgjctvubs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266252.594157-1994-16959522285277/AnsiballZ_file.py'
Sep 30 21:04:12 compute-1 sudo[145022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:13 compute-1 python3.9[145024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:13 compute-1 sudo[145022]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:13 compute-1 sudo[145174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feijjiwdwzvypmaynjlvyfxyjzzlxfld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266253.3678591-1994-95783926533633/AnsiballZ_file.py'
Sep 30 21:04:13 compute-1 sudo[145174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:13 compute-1 python3.9[145176]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:13 compute-1 sudo[145174]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:14 compute-1 sudo[145326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gufjpzexxksoeglbkqmvxrjggzjmiwnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266254.0887442-1994-17920354536137/AnsiballZ_file.py'
Sep 30 21:04:14 compute-1 sudo[145326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:14 compute-1 python3.9[145328]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:14 compute-1 sudo[145326]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:15 compute-1 sudo[145478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxjebpmdadxssszsydrkurctwrorbzun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266254.7750616-1994-177845490402233/AnsiballZ_file.py'
Sep 30 21:04:15 compute-1 sudo[145478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:15 compute-1 python3.9[145480]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:15 compute-1 sudo[145478]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:15 compute-1 sudo[145630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xizdojslkhaowgkmgjcrnizqyfdxgvua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266255.4475605-1994-235913138499413/AnsiballZ_file.py'
Sep 30 21:04:15 compute-1 sudo[145630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:15 compute-1 python3.9[145632]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:16 compute-1 sudo[145630]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:16 compute-1 sudo[145782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmesqhvbububipfucylalnydalflsrbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266256.3566558-1994-117603969528493/AnsiballZ_file.py'
Sep 30 21:04:16 compute-1 sudo[145782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:16 compute-1 python3.9[145784]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:16 compute-1 sudo[145782]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:17 compute-1 sudo[145934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upuapmfcvqtgdryxiqknyvbfthygxxux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266257.0283356-1994-149101882268156/AnsiballZ_file.py'
Sep 30 21:04:17 compute-1 sudo[145934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:17 compute-1 python3.9[145936]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:17 compute-1 sudo[145934]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:18 compute-1 sudo[146086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxvtipvyxpcysutnseqjcjfmkbauozfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266257.7586567-1994-197037504126227/AnsiballZ_file.py'
Sep 30 21:04:18 compute-1 sudo[146086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:18 compute-1 python3.9[146088]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:18 compute-1 sudo[146086]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:18 compute-1 sudo[146238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-authbnztvpqtrjpdmszdamnvqsvxfcdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266258.3860645-1994-199553051195717/AnsiballZ_file.py'
Sep 30 21:04:18 compute-1 sudo[146238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:18 compute-1 python3.9[146240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:18 compute-1 sudo[146238]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:19 compute-1 sudo[146390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkrzuthkzaoktqtogydjewjkaftksmtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266259.4502325-2291-272678885664643/AnsiballZ_stat.py'
Sep 30 21:04:19 compute-1 sudo[146390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:20 compute-1 python3.9[146392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:20 compute-1 sudo[146390]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:20 compute-1 sudo[146513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnbktcjkggvcgvifvczdklqwanybvcmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266259.4502325-2291-272678885664643/AnsiballZ_copy.py'
Sep 30 21:04:20 compute-1 sudo[146513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:20 compute-1 python3.9[146515]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266259.4502325-2291-272678885664643/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:20 compute-1 sudo[146513]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:21 compute-1 sudo[146665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqyvalnjgrsnjtephopxcqoayvindrdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266260.8164024-2291-151961490478239/AnsiballZ_stat.py'
Sep 30 21:04:21 compute-1 sudo[146665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:21 compute-1 python3.9[146667]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:21 compute-1 sudo[146665]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:21 compute-1 sudo[146788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsixzkqcbovguycnmdgzqrjbnshwyvvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266260.8164024-2291-151961490478239/AnsiballZ_copy.py'
Sep 30 21:04:21 compute-1 sudo[146788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:22 compute-1 python3.9[146790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266260.8164024-2291-151961490478239/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:22 compute-1 sudo[146788]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:22 compute-1 sudo[146940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbczaqfttdkesaahdcbdrazwpipiztiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266262.2933679-2291-89302839646861/AnsiballZ_stat.py'
Sep 30 21:04:22 compute-1 sudo[146940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:22 compute-1 python3.9[146942]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:22 compute-1 sudo[146940]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:23 compute-1 sudo[147063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnuqlpqpswbulihvzjzfoogbsazzmvhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266262.2933679-2291-89302839646861/AnsiballZ_copy.py'
Sep 30 21:04:23 compute-1 sudo[147063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:23 compute-1 python3.9[147065]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266262.2933679-2291-89302839646861/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:23 compute-1 sudo[147063]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:24 compute-1 sudo[147215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpnbtdfgkzlmbfqvbmkdarovzjdlnyqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266263.5248241-2291-62417887186242/AnsiballZ_stat.py'
Sep 30 21:04:24 compute-1 sudo[147215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:24 compute-1 python3.9[147217]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:24 compute-1 sudo[147215]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:24 compute-1 sudo[147338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbojrysqvgnkzngidfklmpguempzzpty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266263.5248241-2291-62417887186242/AnsiballZ_copy.py'
Sep 30 21:04:24 compute-1 sudo[147338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:24 compute-1 python3.9[147340]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266263.5248241-2291-62417887186242/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:24 compute-1 sudo[147338]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:25 compute-1 sudo[147490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouwictfqxrjzbshgsyeuumqpkxrpeapb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266264.9853046-2291-230005004211009/AnsiballZ_stat.py'
Sep 30 21:04:25 compute-1 sudo[147490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:25 compute-1 python3.9[147492]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:25 compute-1 sudo[147490]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:25 compute-1 sudo[147613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ergmizgfoixstulrntyjwxfepzkbvmur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266264.9853046-2291-230005004211009/AnsiballZ_copy.py'
Sep 30 21:04:25 compute-1 sudo[147613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:26 compute-1 python3.9[147615]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266264.9853046-2291-230005004211009/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:26 compute-1 sudo[147613]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:26 compute-1 sudo[147765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxcycyoighcbhdquxmillfxprpmpegrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266266.2089148-2291-261568420665868/AnsiballZ_stat.py'
Sep 30 21:04:26 compute-1 sudo[147765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:26 compute-1 python3.9[147767]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:26 compute-1 sudo[147765]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:27 compute-1 sudo[147888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljvzrjxmztdvvqejnastgzarhltylocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266266.2089148-2291-261568420665868/AnsiballZ_copy.py'
Sep 30 21:04:27 compute-1 sudo[147888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:27 compute-1 python3.9[147890]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266266.2089148-2291-261568420665868/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:27 compute-1 sudo[147888]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:27 compute-1 sudo[148040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctwddnzkhedzkzrrxvyknnkndvjpuksp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266267.570967-2291-27561007885794/AnsiballZ_stat.py'
Sep 30 21:04:27 compute-1 sudo[148040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:28 compute-1 python3.9[148042]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:28 compute-1 sudo[148040]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:28 compute-1 sudo[148163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwbhhapbmwjtzmxeqacrvmmwgqllzgag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266267.570967-2291-27561007885794/AnsiballZ_copy.py'
Sep 30 21:04:28 compute-1 sudo[148163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:28 compute-1 python3.9[148165]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266267.570967-2291-27561007885794/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:28 compute-1 sudo[148163]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:29 compute-1 sudo[148315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzwrzugybpwatmyslnxjrifixchzmkcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266268.9189582-2291-66804732459570/AnsiballZ_stat.py'
Sep 30 21:04:29 compute-1 sudo[148315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:29 compute-1 python3.9[148317]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:29 compute-1 sudo[148315]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:29 compute-1 sudo[148438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwqkpehroozrhjctkmmfvwbvgaazjujj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266268.9189582-2291-66804732459570/AnsiballZ_copy.py'
Sep 30 21:04:29 compute-1 sudo[148438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:29 compute-1 python3.9[148440]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266268.9189582-2291-66804732459570/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:29 compute-1 sudo[148438]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:30 compute-1 sudo[148590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofmzzcaqmuxindudxbuhrokfofgynnpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266270.079274-2291-92059786689991/AnsiballZ_stat.py'
Sep 30 21:04:30 compute-1 sudo[148590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:30 compute-1 python3.9[148592]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:30 compute-1 sudo[148590]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:30 compute-1 sudo[148713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noddguoyerbxaquksuclfvwfomobvzor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266270.079274-2291-92059786689991/AnsiballZ_copy.py'
Sep 30 21:04:30 compute-1 sudo[148713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:31 compute-1 python3.9[148715]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266270.079274-2291-92059786689991/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:31 compute-1 sudo[148713]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:31 compute-1 sudo[148865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eokgdvpsnihcpnfozndcyjtapaksdnnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266271.3087313-2291-239848790843944/AnsiballZ_stat.py'
Sep 30 21:04:31 compute-1 sudo[148865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:31 compute-1 python3.9[148867]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:31 compute-1 sudo[148865]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:32 compute-1 sudo[148988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzxaguzvaloiqnnofdjpfnvomgcygasj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266271.3087313-2291-239848790843944/AnsiballZ_copy.py'
Sep 30 21:04:32 compute-1 sudo[148988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:32 compute-1 python3.9[148990]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266271.3087313-2291-239848790843944/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:32 compute-1 sudo[148988]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:32 compute-1 sudo[149140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itjozacrudkepmuwpgxafwjovnnaeuwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266272.510763-2291-144504928503271/AnsiballZ_stat.py'
Sep 30 21:04:32 compute-1 sudo[149140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:32 compute-1 python3.9[149142]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:33 compute-1 sudo[149140]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:33 compute-1 sudo[149263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liwlcebkcgodkgotliaipajwpbczhrwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266272.510763-2291-144504928503271/AnsiballZ_copy.py'
Sep 30 21:04:33 compute-1 sudo[149263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:33 compute-1 python3.9[149265]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266272.510763-2291-144504928503271/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:33 compute-1 sudo[149263]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:33 compute-1 sudo[149415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyhglokuknxdrinktlvxsecgwiagwxbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266273.6987107-2291-239329892189031/AnsiballZ_stat.py'
Sep 30 21:04:33 compute-1 sudo[149415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:34 compute-1 python3.9[149417]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:34 compute-1 sudo[149415]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:34 compute-1 sudo[149553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxugcwdwkbiekjvnodwqpwfkpqqhgmez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266273.6987107-2291-239329892189031/AnsiballZ_copy.py'
Sep 30 21:04:34 compute-1 sudo[149553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:34 compute-1 podman[149512]: 2025-09-30 21:04:34.554138439 +0000 UTC m=+0.075105506 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:04:34 compute-1 python3.9[149560]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266273.6987107-2291-239329892189031/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:34 compute-1 sudo[149553]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:35 compute-1 sudo[149717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skiqhzohjsrkgzryddwtyrrjegonqiwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266274.8557303-2291-247027741878889/AnsiballZ_stat.py'
Sep 30 21:04:35 compute-1 sudo[149717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:35 compute-1 python3.9[149719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:35 compute-1 sudo[149717]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:35 compute-1 sudo[149840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whhgwxxgowksyyszrbywpenhnyvwrdsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266274.8557303-2291-247027741878889/AnsiballZ_copy.py'
Sep 30 21:04:35 compute-1 sudo[149840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:35 compute-1 python3.9[149842]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266274.8557303-2291-247027741878889/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:35 compute-1 sudo[149840]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:36 compute-1 sudo[149992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osiwyqryxmnrzpcguoxqemfltuxsoyvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266276.1732533-2291-78529989130102/AnsiballZ_stat.py'
Sep 30 21:04:36 compute-1 sudo[149992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:36 compute-1 python3.9[149994]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:36 compute-1 sudo[149992]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:37 compute-1 sudo[150115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gycxtcuxppkeuxadvzmhpmkowtmeezts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266276.1732533-2291-78529989130102/AnsiballZ_copy.py'
Sep 30 21:04:37 compute-1 sudo[150115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:37 compute-1 python3.9[150117]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266276.1732533-2291-78529989130102/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:37 compute-1 sudo[150115]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:04:38.661 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:04:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:04:38.662 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:04:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:04:38.662 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:04:40 compute-1 python3.9[150267]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:04:41 compute-1 sudo[150420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vygglblvedfujjkjblissmetdfompxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266280.5361674-2909-237817765006813/AnsiballZ_seboolean.py'
Sep 30 21:04:41 compute-1 sudo[150420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:41 compute-1 python3.9[150422]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Sep 30 21:04:42 compute-1 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Sep 30 21:04:42 compute-1 podman[150425]: 2025-09-30 21:04:42.227061142 +0000 UTC m=+0.057430480 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:04:42 compute-1 sudo[150420]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:43 compute-1 sudo[150595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smcbspelzgyyfcxgnhhtuptphkzbcyfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266282.7603104-2933-195009483580495/AnsiballZ_copy.py'
Sep 30 21:04:43 compute-1 sudo[150595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:43 compute-1 python3.9[150597]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:43 compute-1 sudo[150595]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:43 compute-1 sudo[150747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixauauuuexownoznzikreetuluhftkbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266283.4249885-2933-108352290984555/AnsiballZ_copy.py'
Sep 30 21:04:43 compute-1 sudo[150747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:43 compute-1 python3.9[150749]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:43 compute-1 sudo[150747]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:44 compute-1 sudo[150899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hedixrbakqjyckxvsmlnkabulsezklyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266284.09906-2933-270709845909435/AnsiballZ_copy.py'
Sep 30 21:04:44 compute-1 sudo[150899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:44 compute-1 python3.9[150901]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:44 compute-1 sudo[150899]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:45 compute-1 sudo[151051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgweyyqnrvvbiodrtngilmhmqxbikwgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266284.7006607-2933-83353869156593/AnsiballZ_copy.py'
Sep 30 21:04:45 compute-1 sudo[151051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:45 compute-1 python3.9[151053]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:45 compute-1 sudo[151051]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:45 compute-1 sudo[151203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rypdoifzgqzrryweqfjdcsknfrbujkhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266285.4063-2933-252403840012455/AnsiballZ_copy.py'
Sep 30 21:04:45 compute-1 sudo[151203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:46 compute-1 python3.9[151205]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:46 compute-1 sudo[151203]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:47 compute-1 sudo[151355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uldewezouegihxrtnqwhtjkqhvhfpvdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266286.5788581-3041-54814497312001/AnsiballZ_copy.py'
Sep 30 21:04:47 compute-1 sudo[151355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:47 compute-1 python3.9[151357]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:47 compute-1 sudo[151355]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:47 compute-1 sudo[151507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqykkogtpoboejrwovbbajjedeiidfqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266287.4063687-3041-216167046992296/AnsiballZ_copy.py'
Sep 30 21:04:47 compute-1 sudo[151507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:47 compute-1 python3.9[151509]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:47 compute-1 sudo[151507]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:48 compute-1 sudo[151659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kybzbwxkxmivqzipsbtiqzimdpuwfdtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266288.1468282-3041-93845336740812/AnsiballZ_copy.py'
Sep 30 21:04:48 compute-1 sudo[151659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:48 compute-1 python3.9[151661]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:48 compute-1 sudo[151659]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:49 compute-1 sudo[151811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbvrztslqaixykbxplgreikpbulwqznv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266288.894226-3041-119187536363168/AnsiballZ_copy.py'
Sep 30 21:04:49 compute-1 sudo[151811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:49 compute-1 python3.9[151813]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:49 compute-1 sudo[151811]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:49 compute-1 sudo[151963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whmrcxjttwaexqyithchtnrtuaxljxuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266289.6416616-3041-200924833091487/AnsiballZ_copy.py'
Sep 30 21:04:49 compute-1 sudo[151963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:50 compute-1 python3.9[151965]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:50 compute-1 sudo[151963]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:50 compute-1 sudo[152115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvkneugugginupkfrbmwyzqjapvfvuir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266290.4002557-3149-272840043934583/AnsiballZ_systemd.py'
Sep 30 21:04:50 compute-1 sudo[152115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:51 compute-1 python3.9[152117]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:51 compute-1 systemd[1]: Reloading.
Sep 30 21:04:51 compute-1 systemd-rc-local-generator[152146]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:51 compute-1 systemd-sysv-generator[152150]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:51 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Sep 30 21:04:51 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Sep 30 21:04:51 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Sep 30 21:04:51 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Sep 30 21:04:51 compute-1 systemd[1]: Starting libvirt logging daemon...
Sep 30 21:04:51 compute-1 systemd[1]: Started libvirt logging daemon.
Sep 30 21:04:51 compute-1 sudo[152115]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:51 compute-1 sudo[152309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcfxvscjsthvfzvjztwuvipbyrvhpebe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266291.7459273-3149-120273591704438/AnsiballZ_systemd.py'
Sep 30 21:04:51 compute-1 sudo[152309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:52 compute-1 python3.9[152311]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:52 compute-1 systemd[1]: Reloading.
Sep 30 21:04:52 compute-1 systemd-rc-local-generator[152337]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:52 compute-1 systemd-sysv-generator[152342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:52 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Sep 30 21:04:52 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Sep 30 21:04:52 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Sep 30 21:04:52 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Sep 30 21:04:52 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Sep 30 21:04:52 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Sep 30 21:04:52 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 21:04:52 compute-1 systemd[1]: Started libvirt nodedev daemon.
Sep 30 21:04:52 compute-1 sudo[152309]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:53 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Sep 30 21:04:53 compute-1 sudo[152524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnarxcidarzmapetaacgefvssejkrsia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266292.7821045-3149-226234823947103/AnsiballZ_systemd.py'
Sep 30 21:04:53 compute-1 sudo[152524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:53 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Sep 30 21:04:53 compute-1 python3.9[152526]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:53 compute-1 systemd[1]: Reloading.
Sep 30 21:04:53 compute-1 systemd-rc-local-generator[152554]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:53 compute-1 systemd-sysv-generator[152560]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:53 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Sep 30 21:04:53 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Sep 30 21:04:53 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Sep 30 21:04:53 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Sep 30 21:04:53 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Sep 30 21:04:53 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Sep 30 21:04:53 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 21:04:53 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 21:04:53 compute-1 sudo[152524]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:54 compute-1 sudo[152741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqhhgeuubmhwgbahkqfsukobquuczabn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266293.9730232-3149-110414773856577/AnsiballZ_systemd.py'
Sep 30 21:04:54 compute-1 sudo[152741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:54 compute-1 python3.9[152743]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:54 compute-1 systemd[1]: Reloading.
Sep 30 21:04:54 compute-1 systemd-rc-local-generator[152771]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:54 compute-1 systemd-sysv-generator[152777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:54 compute-1 setroubleshoot[152474]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l cd5a3a39-9fd5-4547-af43-e6be279aa196
Sep 30 21:04:54 compute-1 setroubleshoot[152474]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Sep 30 21:04:54 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Sep 30 21:04:54 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Sep 30 21:04:54 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 30 21:04:54 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Sep 30 21:04:54 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Sep 30 21:04:54 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Sep 30 21:04:54 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Sep 30 21:04:55 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Sep 30 21:04:55 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Sep 30 21:04:55 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Sep 30 21:04:55 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 21:04:55 compute-1 systemd[1]: Started libvirt QEMU daemon.
Sep 30 21:04:55 compute-1 sudo[152741]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:55 compute-1 sudo[152954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhwcryiqttmrzqmepdkxxycrgeougfzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266295.2881482-3149-153515238891810/AnsiballZ_systemd.py'
Sep 30 21:04:55 compute-1 sudo[152954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:55 compute-1 python3.9[152956]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:55 compute-1 systemd[1]: Reloading.
Sep 30 21:04:56 compute-1 systemd-sysv-generator[152986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:56 compute-1 systemd-rc-local-generator[152983]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:56 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Sep 30 21:04:56 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Sep 30 21:04:56 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Sep 30 21:04:56 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Sep 30 21:04:56 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Sep 30 21:04:56 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Sep 30 21:04:56 compute-1 systemd[1]: Starting libvirt secret daemon...
Sep 30 21:04:56 compute-1 systemd[1]: Started libvirt secret daemon.
Sep 30 21:04:56 compute-1 sudo[152954]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:58 compute-1 sudo[153163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgmxocljxawztrzliloxwcmsrvojwpvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266297.9881415-3260-73673857483168/AnsiballZ_file.py'
Sep 30 21:04:58 compute-1 sudo[153163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:58 compute-1 python3.9[153165]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:58 compute-1 sudo[153163]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:59 compute-1 sudo[153315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trftyvhpzdtnnjdojriqfnaetzdigmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266298.8444552-3284-45509589213386/AnsiballZ_find.py'
Sep 30 21:04:59 compute-1 sudo[153315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:59 compute-1 python3.9[153317]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:04:59 compute-1 sudo[153315]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:00 compute-1 sudo[153467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhhnxlojmiyslnoyqizrghzuieaxpwln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266300.0098386-3326-145354827352271/AnsiballZ_stat.py'
Sep 30 21:05:00 compute-1 sudo[153467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:00 compute-1 python3.9[153469]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:00 compute-1 sudo[153467]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:01 compute-1 sudo[153590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hauipitzbarkbeeecefgbzezrcrrlzly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266300.0098386-3326-145354827352271/AnsiballZ_copy.py'
Sep 30 21:05:01 compute-1 sudo[153590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:01 compute-1 python3.9[153592]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266300.0098386-3326-145354827352271/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:01 compute-1 sudo[153590]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:02 compute-1 sudo[153742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iumpshctpnxfvlsrapcemsupftfcosyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266301.6471586-3374-139356447676283/AnsiballZ_file.py'
Sep 30 21:05:02 compute-1 sudo[153742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:02 compute-1 python3.9[153744]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:02 compute-1 sudo[153742]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:03 compute-1 sudo[153894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjtgirrnjquyjrlrpaeleswsxsocfymi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266302.677895-3398-57733032955860/AnsiballZ_stat.py'
Sep 30 21:05:03 compute-1 sudo[153894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:03 compute-1 python3.9[153896]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:03 compute-1 sudo[153894]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:03 compute-1 sudo[153972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjfbiznhfhnsurlpjsolwgpdntydimtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266302.677895-3398-57733032955860/AnsiballZ_file.py'
Sep 30 21:05:03 compute-1 sudo[153972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:03 compute-1 python3.9[153974]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:03 compute-1 sudo[153972]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:04 compute-1 sudo[154124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpzrcqblhkskcubhmfsncfqxxzhdlrkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266304.0630739-3434-20223838294047/AnsiballZ_stat.py'
Sep 30 21:05:04 compute-1 sudo[154124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:04 compute-1 python3.9[154126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:04 compute-1 sudo[154124]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:04 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Sep 30 21:05:04 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.067s CPU time.
Sep 30 21:05:04 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Sep 30 21:05:04 compute-1 sudo[154213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebwkytbixfsffwnxqonirwifeivbrjhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266304.0630739-3434-20223838294047/AnsiballZ_file.py'
Sep 30 21:05:04 compute-1 sudo[154213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:05 compute-1 podman[154176]: 2025-09-30 21:05:05.00349128 +0000 UTC m=+0.113394489 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Sep 30 21:05:05 compute-1 python3.9[154219]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mf5cuq09 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:05 compute-1 sudo[154213]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:05 compute-1 sudo[154379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwuljrgclwgahbstykiuguhqenzopjzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266305.4808023-3471-122805064090399/AnsiballZ_stat.py'
Sep 30 21:05:05 compute-1 sudo[154379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:06 compute-1 python3.9[154381]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:06 compute-1 sudo[154379]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:06 compute-1 sudo[154457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzaerxrtqfpdpfhsywdvzbfykcfbzqnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266305.4808023-3471-122805064090399/AnsiballZ_file.py'
Sep 30 21:05:06 compute-1 sudo[154457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:06 compute-1 python3.9[154459]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:06 compute-1 sudo[154457]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:07 compute-1 sudo[154609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyjaielmfiqhpqtarawgksquknnayzdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266306.9484015-3509-116085351306674/AnsiballZ_command.py'
Sep 30 21:05:07 compute-1 sudo[154609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:07 compute-1 python3.9[154611]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:07 compute-1 sudo[154609]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:08 compute-1 sudo[154762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fweamwgqvflmagqkypmzpikzdtpsteuf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266307.9117644-3533-207489492849828/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 21:05:08 compute-1 sudo[154762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:08 compute-1 python3[154764]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 21:05:08 compute-1 sudo[154762]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:09 compute-1 sudo[154914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqfkaoeeyiihirophooxapeshwgiuqyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266308.8731313-3557-54879352893241/AnsiballZ_stat.py'
Sep 30 21:05:09 compute-1 sudo[154914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:09 compute-1 python3.9[154916]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:09 compute-1 sudo[154914]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:09 compute-1 sudo[154992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngrtjzfbtqqgxkhuupjnzzfrmgwenjii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266308.8731313-3557-54879352893241/AnsiballZ_file.py'
Sep 30 21:05:09 compute-1 sudo[154992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:09 compute-1 python3.9[154994]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:09 compute-1 sudo[154992]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:10 compute-1 sudo[155144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcttfeltmbleozyikslyebywfmgrdiqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266310.3463056-3593-130605470828356/AnsiballZ_stat.py'
Sep 30 21:05:10 compute-1 sudo[155144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:10 compute-1 python3.9[155146]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:10 compute-1 sudo[155144]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:11 compute-1 sudo[155222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahfkynwyczngenjprebhhliduwuonrnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266310.3463056-3593-130605470828356/AnsiballZ_file.py'
Sep 30 21:05:11 compute-1 sudo[155222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:11 compute-1 python3.9[155224]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:11 compute-1 sudo[155222]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:12 compute-1 sudo[155374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eidseorfjcrecuagmnsawjkkhgtdocqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266311.684875-3629-242073571916720/AnsiballZ_stat.py'
Sep 30 21:05:12 compute-1 sudo[155374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:12 compute-1 python3.9[155376]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:12 compute-1 sudo[155374]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:12 compute-1 sudo[155462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qseekoncoffeputbaiiphzyuaxluuaup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266311.684875-3629-242073571916720/AnsiballZ_file.py'
Sep 30 21:05:12 compute-1 sudo[155462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:12 compute-1 podman[155426]: 2025-09-30 21:05:12.6642596 +0000 UTC m=+0.079342523 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:05:12 compute-1 python3.9[155470]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:12 compute-1 sudo[155462]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:13 compute-1 sudo[155624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntbwognxfaojtjxgpbmsafdyxylgjip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266313.1190693-3665-123038352126841/AnsiballZ_stat.py'
Sep 30 21:05:13 compute-1 sudo[155624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:13 compute-1 python3.9[155626]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:13 compute-1 sudo[155624]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:13 compute-1 sudo[155702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceglmhnrwyyncctqldymhsuichxgjwei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266313.1190693-3665-123038352126841/AnsiballZ_file.py'
Sep 30 21:05:13 compute-1 sudo[155702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:14 compute-1 python3.9[155704]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:14 compute-1 sudo[155702]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:14 compute-1 sudo[155854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbqhionrmxmullowwymgzpesycpqwnjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266314.3546743-3701-83928791097463/AnsiballZ_stat.py'
Sep 30 21:05:14 compute-1 sudo[155854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:14 compute-1 python3.9[155856]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:14 compute-1 sudo[155854]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:15 compute-1 sudo[155979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prvwthaqdaxqbmigbufplfbgcqtdrzoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266314.3546743-3701-83928791097463/AnsiballZ_copy.py'
Sep 30 21:05:15 compute-1 sudo[155979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:15 compute-1 python3.9[155981]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266314.3546743-3701-83928791097463/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:15 compute-1 sudo[155979]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:16 compute-1 sudo[156131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrofahoahzwotylwsqcgocofhkysuvjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266315.932086-3746-166070031769179/AnsiballZ_file.py'
Sep 30 21:05:16 compute-1 sudo[156131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:16 compute-1 python3.9[156133]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:16 compute-1 sudo[156131]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:17 compute-1 sudo[156283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egswtihbcjxsopfxfjxswytoabwxfrpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266316.825667-3770-178038465445599/AnsiballZ_command.py'
Sep 30 21:05:17 compute-1 sudo[156283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:17 compute-1 python3.9[156285]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:17 compute-1 sudo[156283]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:18 compute-1 sudo[156438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrbxhkxiuxnehbdexrptlglyyrisgxvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266317.6999726-3795-26059594167004/AnsiballZ_blockinfile.py'
Sep 30 21:05:18 compute-1 sudo[156438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:18 compute-1 python3.9[156440]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:18 compute-1 sudo[156438]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:19 compute-1 sudo[156590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlmrgdxwtbtdxkcetkpnyajjhntvugnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266318.8544517-3821-117927963605673/AnsiballZ_command.py'
Sep 30 21:05:19 compute-1 sudo[156590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:19 compute-1 python3.9[156592]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:19 compute-1 sudo[156590]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:19 compute-1 sshd-session[156593]: Invalid user ubuntu from 164.92.202.181 port 55886
Sep 30 21:05:19 compute-1 sshd-session[156593]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:05:19 compute-1 sshd-session[156593]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=164.92.202.181
Sep 30 21:05:20 compute-1 sudo[156745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khaibkgiwfinwpyqgjgnthckbzwherbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266319.7344906-3845-95562608905328/AnsiballZ_stat.py'
Sep 30 21:05:20 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:05:20 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:05:20 compute-1 sudo[156745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:20 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:05:20 compute-1 python3.9[156747]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:20 compute-1 sudo[156745]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:20 compute-1 sudo[156900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqqngksjsksipqrrzdyzgcyacivpmzzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266320.566761-3869-68490053235721/AnsiballZ_command.py'
Sep 30 21:05:20 compute-1 sudo[156900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:21 compute-1 python3.9[156902]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:21 compute-1 sudo[156900]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:21 compute-1 sshd-session[156593]: Failed password for invalid user ubuntu from 164.92.202.181 port 55886 ssh2
Sep 30 21:05:21 compute-1 sudo[157055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfqmntmkofyebojlnaoflylgkpnwqqha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266321.4316504-3894-156083345680258/AnsiballZ_file.py'
Sep 30 21:05:21 compute-1 sudo[157055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:21 compute-1 python3.9[157057]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:21 compute-1 sudo[157055]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:22 compute-1 sudo[157207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzbcxywouohyywelykelxkctquptolbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266322.2607987-3917-211469440809195/AnsiballZ_stat.py'
Sep 30 21:05:22 compute-1 sudo[157207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:22 compute-1 sshd-session[156593]: Received disconnect from 164.92.202.181 port 55886:11:  [preauth]
Sep 30 21:05:22 compute-1 sshd-session[156593]: Disconnected from invalid user ubuntu 164.92.202.181 port 55886 [preauth]
Sep 30 21:05:22 compute-1 python3.9[157209]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:22 compute-1 sudo[157207]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:23 compute-1 sudo[157330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upitfgnogesovjbxllgczhhkettrcamc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266322.2607987-3917-211469440809195/AnsiballZ_copy.py'
Sep 30 21:05:23 compute-1 sudo[157330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:23 compute-1 python3.9[157332]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266322.2607987-3917-211469440809195/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:23 compute-1 sudo[157330]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:24 compute-1 sudo[157482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aojpqdoesaetjrxawvfnwtnaalkrozog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266323.7857451-3962-101347062633746/AnsiballZ_stat.py'
Sep 30 21:05:24 compute-1 sudo[157482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:24 compute-1 python3.9[157484]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:24 compute-1 sudo[157482]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:24 compute-1 sudo[157605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mybujumviadqnnhvtvozuhypbhpfekmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266323.7857451-3962-101347062633746/AnsiballZ_copy.py'
Sep 30 21:05:24 compute-1 sudo[157605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:25 compute-1 python3.9[157607]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266323.7857451-3962-101347062633746/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:25 compute-1 sudo[157605]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:25 compute-1 sudo[157757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adygsrxnbsjwlqgpadlghnlooykhxagf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266325.3399086-4007-795194336950/AnsiballZ_stat.py'
Sep 30 21:05:25 compute-1 sudo[157757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:25 compute-1 python3.9[157759]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:25 compute-1 sudo[157757]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:26 compute-1 sudo[157880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gntcacufunsxlpllrsmlpomhjuiaphal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266325.3399086-4007-795194336950/AnsiballZ_copy.py'
Sep 30 21:05:26 compute-1 sudo[157880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:26 compute-1 python3.9[157882]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266325.3399086-4007-795194336950/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:26 compute-1 sudo[157880]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:27 compute-1 sudo[158032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkwplxyqbhprdviulzdifhongfikzcll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266326.8822145-4052-196333365744347/AnsiballZ_systemd.py'
Sep 30 21:05:27 compute-1 sudo[158032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:27 compute-1 python3.9[158034]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:05:27 compute-1 systemd[1]: Reloading.
Sep 30 21:05:27 compute-1 systemd-rc-local-generator[158057]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:27 compute-1 systemd-sysv-generator[158060]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:27 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Sep 30 21:05:28 compute-1 sudo[158032]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:28 compute-1 sudo[158222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgbwkixnrvrhlixnetnbkrccvlxzsfwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266328.2144887-4077-47700222733146/AnsiballZ_systemd.py'
Sep 30 21:05:28 compute-1 sudo[158222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:28 compute-1 python3.9[158224]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 21:05:28 compute-1 systemd[1]: Reloading.
Sep 30 21:05:29 compute-1 systemd-rc-local-generator[158255]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:29 compute-1 systemd-sysv-generator[158258]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:29 compute-1 systemd[1]: Reloading.
Sep 30 21:05:29 compute-1 systemd-rc-local-generator[158293]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:29 compute-1 systemd-sysv-generator[158296]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:29 compute-1 sudo[158222]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:29 compute-1 sshd-session[103983]: Connection closed by 192.168.122.30 port 55814
Sep 30 21:05:29 compute-1 sshd-session[103980]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:05:29 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Sep 30 21:05:29 compute-1 systemd[1]: session-24.scope: Consumed 3min 26.655s CPU time.
Sep 30 21:05:29 compute-1 systemd-logind[793]: Session 24 logged out. Waiting for processes to exit.
Sep 30 21:05:29 compute-1 systemd-logind[793]: Removed session 24.
Sep 30 21:05:35 compute-1 podman[158322]: 2025-09-30 21:05:35.337106795 +0000 UTC m=+0.169763066 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:05:35 compute-1 sshd-session[158348]: Accepted publickey for zuul from 192.168.122.30 port 42280 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:05:35 compute-1 systemd-logind[793]: New session 25 of user zuul.
Sep 30 21:05:35 compute-1 systemd[1]: Started Session 25 of User zuul.
Sep 30 21:05:35 compute-1 sshd-session[158348]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:05:36 compute-1 python3.9[158501]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:05:37 compute-1 sudo[158655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwmtqgvtowbprdtkhrgksduupidqqgmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266337.505399-68-218204418426339/AnsiballZ_file.py'
Sep 30 21:05:37 compute-1 sudo[158655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:38 compute-1 python3.9[158657]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:38 compute-1 sudo[158655]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:38 compute-1 sudo[158807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxcirxmqjdcvfehuodveakngbsqzrvjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266338.2349296-68-146965252311398/AnsiballZ_file.py'
Sep 30 21:05:38 compute-1 sudo[158807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:05:38.663 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:05:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:05:38.663 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:05:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:05:38.663 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:05:38 compute-1 python3.9[158809]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:38 compute-1 sudo[158807]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:39 compute-1 sudo[158959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbhvsfplyjdxlfnakncanmnxxaeowpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266338.8500724-68-63736679670601/AnsiballZ_file.py'
Sep 30 21:05:39 compute-1 sudo[158959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:39 compute-1 python3.9[158961]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:39 compute-1 sudo[158959]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:39 compute-1 sudo[159111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcrkwkuauilhapfayzxbwgvwafxctlaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266339.4850998-68-106040129791865/AnsiballZ_file.py'
Sep 30 21:05:39 compute-1 sudo[159111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:39 compute-1 python3.9[159113]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 21:05:39 compute-1 sudo[159111]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:40 compute-1 sudo[159263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drasfcrvzbfnvdjeqlyafcepcojipvax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266340.1132615-68-208938776566274/AnsiballZ_file.py'
Sep 30 21:05:40 compute-1 sudo[159263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:40 compute-1 python3.9[159265]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:40 compute-1 sudo[159263]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:42 compute-1 sudo[159415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocnyqrldjbulgbsmydwyyaxswraxffun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266341.5596237-176-244907217976335/AnsiballZ_stat.py'
Sep 30 21:05:42 compute-1 sudo[159415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:42 compute-1 python3.9[159417]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:42 compute-1 sudo[159415]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:43 compute-1 sudo[159582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svkqdmcznuppxnmnrzpyjluuqbchycvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266342.589494-200-231672228857733/AnsiballZ_systemd.py'
Sep 30 21:05:43 compute-1 sudo[159582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:43 compute-1 podman[159543]: 2025-09-30 21:05:43.197560556 +0000 UTC m=+0.057402292 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:05:43 compute-1 python3.9[159590]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:05:43 compute-1 systemd[1]: Reloading.
Sep 30 21:05:43 compute-1 systemd-rc-local-generator[159620]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:43 compute-1 systemd-sysv-generator[159623]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:43 compute-1 sudo[159582]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:44 compute-1 sudo[159779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccihlojlljehzugdvzcgpyjiffrpqsgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266344.1347783-224-55958829772762/AnsiballZ_service_facts.py'
Sep 30 21:05:44 compute-1 sudo[159779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:44 compute-1 python3.9[159781]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:05:44 compute-1 network[159798]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:05:44 compute-1 network[159799]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:05:44 compute-1 network[159800]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:05:47 compute-1 sudo[159779]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:50 compute-1 sudo[160071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abzanuxwrwvjbleekcpdawsgclwqnfkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266350.0995557-248-244954758362841/AnsiballZ_systemd.py'
Sep 30 21:05:50 compute-1 sudo[160071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:50 compute-1 python3.9[160073]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:05:50 compute-1 systemd[1]: Reloading.
Sep 30 21:05:50 compute-1 systemd-rc-local-generator[160106]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:50 compute-1 systemd-sysv-generator[160110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:51 compute-1 sudo[160071]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:52 compute-1 python3.9[160264]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:52 compute-1 unix_chkpwd[160289]: password check failed for user (root)
Sep 30 21:05:52 compute-1 sshd-session[160113]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116  user=root
Sep 30 21:05:52 compute-1 sudo[160415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhywrwkrhxcnrghmfmpbexnxtxfktntw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266352.2177193-299-169461747250485/AnsiballZ_podman_container.py'
Sep 30 21:05:52 compute-1 sudo[160415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:53 compute-1 python3.9[160417]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 21:05:53 compute-1 podman[160452]: 2025-09-30 21:05:53.265896464 +0000 UTC m=+0.047611652 container create 4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:05:53 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:05:53 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.2910] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/21)
Sep 30 21:05:53 compute-1 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 21:05:53 compute-1 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 21:05:53 compute-1 kernel: veth0: entered allmulticast mode
Sep 30 21:05:53 compute-1 kernel: veth0: entered promiscuous mode
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3116] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Sep 30 21:05:53 compute-1 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 21:05:53 compute-1 kernel: podman0: port 1(veth0) entered forwarding state
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3163] device (veth0): carrier: link connected
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3171] device (podman0): carrier: link connected
Sep 30 21:05:53 compute-1 systemd-udevd[160480]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:05:53 compute-1 systemd-udevd[160484]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:05:53 compute-1 podman[160452]: 2025-09-30 21:05:53.241169724 +0000 UTC m=+0.022884952 image pull 4c2cf735485aec82560a51e8042a9e65bbe194a07c6812512d6a5e2ed955852b quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3469] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3475] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3481] device (podman0): Activation: starting connection 'podman0' (bd453741-1ca5-4022-a9a3-b63bd0f7b8a7)
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3486] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3488] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3489] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3491] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 21:05:53 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3764] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3767] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.3777] device (podman0): Activation: successful, device activated.
Sep 30 21:05:53 compute-1 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Sep 30 21:05:53 compute-1 systemd[1]: Started libpod-conmon-4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b.scope.
Sep 30 21:05:53 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:05:53 compute-1 podman[160452]: 2025-09-30 21:05:53.646272951 +0000 UTC m=+0.427988179 container init 4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:05:53 compute-1 podman[160452]: 2025-09-30 21:05:53.654255581 +0000 UTC m=+0.435970759 container start 4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:05:53 compute-1 iscsid_config[160610]: iqn.1994-05.com.redhat:1154e476b8e
Sep 30 21:05:53 compute-1 systemd[1]: libpod-4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b.scope: Deactivated successfully.
Sep 30 21:05:53 compute-1 podman[160452]: 2025-09-30 21:05:53.670335954 +0000 UTC m=+0.452051222 container attach 4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:05:53 compute-1 podman[160452]: 2025-09-30 21:05:53.672149573 +0000 UTC m=+0.453864761 container died 4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:05:53 compute-1 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 21:05:53 compute-1 kernel: veth0 (unregistering): left allmulticast mode
Sep 30 21:05:53 compute-1 kernel: veth0 (unregistering): left promiscuous mode
Sep 30 21:05:53 compute-1 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 21:05:53 compute-1 NetworkManager[51724]: <info>  [1759266353.7898] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:05:54 compute-1 systemd[1]: run-netns-netns\x2dfc6bc864\x2dc17b\x2d0a62\x2d9bbe\x2d37504ebc333c.mount: Deactivated successfully.
Sep 30 21:05:54 compute-1 podman[160452]: 2025-09-30 21:05:54.161561053 +0000 UTC m=+0.943276241 container remove 4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 21:05:54 compute-1 python3.9[160417]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Sep 30 21:05:54 compute-1 systemd[1]: libpod-conmon-4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b.scope: Deactivated successfully.
Sep 30 21:05:54 compute-1 python3.9[160417]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Sep 30 21:05:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-951bc75bf88353526bf919560167642bc33c7f9751956d9fa110f6730d472a07-merged.mount: Deactivated successfully.
Sep 30 21:05:54 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f808c4b7866a30aa59702d36e39cb69e232c25799b81aa1844bf76fb703405b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:05:54 compute-1 sudo[160415]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:54 compute-1 sshd-session[160113]: Failed password for root from 80.94.95.116 port 26994 ssh2
Sep 30 21:05:54 compute-1 sudo[160852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhtamrxcfrplkwpffqkbnxfdygbfjylj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266354.606273-323-72617642786486/AnsiballZ_stat.py'
Sep 30 21:05:54 compute-1 sudo[160852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:55 compute-1 python3.9[160854]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:55 compute-1 sudo[160852]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:55 compute-1 sshd-session[160113]: Connection closed by authenticating user root 80.94.95.116 port 26994 [preauth]
Sep 30 21:05:55 compute-1 sudo[160975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifjvbfbhusjqdjtbeajlmvfujekcrzhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266354.606273-323-72617642786486/AnsiballZ_copy.py'
Sep 30 21:05:55 compute-1 sudo[160975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:55 compute-1 python3.9[160977]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266354.606273-323-72617642786486/.source.iscsi _original_basename=.96a2rxxu follow=False checksum=dd11673f78546109b164188b78d2ca9485135b16 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:55 compute-1 sudo[160975]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:56 compute-1 sudo[161127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbahgcjggbnurevrpmjuwvsvpsipjvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266356.180637-368-231136790161570/AnsiballZ_file.py'
Sep 30 21:05:56 compute-1 sudo[161127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:56 compute-1 python3.9[161129]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:56 compute-1 sudo[161127]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:57 compute-1 python3.9[161279]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:58 compute-1 sudo[161431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrvcjwosqegiapvmijuuiehybrjychsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266357.7866385-419-251105520205798/AnsiballZ_lineinfile.py'
Sep 30 21:05:58 compute-1 sudo[161431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:58 compute-1 python3.9[161433]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:58 compute-1 sudo[161431]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:59 compute-1 sudo[161583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beekexnmcwmqpfixtvdshbjmwxpdelej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266358.892215-446-79643563341806/AnsiballZ_file.py'
Sep 30 21:05:59 compute-1 sudo[161583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:59 compute-1 python3.9[161585]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:59 compute-1 sudo[161583]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:59 compute-1 sudo[161735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaukwyixpnbivtbkkqkidxqtmpkrobta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266359.69594-470-131526320474783/AnsiballZ_stat.py'
Sep 30 21:05:59 compute-1 sudo[161735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:00 compute-1 python3.9[161737]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:00 compute-1 sudo[161735]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:00 compute-1 sudo[161813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eswjbdtftwrfxuijetmcxripzhessxxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266359.69594-470-131526320474783/AnsiballZ_file.py'
Sep 30 21:06:00 compute-1 sudo[161813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:00 compute-1 python3.9[161815]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:00 compute-1 sudo[161813]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:01 compute-1 sudo[161965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyynxskvlgucgicfgxcnbewjgifpyyjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266360.8789108-470-226492933912977/AnsiballZ_stat.py'
Sep 30 21:06:01 compute-1 sudo[161965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:01 compute-1 python3.9[161967]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:01 compute-1 sudo[161965]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:01 compute-1 sudo[162043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdgbmocjulfdujqvfcngtxdzvtiocpne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266360.8789108-470-226492933912977/AnsiballZ_file.py'
Sep 30 21:06:01 compute-1 sudo[162043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:01 compute-1 python3.9[162045]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:01 compute-1 sudo[162043]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:02 compute-1 sudo[162195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adejofrjveteinpuufcajkpalrocopny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266362.2753677-539-264060741780436/AnsiballZ_file.py'
Sep 30 21:06:02 compute-1 sudo[162195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:02 compute-1 python3.9[162197]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:02 compute-1 sudo[162195]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:03 compute-1 sudo[162347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzhslzwwqtonxhtuvjwhwfrlamtwjxlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266363.1053433-563-59308960709303/AnsiballZ_stat.py'
Sep 30 21:06:03 compute-1 sudo[162347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:03 compute-1 python3.9[162349]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:03 compute-1 sudo[162347]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:03 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 21:06:03 compute-1 sudo[162426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccvjdzpnaxxndlnzjqgwdfnsuftufpaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266363.1053433-563-59308960709303/AnsiballZ_file.py'
Sep 30 21:06:03 compute-1 sudo[162426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:04 compute-1 python3.9[162428]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:04 compute-1 sudo[162426]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:04 compute-1 sudo[162578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obdlyputlkthvrejlvvgjbljciwkcclj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266364.4844368-599-252924593663609/AnsiballZ_stat.py'
Sep 30 21:06:04 compute-1 sudo[162578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:04 compute-1 python3.9[162580]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:05 compute-1 sudo[162578]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:05 compute-1 sudo[162656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zugwufjutwywgsfpdfbjwragfmvhejvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266364.4844368-599-252924593663609/AnsiballZ_file.py'
Sep 30 21:06:05 compute-1 sudo[162656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:05 compute-1 python3.9[162658]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:05 compute-1 sudo[162656]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:06 compute-1 sudo[162823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cudvpujomvsdjssbbenvcicmlxrlgadk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266365.8256466-635-131822896768113/AnsiballZ_systemd.py'
Sep 30 21:06:06 compute-1 sudo[162823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:06 compute-1 podman[162782]: 2025-09-30 21:06:06.172107134 +0000 UTC m=+0.095989484 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:06:06 compute-1 python3.9[162829]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:06:06 compute-1 systemd[1]: Reloading.
Sep 30 21:06:06 compute-1 systemd-sysv-generator[162868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:06 compute-1 systemd-rc-local-generator[162865]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:06 compute-1 sudo[162823]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:07 compute-1 sudo[163024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfmadwlvhobpnkrxnesfdliscoenczhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266367.208642-659-177430279180791/AnsiballZ_stat.py'
Sep 30 21:06:07 compute-1 sudo[163024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:07 compute-1 python3.9[163026]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:07 compute-1 sudo[163024]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:07 compute-1 sudo[163102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdhpaeqylfxzicvfrmiuwvkzqcldcbuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266367.208642-659-177430279180791/AnsiballZ_file.py'
Sep 30 21:06:07 compute-1 sudo[163102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:08 compute-1 python3.9[163104]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:08 compute-1 sudo[163102]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:08 compute-1 sudo[163254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiyhkdsrwjqgseughkpvdowyzdhawkgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266368.540518-695-231780237778379/AnsiballZ_stat.py'
Sep 30 21:06:08 compute-1 sudo[163254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:09 compute-1 python3.9[163256]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:09 compute-1 sudo[163254]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:09 compute-1 sudo[163332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtjhfvqsqvxewwffuipgafzswwersbqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266368.540518-695-231780237778379/AnsiballZ_file.py'
Sep 30 21:06:09 compute-1 sudo[163332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:09 compute-1 python3.9[163334]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:09 compute-1 sudo[163332]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:10 compute-1 sudo[163484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlfylkhikefcubqlfzhxczbwclzykuym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266369.9332376-731-9924112634194/AnsiballZ_systemd.py'
Sep 30 21:06:10 compute-1 sudo[163484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:10 compute-1 python3.9[163486]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:06:10 compute-1 systemd[1]: Reloading.
Sep 30 21:06:10 compute-1 systemd-sysv-generator[163515]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:10 compute-1 systemd-rc-local-generator[163511]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:10 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 21:06:10 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 21:06:10 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 21:06:10 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 21:06:10 compute-1 sudo[163484]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:11 compute-1 sudo[163678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhwbkfyubljsawkfalrbnrhmrverwsas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266371.4534786-761-93785712379281/AnsiballZ_file.py'
Sep 30 21:06:11 compute-1 sudo[163678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:12 compute-1 python3.9[163680]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:12 compute-1 sudo[163678]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:12 compute-1 sudo[163830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmkzckhrjcaarifksceyutotrwtpvljh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266372.345718-785-194561870416710/AnsiballZ_stat.py'
Sep 30 21:06:12 compute-1 sudo[163830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:12 compute-1 python3.9[163832]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:12 compute-1 sudo[163830]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:13 compute-1 sudo[163953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnfmdxrleorgcvrvyoybvsaztihrljwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266372.345718-785-194561870416710/AnsiballZ_copy.py'
Sep 30 21:06:13 compute-1 sudo[163953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:13 compute-1 podman[163955]: 2025-09-30 21:06:13.313057458 +0000 UTC m=+0.053467994 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:06:13 compute-1 python3.9[163956]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266372.345718-785-194561870416710/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:13 compute-1 sudo[163953]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:14 compute-1 sudo[164125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fklmtwljrnldmjgplkaycacukfktigle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266374.081412-836-227512349989575/AnsiballZ_file.py'
Sep 30 21:06:14 compute-1 sudo[164125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:14 compute-1 python3.9[164127]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:14 compute-1 sudo[164125]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:15 compute-1 sudo[164277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocgatqimcfgtqtaxvndlqgbcbywfphmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266374.8726492-860-181224511727394/AnsiballZ_stat.py'
Sep 30 21:06:15 compute-1 sudo[164277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:15 compute-1 python3.9[164279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:15 compute-1 sudo[164277]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:15 compute-1 sudo[164400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwkycezmjrhnkkacffheswfkcclzgzgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266374.8726492-860-181224511727394/AnsiballZ_copy.py'
Sep 30 21:06:15 compute-1 sudo[164400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:16 compute-1 python3.9[164402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266374.8726492-860-181224511727394/.source.json _original_basename=.4yby76lj follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:16 compute-1 sudo[164400]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:16 compute-1 sudo[164552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khgimjmllmjkemnkccshuzpjdbvtdxom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266376.3645325-905-35891489761033/AnsiballZ_file.py'
Sep 30 21:06:16 compute-1 sudo[164552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:16 compute-1 python3.9[164554]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:16 compute-1 sudo[164552]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:17 compute-1 sudo[164704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gptnnemdjriazxvxugcxnkpftjxzyqox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266377.161094-929-197352482520108/AnsiballZ_stat.py'
Sep 30 21:06:17 compute-1 sudo[164704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:17 compute-1 sudo[164704]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:18 compute-1 sudo[164827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkgrpowljmgskwljdgvbubacwapwibmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266377.161094-929-197352482520108/AnsiballZ_copy.py'
Sep 30 21:06:18 compute-1 sudo[164827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:18 compute-1 sudo[164827]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:19 compute-1 sudo[164979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggurlbylpbqjafbcjriqqzeoigmkfmup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266378.9249809-980-142478200638277/AnsiballZ_container_config_data.py'
Sep 30 21:06:19 compute-1 sudo[164979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:19 compute-1 python3.9[164981]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Sep 30 21:06:19 compute-1 sudo[164979]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:20 compute-1 sudo[165131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiabdnmaniwbiypdylivkqyxlokrvpon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266379.9129853-1007-257488376733764/AnsiballZ_container_config_hash.py'
Sep 30 21:06:20 compute-1 sudo[165131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:20 compute-1 python3.9[165133]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:06:20 compute-1 sudo[165131]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:21 compute-1 sudo[165283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdgomwjixxlxzrspgpaxsbkunwfjlntz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266381.143005-1034-115771076803398/AnsiballZ_podman_container_info.py'
Sep 30 21:06:21 compute-1 sudo[165283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:21 compute-1 python3.9[165285]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 21:06:22 compute-1 sudo[165283]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:23 compute-1 sudo[165461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdldkdqzpytnquksimtnhjjlwyynigxm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266382.949558-1073-272346563527864/AnsiballZ_edpm_container_manage.py'
Sep 30 21:06:23 compute-1 sudo[165461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:23 compute-1 python3[165463]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:06:23 compute-1 podman[165498]: 2025-09-30 21:06:23.985212227 +0000 UTC m=+0.061531446 container create bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 21:06:23 compute-1 podman[165498]: 2025-09-30 21:06:23.951256032 +0000 UTC m=+0.027575331 image pull 4c2cf735485aec82560a51e8042a9e65bbe194a07c6812512d6a5e2ed955852b quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 21:06:23 compute-1 python3[165463]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 21:06:24 compute-1 sudo[165461]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:24 compute-1 sudo[165686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebdqjuhzjgysncqtibymzqelbokqhfxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266384.4392316-1098-69011860620685/AnsiballZ_stat.py'
Sep 30 21:06:24 compute-1 sudo[165686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:25 compute-1 python3.9[165688]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:25 compute-1 sudo[165686]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:25 compute-1 sudo[165840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrmblnweizmvfozcpuyqsenjqtuhnzzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266385.4734797-1124-81878213570872/AnsiballZ_file.py'
Sep 30 21:06:25 compute-1 sudo[165840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:26 compute-1 python3.9[165842]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:26 compute-1 sudo[165840]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:26 compute-1 sudo[165916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronavyjhjaphoefwppuqtfrnvwophhhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266385.4734797-1124-81878213570872/AnsiballZ_stat.py'
Sep 30 21:06:26 compute-1 sudo[165916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:26 compute-1 python3.9[165918]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:26 compute-1 sudo[165916]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:27 compute-1 sudo[166067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvkjzqhnthtqvuaszsbjdpzmyotkvchs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266386.6353438-1124-1740607578660/AnsiballZ_copy.py'
Sep 30 21:06:27 compute-1 sudo[166067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:27 compute-1 python3.9[166069]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266386.6353438-1124-1740607578660/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:27 compute-1 sudo[166067]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:27 compute-1 sudo[166143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owuuipeeminbuutkvllrtbuipkrgxief ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266386.6353438-1124-1740607578660/AnsiballZ_systemd.py'
Sep 30 21:06:27 compute-1 sudo[166143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:28 compute-1 python3.9[166145]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:06:28 compute-1 systemd[1]: Reloading.
Sep 30 21:06:28 compute-1 systemd-rc-local-generator[166172]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:28 compute-1 systemd-sysv-generator[166175]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:28 compute-1 sudo[166143]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:28 compute-1 sudo[166253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkigytgtlndnfkvibamgabjavknsuggo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266386.6353438-1124-1740607578660/AnsiballZ_systemd.py'
Sep 30 21:06:28 compute-1 sudo[166253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:28 compute-1 python3.9[166255]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:06:29 compute-1 systemd[1]: Reloading.
Sep 30 21:06:29 compute-1 systemd-rc-local-generator[166285]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:29 compute-1 systemd-sysv-generator[166289]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:29 compute-1 systemd[1]: Starting iscsid container...
Sep 30 21:06:29 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:06:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0f082b6b2b496bdba2bf734588be695106dc04482899131352d35580a0a7345/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:06:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0f082b6b2b496bdba2bf734588be695106dc04482899131352d35580a0a7345/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Sep 30 21:06:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0f082b6b2b496bdba2bf734588be695106dc04482899131352d35580a0a7345/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:06:29 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa.
Sep 30 21:06:29 compute-1 podman[166295]: 2025-09-30 21:06:29.60837982 +0000 UTC m=+0.259252892 container init bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:06:29 compute-1 iscsid[166311]: + sudo -E kolla_set_configs
Sep 30 21:06:29 compute-1 podman[166295]: 2025-09-30 21:06:29.63961881 +0000 UTC m=+0.290491852 container start bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:06:29 compute-1 sudo[166317]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:06:29 compute-1 systemd[1]: Created slice User Slice of UID 0.
Sep 30 21:06:29 compute-1 podman[166295]: iscsid
Sep 30 21:06:29 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 21:06:29 compute-1 systemd[1]: Started iscsid container.
Sep 30 21:06:29 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 21:06:29 compute-1 sudo[166253]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:29 compute-1 systemd[1]: Starting User Manager for UID 0...
Sep 30 21:06:29 compute-1 podman[166318]: 2025-09-30 21:06:29.730307638 +0000 UTC m=+0.076707294 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:06:29 compute-1 systemd[1]: bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa-5d5dc59f1658183a.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:06:29 compute-1 systemd[1]: bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa-5d5dc59f1658183a.service: Failed with result 'exit-code'.
Sep 30 21:06:29 compute-1 systemd[166338]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 21:06:29 compute-1 systemd[166338]: Queued start job for default target Main User Target.
Sep 30 21:06:29 compute-1 systemd[166338]: Created slice User Application Slice.
Sep 30 21:06:29 compute-1 systemd[166338]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 21:06:29 compute-1 systemd[166338]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:06:29 compute-1 systemd[166338]: Reached target Paths.
Sep 30 21:06:29 compute-1 systemd[166338]: Reached target Timers.
Sep 30 21:06:29 compute-1 systemd[166338]: Starting D-Bus User Message Bus Socket...
Sep 30 21:06:29 compute-1 systemd[166338]: Starting Create User's Volatile Files and Directories...
Sep 30 21:06:29 compute-1 systemd[166338]: Finished Create User's Volatile Files and Directories.
Sep 30 21:06:29 compute-1 systemd[166338]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:06:29 compute-1 systemd[166338]: Reached target Sockets.
Sep 30 21:06:29 compute-1 systemd[166338]: Reached target Basic System.
Sep 30 21:06:29 compute-1 systemd[166338]: Reached target Main User Target.
Sep 30 21:06:29 compute-1 systemd[166338]: Startup finished in 133ms.
Sep 30 21:06:29 compute-1 systemd[1]: Started User Manager for UID 0.
Sep 30 21:06:29 compute-1 systemd[1]: Started Session c3 of User root.
Sep 30 21:06:29 compute-1 sudo[166317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:06:29 compute-1 iscsid[166311]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:06:29 compute-1 iscsid[166311]: INFO:__main__:Validating config file
Sep 30 21:06:29 compute-1 iscsid[166311]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:06:29 compute-1 iscsid[166311]: INFO:__main__:Writing out command to execute
Sep 30 21:06:29 compute-1 sudo[166317]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:29 compute-1 systemd[1]: session-c3.scope: Deactivated successfully.
Sep 30 21:06:29 compute-1 iscsid[166311]: ++ cat /run_command
Sep 30 21:06:29 compute-1 iscsid[166311]: + CMD='/usr/sbin/iscsid -f'
Sep 30 21:06:29 compute-1 iscsid[166311]: + ARGS=
Sep 30 21:06:29 compute-1 iscsid[166311]: + sudo kolla_copy_cacerts
Sep 30 21:06:29 compute-1 sudo[166381]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:06:30 compute-1 systemd[1]: Started Session c4 of User root.
Sep 30 21:06:30 compute-1 sudo[166381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:06:30 compute-1 sudo[166381]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:30 compute-1 systemd[1]: session-c4.scope: Deactivated successfully.
Sep 30 21:06:30 compute-1 iscsid[166311]: + [[ ! -n '' ]]
Sep 30 21:06:30 compute-1 iscsid[166311]: + . kolla_extend_start
Sep 30 21:06:30 compute-1 iscsid[166311]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Sep 30 21:06:30 compute-1 iscsid[166311]: Running command: '/usr/sbin/iscsid -f'
Sep 30 21:06:30 compute-1 iscsid[166311]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Sep 30 21:06:30 compute-1 iscsid[166311]: + umask 0022
Sep 30 21:06:30 compute-1 iscsid[166311]: + exec /usr/sbin/iscsid -f
Sep 30 21:06:30 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Sep 30 21:06:30 compute-1 python3.9[166517]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:31 compute-1 sudo[166667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqlwiqowivtspjmvflqlhujagmcuuuvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266391.087281-1235-75646462449497/AnsiballZ_file.py'
Sep 30 21:06:31 compute-1 sudo[166667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:31 compute-1 python3.9[166669]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:31 compute-1 sudo[166667]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:32 compute-1 sudo[166819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofuhazdcfppteuqkcgbwbvsarnzbiwha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266392.1580613-1268-8828694891232/AnsiballZ_service_facts.py'
Sep 30 21:06:32 compute-1 sudo[166819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:32 compute-1 python3.9[166821]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:06:32 compute-1 network[166838]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:06:32 compute-1 network[166839]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:06:32 compute-1 network[166840]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:06:36 compute-1 podman[166949]: 2025-09-30 21:06:36.932783507 +0000 UTC m=+0.175760053 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2)
Sep 30 21:06:38 compute-1 sudo[166819]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:06:38.664 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:06:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:06:38.664 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:06:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:06:38.664 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:06:38 compute-1 sudo[167137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqpircesgmlupdadijsxueciednvsauk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266398.6491408-1298-185740283034463/AnsiballZ_file.py'
Sep 30 21:06:38 compute-1 sudo[167137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:39 compute-1 python3.9[167139]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 21:06:39 compute-1 sudo[167137]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:40 compute-1 sudo[167289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dorjhxfduteibdigjsggrxdlsfharayn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266399.4990907-1323-200149727770709/AnsiballZ_modprobe.py'
Sep 30 21:06:40 compute-1 sudo[167289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:40 compute-1 systemd[1]: Stopping User Manager for UID 0...
Sep 30 21:06:40 compute-1 systemd[166338]: Activating special unit Exit the Session...
Sep 30 21:06:40 compute-1 systemd[166338]: Stopped target Main User Target.
Sep 30 21:06:40 compute-1 systemd[166338]: Stopped target Basic System.
Sep 30 21:06:40 compute-1 systemd[166338]: Stopped target Paths.
Sep 30 21:06:40 compute-1 systemd[166338]: Stopped target Sockets.
Sep 30 21:06:40 compute-1 systemd[166338]: Stopped target Timers.
Sep 30 21:06:40 compute-1 systemd[166338]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:06:40 compute-1 systemd[166338]: Closed D-Bus User Message Bus Socket.
Sep 30 21:06:40 compute-1 systemd[166338]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:06:40 compute-1 systemd[166338]: Removed slice User Application Slice.
Sep 30 21:06:40 compute-1 systemd[166338]: Reached target Shutdown.
Sep 30 21:06:40 compute-1 systemd[166338]: Finished Exit the Session.
Sep 30 21:06:40 compute-1 systemd[166338]: Reached target Exit the Session.
Sep 30 21:06:40 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 21:06:40 compute-1 systemd[1]: Stopped User Manager for UID 0.
Sep 30 21:06:40 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 21:06:40 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 21:06:40 compute-1 python3.9[167291]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Sep 30 21:06:40 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 21:06:40 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 21:06:40 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 21:06:40 compute-1 sudo[167289]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:41 compute-1 sudo[167447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqktgnmmobdilgfubkocvenhqaeqfggb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266400.654139-1346-2815906500930/AnsiballZ_stat.py'
Sep 30 21:06:41 compute-1 sudo[167447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:41 compute-1 python3.9[167449]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:41 compute-1 sudo[167447]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:41 compute-1 sudo[167570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbcvsxhplvtxzlalyhlkvppaowtytfcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266400.654139-1346-2815906500930/AnsiballZ_copy.py'
Sep 30 21:06:41 compute-1 sudo[167570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:41 compute-1 python3.9[167572]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266400.654139-1346-2815906500930/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:41 compute-1 sudo[167570]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:42 compute-1 sudo[167722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuznmefzhrjnwocmcopqcqgpoexexlef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266402.3113568-1394-271612209748781/AnsiballZ_lineinfile.py'
Sep 30 21:06:42 compute-1 sudo[167722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:42 compute-1 python3.9[167724]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:42 compute-1 sudo[167722]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:43 compute-1 sudo[167885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqnlrnffqkfkgmmllyfhurwmojejqlok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266403.1098394-1418-209386783240025/AnsiballZ_systemd.py'
Sep 30 21:06:43 compute-1 sudo[167885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:43 compute-1 podman[167848]: 2025-09-30 21:06:43.454138315 +0000 UTC m=+0.064570359 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:06:43 compute-1 python3.9[167891]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:06:43 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 21:06:43 compute-1 systemd[1]: Stopped Load Kernel Modules.
Sep 30 21:06:43 compute-1 systemd[1]: Stopping Load Kernel Modules...
Sep 30 21:06:43 compute-1 systemd[1]: Starting Load Kernel Modules...
Sep 30 21:06:43 compute-1 systemd[1]: Finished Load Kernel Modules.
Sep 30 21:06:43 compute-1 sudo[167885]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:44 compute-1 sudo[168048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wseaswmboghsqefhxadvykdkaglzvumw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266404.2533667-1442-36122691283552/AnsiballZ_file.py'
Sep 30 21:06:44 compute-1 sudo[168048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:44 compute-1 python3.9[168050]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:44 compute-1 sudo[168048]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:45 compute-1 sudo[168200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbubmaouonpxstrfviiemutrbfkrdseg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266405.233354-1469-10214485295744/AnsiballZ_stat.py'
Sep 30 21:06:45 compute-1 sudo[168200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:45 compute-1 python3.9[168202]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:45 compute-1 sudo[168200]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:46 compute-1 sudo[168352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxndocuagkjfhthyzebtekmmumqzmonq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266406.1528096-1496-14440086827443/AnsiballZ_stat.py'
Sep 30 21:06:46 compute-1 sudo[168352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:46 compute-1 python3.9[168354]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:46 compute-1 sudo[168352]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:47 compute-1 sudo[168504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqvqdzohsgepptjoyohicipwcmfjjair ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266407.0764349-1520-81773423874888/AnsiballZ_stat.py'
Sep 30 21:06:47 compute-1 sudo[168504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:47 compute-1 python3.9[168506]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:47 compute-1 sudo[168504]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:48 compute-1 sudo[168627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrpnbitswddumignvoutkhgfmpridaba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266407.0764349-1520-81773423874888/AnsiballZ_copy.py'
Sep 30 21:06:48 compute-1 sudo[168627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:48 compute-1 python3.9[168629]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266407.0764349-1520-81773423874888/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:48 compute-1 sudo[168627]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:49 compute-1 sudo[168779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzcadatmddkkcirykuoffbczcmquxclq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266408.5169935-1565-265053899998178/AnsiballZ_command.py'
Sep 30 21:06:49 compute-1 sudo[168779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:49 compute-1 python3.9[168781]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:06:49 compute-1 sudo[168779]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:49 compute-1 sudo[168932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaeusvvduuscengkfgdvkserfnpddrud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266409.5798361-1589-259669743847987/AnsiballZ_lineinfile.py'
Sep 30 21:06:49 compute-1 sudo[168932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:50 compute-1 python3.9[168934]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:50 compute-1 sudo[168932]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:50 compute-1 sudo[169084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raoadxzonosqdiyoonpvdsdvyubimblb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266410.422916-1613-140823338591329/AnsiballZ_replace.py'
Sep 30 21:06:50 compute-1 sudo[169084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:51 compute-1 python3.9[169086]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:51 compute-1 sudo[169084]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:51 compute-1 sudo[169236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnolpncmxkmgesvvuxdemhsarpdlnqew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266411.4499943-1637-25503276236052/AnsiballZ_replace.py'
Sep 30 21:06:51 compute-1 sudo[169236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:52 compute-1 python3.9[169238]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:52 compute-1 sudo[169236]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:52 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Sep 30 21:06:52 compute-1 sudo[169389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvfkxlrrrtjgxpxuitbamomvdvlmbhcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266412.3274152-1664-41890230022696/AnsiballZ_lineinfile.py'
Sep 30 21:06:52 compute-1 sudo[169389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:52 compute-1 python3.9[169391]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:52 compute-1 sudo[169389]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:53 compute-1 sudo[169541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vglwlevrshfiayhyywbdwaygpiguyivo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266412.9805198-1664-110444241932976/AnsiballZ_lineinfile.py'
Sep 30 21:06:53 compute-1 sudo[169541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:53 compute-1 python3.9[169543]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:53 compute-1 sudo[169541]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:53 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 21:06:53 compute-1 sudo[169694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgfkoujeqstgfrsfsrhkgtjqjmuesoru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266413.5813136-1664-56129208314428/AnsiballZ_lineinfile.py'
Sep 30 21:06:53 compute-1 sudo[169694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:54 compute-1 python3.9[169696]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:54 compute-1 sudo[169694]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:54 compute-1 sudo[169846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vumhbvoklnnnfkblsjbbhzvphxqwjtpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266414.2121413-1664-69847498903532/AnsiballZ_lineinfile.py'
Sep 30 21:06:54 compute-1 sudo[169846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:54 compute-1 python3.9[169848]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:54 compute-1 sudo[169846]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:55 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Sep 30 21:06:55 compute-1 sudo[169999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggmwpybjpexxfdhukefxcglcxxichxes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266415.4019184-1752-12271990711064/AnsiballZ_stat.py'
Sep 30 21:06:55 compute-1 sudo[169999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:55 compute-1 python3.9[170001]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:55 compute-1 sudo[169999]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:56 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 21:06:56 compute-1 sudo[170154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkciqhbmbjyefxexunjtiacoyyorgxch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266416.2299721-1775-125204429860629/AnsiballZ_file.py'
Sep 30 21:06:56 compute-1 sudo[170154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:56 compute-1 python3.9[170156]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:56 compute-1 sudo[170154]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:57 compute-1 sudo[170306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoaexspldafhsattlopncljqulaidoep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266417.2228043-1802-36301555767201/AnsiballZ_file.py'
Sep 30 21:06:57 compute-1 sudo[170306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:57 compute-1 python3.9[170308]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:57 compute-1 sudo[170306]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:58 compute-1 sudo[170458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvubmqunbnhplxiidkvbmxptdblixpol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266418.0482192-1826-21144401855069/AnsiballZ_stat.py'
Sep 30 21:06:58 compute-1 sudo[170458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:58 compute-1 python3.9[170460]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:58 compute-1 sudo[170458]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:58 compute-1 sudo[170536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thidtnrgfktyybkybeuwszjuviostdze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266418.0482192-1826-21144401855069/AnsiballZ_file.py'
Sep 30 21:06:58 compute-1 sudo[170536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:59 compute-1 python3.9[170538]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:59 compute-1 sudo[170536]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:59 compute-1 sudo[170688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgpkshvckgcbesmmcskwmhqqddexosmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266419.2431467-1826-84705218799314/AnsiballZ_stat.py'
Sep 30 21:06:59 compute-1 sudo[170688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:59 compute-1 python3.9[170690]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:59 compute-1 sudo[170688]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:00 compute-1 sudo[170782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qigxqivkrdrunfxhesawdxedjhfqitlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266419.2431467-1826-84705218799314/AnsiballZ_file.py'
Sep 30 21:07:00 compute-1 sudo[170782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:00 compute-1 podman[170740]: 2025-09-30 21:07:00.098503441 +0000 UTC m=+0.062322268 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:07:00 compute-1 python3.9[170787]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:00 compute-1 sudo[170782]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:00 compute-1 sudo[170937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftuiumyrsamspynfjbdlvfkcdkirqvmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266420.643825-1895-62329247039590/AnsiballZ_file.py'
Sep 30 21:07:00 compute-1 sudo[170937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:01 compute-1 python3.9[170939]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:01 compute-1 sudo[170937]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:01 compute-1 sudo[171089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obhhxkjcowrqjbrhhnkpnwrcnpyvmxkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266421.563055-1919-34380599640219/AnsiballZ_stat.py'
Sep 30 21:07:01 compute-1 sudo[171089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:02 compute-1 python3.9[171091]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:02 compute-1 sudo[171089]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:02 compute-1 sudo[171167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xratpxueemsrfxrzvcdfsqrqlnjbcslk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266421.563055-1919-34380599640219/AnsiballZ_file.py'
Sep 30 21:07:02 compute-1 sudo[171167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:02 compute-1 python3.9[171169]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:02 compute-1 sudo[171167]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:03 compute-1 sudo[171319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvytvnmfuvicqfoqfjvqeezrqkfahjyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266422.8449967-1955-255966489173378/AnsiballZ_stat.py'
Sep 30 21:07:03 compute-1 sudo[171319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:03 compute-1 python3.9[171321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:03 compute-1 sudo[171319]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:03 compute-1 sudo[171397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqmydfmmhvwspeufrjxbulntmdvjzmiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266422.8449967-1955-255966489173378/AnsiballZ_file.py'
Sep 30 21:07:03 compute-1 sudo[171397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:03 compute-1 python3.9[171399]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:03 compute-1 sudo[171397]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:04 compute-1 sudo[171549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahyyzycchkfyxbjylqyayqwfapwjydyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266424.2749305-1991-89552181083506/AnsiballZ_systemd.py'
Sep 30 21:07:04 compute-1 sudo[171549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:04 compute-1 python3.9[171551]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:07:04 compute-1 systemd[1]: Reloading.
Sep 30 21:07:04 compute-1 systemd-rc-local-generator[171580]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:04 compute-1 systemd-sysv-generator[171585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:05 compute-1 sudo[171549]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:05 compute-1 sudo[171738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afofagjzzuyjxoroyggzerbvkjnniewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266425.6092148-2015-196150181417248/AnsiballZ_stat.py'
Sep 30 21:07:05 compute-1 sudo[171738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:06 compute-1 python3.9[171740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:06 compute-1 sudo[171738]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:06 compute-1 sudo[171816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpwoalkderxtlitowfakjtziohehkemr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266425.6092148-2015-196150181417248/AnsiballZ_file.py'
Sep 30 21:07:06 compute-1 sudo[171816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:06 compute-1 python3.9[171818]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:06 compute-1 sudo[171816]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:07 compute-1 sudo[171984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlxbcyzplphmwhqarhgcngstcepibwaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266426.8735452-2052-178499008434718/AnsiballZ_stat.py'
Sep 30 21:07:07 compute-1 sudo[171984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:07 compute-1 podman[171942]: 2025-09-30 21:07:07.217709437 +0000 UTC m=+0.099280976 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:07:07 compute-1 python3.9[171990]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:07 compute-1 sudo[171984]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:07 compute-1 sudo[172071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygygtinwftcaibxwdlpusnufjfepemzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266426.8735452-2052-178499008434718/AnsiballZ_file.py'
Sep 30 21:07:07 compute-1 sudo[172071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:07 compute-1 python3.9[172073]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:07 compute-1 sudo[172071]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:08 compute-1 sudo[172223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpowfykqrwaekudgxkaqzfmpvpaytnss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266428.2879105-2087-174071194003446/AnsiballZ_systemd.py'
Sep 30 21:07:08 compute-1 sudo[172223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:08 compute-1 python3.9[172225]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:07:08 compute-1 systemd[1]: Reloading.
Sep 30 21:07:09 compute-1 systemd-rc-local-generator[172249]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:09 compute-1 systemd-sysv-generator[172255]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:09 compute-1 systemd[1]: Starting Create netns directory...
Sep 30 21:07:09 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 21:07:09 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 21:07:09 compute-1 systemd[1]: Finished Create netns directory.
Sep 30 21:07:09 compute-1 sudo[172223]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:09 compute-1 sudo[172416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxultlnnqdfuqlclbqxmdwjvoyitijay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266429.7247093-2117-43316053146752/AnsiballZ_file.py'
Sep 30 21:07:09 compute-1 sudo[172416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:10 compute-1 python3.9[172418]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:10 compute-1 sudo[172416]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:11 compute-1 sudo[172568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuybwndbjpqblrqdzcvnqldcqegxveio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266430.6074204-2141-188201748965204/AnsiballZ_stat.py'
Sep 30 21:07:11 compute-1 sudo[172568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:11 compute-1 python3.9[172570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:11 compute-1 sudo[172568]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:12 compute-1 sudo[172691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djfuppxkinjoezchgdvjhnntgauxxihl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266430.6074204-2141-188201748965204/AnsiballZ_copy.py'
Sep 30 21:07:12 compute-1 sudo[172691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:12 compute-1 python3.9[172693]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266430.6074204-2141-188201748965204/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:12 compute-1 sudo[172691]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:13 compute-1 sudo[172843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzomvjmnkagymboumxnfdgdrrmhenmvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266433.0315552-2192-51536770646146/AnsiballZ_file.py'
Sep 30 21:07:13 compute-1 sudo[172843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:13 compute-1 python3.9[172845]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:13 compute-1 sudo[172843]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:14 compute-1 sudo[173006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urqpdkaxagsddyrqyuopjfwgtsgtovnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266433.8507645-2216-233126059792615/AnsiballZ_stat.py'
Sep 30 21:07:14 compute-1 sudo[173006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:14 compute-1 podman[172969]: 2025-09-30 21:07:14.160209815 +0000 UTC m=+0.055834329 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 21:07:14 compute-1 python3.9[173014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:14 compute-1 sudo[173006]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:15 compute-1 sudo[173137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxmjvpewylfzigdyennyexcwmmhceanw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266433.8507645-2216-233126059792615/AnsiballZ_copy.py'
Sep 30 21:07:15 compute-1 sudo[173137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:15 compute-1 python3.9[173139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266433.8507645-2216-233126059792615/.source.json _original_basename=.ghzp54bq follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:15 compute-1 sudo[173137]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:16 compute-1 sudo[173289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhhuiywyksmvcvmhsyfpwfsnyqexessl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266436.0432706-2261-58631366531857/AnsiballZ_file.py'
Sep 30 21:07:16 compute-1 sudo[173289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:16 compute-1 python3.9[173291]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:16 compute-1 sudo[173289]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:17 compute-1 sudo[173441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmglkltyyqlyhfrxinhyjkzyvnbtmhht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266436.9691942-2285-171185515156323/AnsiballZ_stat.py'
Sep 30 21:07:17 compute-1 sudo[173441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:17 compute-1 sudo[173441]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:17 compute-1 sudo[173564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjaaihucogjexlausdwpncdynpeubgmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266436.9691942-2285-171185515156323/AnsiballZ_copy.py'
Sep 30 21:07:17 compute-1 sudo[173564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:18 compute-1 sudo[173564]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:19 compute-1 sudo[173716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxhuxuvucirbzvjswrtpfbcikdpnybkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266438.94449-2336-83714049337499/AnsiballZ_container_config_data.py'
Sep 30 21:07:19 compute-1 sudo[173716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:19 compute-1 python3.9[173718]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Sep 30 21:07:19 compute-1 sudo[173716]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:20 compute-1 sudo[173868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msjwwuevvtutjfdacuntxezsfvrfazss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266439.8596573-2363-113293936174722/AnsiballZ_container_config_hash.py'
Sep 30 21:07:20 compute-1 sudo[173868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:20 compute-1 python3.9[173870]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:07:20 compute-1 sudo[173868]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:21 compute-1 sudo[174020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfiuibokzujfersgynivyqhzaqvfxhwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266440.7361891-2390-57002568856401/AnsiballZ_podman_container_info.py'
Sep 30 21:07:21 compute-1 sudo[174020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:21 compute-1 python3.9[174022]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 21:07:21 compute-1 sudo[174020]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:22 compute-1 sudo[174198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sszrdvdbhbfsdtmvkkggubmksilgwlqr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266442.4764862-2429-48657073663194/AnsiballZ_edpm_container_manage.py'
Sep 30 21:07:22 compute-1 sudo[174198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:23 compute-1 python3[174200]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:07:23 compute-1 podman[174235]: 2025-09-30 21:07:23.326101269 +0000 UTC m=+0.029458402 image pull 80aeb93432d60c5f52c5325081f51dbf5658fe1615083ed284852e8f6df43250 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Sep 30 21:07:23 compute-1 podman[174235]: 2025-09-30 21:07:23.513467229 +0000 UTC m=+0.216824352 container create 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:07:23 compute-1 python3[174200]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Sep 30 21:07:23 compute-1 sudo[174198]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:24 compute-1 sudo[174422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgzspzdhdyzcuflapzvgvnanalwsycoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266444.1724584-2453-247786540289830/AnsiballZ_stat.py'
Sep 30 21:07:24 compute-1 sudo[174422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:24 compute-1 python3.9[174424]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:07:24 compute-1 sudo[174422]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:25 compute-1 sudo[174576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwxdsxdvdgnoopfzuvtaroktkoxwlxyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266445.2071059-2480-201911945322850/AnsiballZ_file.py'
Sep 30 21:07:25 compute-1 sudo[174576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:25 compute-1 python3.9[174578]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:25 compute-1 sudo[174576]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:25 compute-1 sudo[174652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agwnrhpdcxpnlnaiicndgrdikkssbbps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266445.2071059-2480-201911945322850/AnsiballZ_stat.py'
Sep 30 21:07:25 compute-1 sudo[174652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:26 compute-1 python3.9[174654]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:07:26 compute-1 sudo[174652]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:26 compute-1 sudo[174803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmprnubracpcgjebqfieriuwxsywmoyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266446.2751215-2480-5233725611535/AnsiballZ_copy.py'
Sep 30 21:07:26 compute-1 sudo[174803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:27 compute-1 python3.9[174805]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266446.2751215-2480-5233725611535/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:27 compute-1 sudo[174803]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:27 compute-1 sudo[174879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beegkpcqnclnfnwsfxlkecifzmqnqhtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266446.2751215-2480-5233725611535/AnsiballZ_systemd.py'
Sep 30 21:07:27 compute-1 sudo[174879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:27 compute-1 python3.9[174881]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:07:27 compute-1 systemd[1]: Reloading.
Sep 30 21:07:27 compute-1 systemd-rc-local-generator[174908]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:27 compute-1 systemd-sysv-generator[174913]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:27 compute-1 sudo[174879]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:28 compute-1 sudo[174990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdpityzamhtxnzmvlokpfubruoluehyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266446.2751215-2480-5233725611535/AnsiballZ_systemd.py'
Sep 30 21:07:28 compute-1 sudo[174990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:28 compute-1 python3.9[174992]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:07:28 compute-1 systemd[1]: Reloading.
Sep 30 21:07:28 compute-1 systemd-rc-local-generator[175023]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:28 compute-1 systemd-sysv-generator[175027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:28 compute-1 systemd[1]: Starting multipathd container...
Sep 30 21:07:28 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:07:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1782cc2aeec0ede754b3fa1dea61e7c3a04b8cea63aa9bd1c7fb6d26ebffcf6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1782cc2aeec0ede754b3fa1dea61e7c3a04b8cea63aa9bd1c7fb6d26ebffcf6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:29 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.
Sep 30 21:07:29 compute-1 podman[175033]: 2025-09-30 21:07:29.006527128 +0000 UTC m=+0.123672857 container init 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:07:29 compute-1 multipathd[175048]: + sudo -E kolla_set_configs
Sep 30 21:07:29 compute-1 podman[175033]: 2025-09-30 21:07:29.029704647 +0000 UTC m=+0.146850346 container start 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, config_id=multipathd)
Sep 30 21:07:29 compute-1 sudo[175054]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:07:29 compute-1 sudo[175054]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:29 compute-1 sudo[175054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:29 compute-1 podman[175033]: multipathd
Sep 30 21:07:29 compute-1 systemd[1]: Started multipathd container.
Sep 30 21:07:29 compute-1 multipathd[175048]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:07:29 compute-1 multipathd[175048]: INFO:__main__:Validating config file
Sep 30 21:07:29 compute-1 multipathd[175048]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:07:29 compute-1 multipathd[175048]: INFO:__main__:Writing out command to execute
Sep 30 21:07:29 compute-1 sudo[174990]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:29 compute-1 sudo[175054]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:29 compute-1 multipathd[175048]: ++ cat /run_command
Sep 30 21:07:29 compute-1 multipathd[175048]: + CMD='/usr/sbin/multipathd -d'
Sep 30 21:07:29 compute-1 multipathd[175048]: + ARGS=
Sep 30 21:07:29 compute-1 multipathd[175048]: + sudo kolla_copy_cacerts
Sep 30 21:07:29 compute-1 podman[175055]: 2025-09-30 21:07:29.09918652 +0000 UTC m=+0.058846771 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:07:29 compute-1 sudo[175078]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:07:29 compute-1 sudo[175078]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:29 compute-1 sudo[175078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:29 compute-1 systemd[1]: 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804-77f3256e98f3e238.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:07:29 compute-1 systemd[1]: 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804-77f3256e98f3e238.service: Failed with result 'exit-code'.
Sep 30 21:07:29 compute-1 sudo[175078]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:29 compute-1 multipathd[175048]: + [[ ! -n '' ]]
Sep 30 21:07:29 compute-1 multipathd[175048]: + . kolla_extend_start
Sep 30 21:07:29 compute-1 multipathd[175048]: Running command: '/usr/sbin/multipathd -d'
Sep 30 21:07:29 compute-1 multipathd[175048]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 21:07:29 compute-1 multipathd[175048]: + umask 0022
Sep 30 21:07:29 compute-1 multipathd[175048]: + exec /usr/sbin/multipathd -d
Sep 30 21:07:29 compute-1 multipathd[175048]: 3144.856439 | --------start up--------
Sep 30 21:07:29 compute-1 multipathd[175048]: 3144.856454 | read /etc/multipath.conf
Sep 30 21:07:29 compute-1 multipathd[175048]: 3144.862009 | path checkers start up
Sep 30 21:07:30 compute-1 podman[175112]: 2025-09-30 21:07:30.233248605 +0000 UTC m=+0.081772664 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:07:30 compute-1 python3.9[175259]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:07:31 compute-1 sudo[175411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmsnwyeqrekbqizcrtvkijrwusqveiah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266451.3789804-2588-30997554785038/AnsiballZ_command.py'
Sep 30 21:07:31 compute-1 sudo[175411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:31 compute-1 python3.9[175413]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:07:31 compute-1 sudo[175411]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:32 compute-1 sudo[175576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpaiwlvetfruksneacfxhmygslwkuptj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266452.3137233-2612-244275158256964/AnsiballZ_systemd.py'
Sep 30 21:07:32 compute-1 sudo[175576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:32 compute-1 python3.9[175578]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:07:33 compute-1 systemd[1]: Stopping multipathd container...
Sep 30 21:07:33 compute-1 multipathd[175048]: 3148.821883 | exit (signal)
Sep 30 21:07:33 compute-1 multipathd[175048]: 3148.822717 | --------shut down-------
Sep 30 21:07:33 compute-1 systemd[1]: libpod-37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.scope: Deactivated successfully.
Sep 30 21:07:33 compute-1 podman[175582]: 2025-09-30 21:07:33.120222616 +0000 UTC m=+0.090605316 container died 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:07:33 compute-1 systemd[1]: 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804-77f3256e98f3e238.timer: Deactivated successfully.
Sep 30 21:07:33 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.
Sep 30 21:07:33 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804-userdata-shm.mount: Deactivated successfully.
Sep 30 21:07:33 compute-1 systemd[1]: var-lib-containers-storage-overlay-a1782cc2aeec0ede754b3fa1dea61e7c3a04b8cea63aa9bd1c7fb6d26ebffcf6-merged.mount: Deactivated successfully.
Sep 30 21:07:33 compute-1 podman[175582]: 2025-09-30 21:07:33.207368008 +0000 UTC m=+0.177750688 container cleanup 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=multipathd)
Sep 30 21:07:33 compute-1 podman[175582]: multipathd
Sep 30 21:07:33 compute-1 podman[175611]: multipathd
Sep 30 21:07:33 compute-1 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Sep 30 21:07:33 compute-1 systemd[1]: Stopped multipathd container.
Sep 30 21:07:33 compute-1 systemd[1]: Starting multipathd container...
Sep 30 21:07:33 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:07:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1782cc2aeec0ede754b3fa1dea61e7c3a04b8cea63aa9bd1c7fb6d26ebffcf6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1782cc2aeec0ede754b3fa1dea61e7c3a04b8cea63aa9bd1c7fb6d26ebffcf6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:33 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.
Sep 30 21:07:33 compute-1 podman[175623]: 2025-09-30 21:07:33.420335395 +0000 UTC m=+0.126553217 container init 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:07:33 compute-1 multipathd[175638]: + sudo -E kolla_set_configs
Sep 30 21:07:33 compute-1 sudo[175644]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:07:33 compute-1 podman[175623]: 2025-09-30 21:07:33.456279814 +0000 UTC m=+0.162497586 container start 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:07:33 compute-1 sudo[175644]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:33 compute-1 sudo[175644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:33 compute-1 podman[175623]: multipathd
Sep 30 21:07:33 compute-1 systemd[1]: Started multipathd container.
Sep 30 21:07:33 compute-1 sudo[175576]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:33 compute-1 multipathd[175638]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:07:33 compute-1 multipathd[175638]: INFO:__main__:Validating config file
Sep 30 21:07:33 compute-1 multipathd[175638]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:07:33 compute-1 multipathd[175638]: INFO:__main__:Writing out command to execute
Sep 30 21:07:33 compute-1 sudo[175644]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:33 compute-1 multipathd[175638]: ++ cat /run_command
Sep 30 21:07:33 compute-1 multipathd[175638]: + CMD='/usr/sbin/multipathd -d'
Sep 30 21:07:33 compute-1 multipathd[175638]: + ARGS=
Sep 30 21:07:33 compute-1 multipathd[175638]: + sudo kolla_copy_cacerts
Sep 30 21:07:33 compute-1 podman[175645]: 2025-09-30 21:07:33.538940505 +0000 UTC m=+0.067012849 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 21:07:33 compute-1 sudo[175669]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:07:33 compute-1 systemd[1]: 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804-3c01314d9956afcd.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:07:33 compute-1 sudo[175669]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:33 compute-1 systemd[1]: 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804-3c01314d9956afcd.service: Failed with result 'exit-code'.
Sep 30 21:07:33 compute-1 sudo[175669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:33 compute-1 sudo[175669]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:33 compute-1 multipathd[175638]: + [[ ! -n '' ]]
Sep 30 21:07:33 compute-1 multipathd[175638]: + . kolla_extend_start
Sep 30 21:07:33 compute-1 multipathd[175638]: Running command: '/usr/sbin/multipathd -d'
Sep 30 21:07:33 compute-1 multipathd[175638]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 21:07:33 compute-1 multipathd[175638]: + umask 0022
Sep 30 21:07:33 compute-1 multipathd[175638]: + exec /usr/sbin/multipathd -d
Sep 30 21:07:33 compute-1 multipathd[175638]: 3149.302178 | --------start up--------
Sep 30 21:07:33 compute-1 multipathd[175638]: 3149.302202 | read /etc/multipath.conf
Sep 30 21:07:33 compute-1 multipathd[175638]: 3149.308287 | path checkers start up
Sep 30 21:07:35 compute-1 sudo[175828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfwgjfiwpfpydeaplbwxqztnswscgqea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266455.5351884-2636-77186715862875/AnsiballZ_file.py'
Sep 30 21:07:35 compute-1 sudo[175828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:36 compute-1 python3.9[175830]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:36 compute-1 sudo[175828]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:37 compute-1 sudo[175980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwhzwpmywywwahyqhsvriwgsuohurblt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266456.7677374-2672-143363413970319/AnsiballZ_file.py'
Sep 30 21:07:37 compute-1 sudo[175980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:37 compute-1 python3.9[175982]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 21:07:37 compute-1 sudo[175980]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:37 compute-1 sudo[176144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usyhrumolrcdmjsauycmkykfcbnkyqrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266457.505944-2696-15247729906777/AnsiballZ_modprobe.py'
Sep 30 21:07:37 compute-1 sudo[176144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:37 compute-1 podman[176106]: 2025-09-30 21:07:37.850251223 +0000 UTC m=+0.120064431 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:07:37 compute-1 python3.9[176154]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Sep 30 21:07:37 compute-1 kernel: Key type psk registered
Sep 30 21:07:38 compute-1 sudo[176144]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:38 compute-1 sudo[176322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhmlozzioaotgtntududpvzpeuqqdsmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266458.3320282-2720-202719339034781/AnsiballZ_stat.py'
Sep 30 21:07:38 compute-1 sudo[176322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:07:38.666 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:07:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:07:38.667 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:07:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:07:38.667 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:07:38 compute-1 python3.9[176324]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:38 compute-1 sudo[176322]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:39 compute-1 sudo[176445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kebrcxkrczwwvehzbtluwnfscoultwto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266458.3320282-2720-202719339034781/AnsiballZ_copy.py'
Sep 30 21:07:39 compute-1 sudo[176445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:39 compute-1 python3.9[176447]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266458.3320282-2720-202719339034781/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:39 compute-1 sudo[176445]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:40 compute-1 sudo[176597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awfgihczgtwgijrsoovqbxwtwywtvlgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266459.9818401-2768-266769851666497/AnsiballZ_lineinfile.py'
Sep 30 21:07:40 compute-1 sudo[176597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:40 compute-1 python3.9[176599]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:40 compute-1 sudo[176597]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:41 compute-1 sudo[176749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfpotmlddqsvleddaiftmngvwpzbfei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266460.8468275-2792-118069680446716/AnsiballZ_systemd.py'
Sep 30 21:07:41 compute-1 sudo[176749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:41 compute-1 python3.9[176751]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:07:41 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 21:07:41 compute-1 systemd[1]: Stopped Load Kernel Modules.
Sep 30 21:07:41 compute-1 systemd[1]: Stopping Load Kernel Modules...
Sep 30 21:07:41 compute-1 systemd[1]: Starting Load Kernel Modules...
Sep 30 21:07:41 compute-1 systemd[1]: Finished Load Kernel Modules.
Sep 30 21:07:41 compute-1 sudo[176749]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:42 compute-1 sudo[176905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsjqevobqbavjhydyokrlamiwncyaqll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266462.0116396-2816-153747220070218/AnsiballZ_setup.py'
Sep 30 21:07:42 compute-1 sudo[176905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:42 compute-1 python3.9[176907]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 21:07:42 compute-1 sudo[176905]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:43 compute-1 sudo[176989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utgvoeqvtkgkofxszcaxjqllvyithvvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266462.0116396-2816-153747220070218/AnsiballZ_dnf.py'
Sep 30 21:07:43 compute-1 sudo[176989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:43 compute-1 python3.9[176991]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 21:07:45 compute-1 podman[176993]: 2025-09-30 21:07:45.226342501 +0000 UTC m=+0.073073253 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:07:49 compute-1 systemd[1]: Reloading.
Sep 30 21:07:49 compute-1 systemd-sysv-generator[177047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:49 compute-1 systemd-rc-local-generator[177043]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:50 compute-1 systemd[1]: Reloading.
Sep 30 21:07:50 compute-1 systemd-sysv-generator[177080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:50 compute-1 systemd-rc-local-generator[177076]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:50 compute-1 systemd-logind[793]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 21:07:50 compute-1 systemd-logind[793]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 21:07:50 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 21:07:50 compute-1 systemd[1]: Starting man-db-cache-update.service...
Sep 30 21:07:50 compute-1 systemd[1]: Reloading.
Sep 30 21:07:50 compute-1 systemd-rc-local-generator[177169]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:50 compute-1 systemd-sysv-generator[177173]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:51 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 21:07:51 compute-1 sudo[176989]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:52 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 21:07:52 compute-1 systemd[1]: Finished man-db-cache-update.service.
Sep 30 21:07:52 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.623s CPU time.
Sep 30 21:07:52 compute-1 systemd[1]: run-r3d4e8e6d89c74d2f960ce79de23f9144.service: Deactivated successfully.
Sep 30 21:07:53 compute-1 sudo[178459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtjztxoxcdqkclgimrwnfipaillksynd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266472.9847887-2852-89234301480102/AnsiballZ_file.py'
Sep 30 21:07:53 compute-1 sudo[178459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:53 compute-1 python3.9[178461]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:53 compute-1 sudo[178459]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:54 compute-1 python3.9[178611]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:07:55 compute-1 sudo[178765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujltilxloszkwzmtpqohvygsxvvgifoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266475.2590508-2904-96102916869418/AnsiballZ_file.py'
Sep 30 21:07:55 compute-1 sudo[178765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:55 compute-1 python3.9[178767]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:55 compute-1 sudo[178765]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:57 compute-1 sudo[178917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gddedjklssmcnjmhpwbnyflkyenlpcet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266476.4182665-2937-138356214627517/AnsiballZ_systemd_service.py'
Sep 30 21:07:57 compute-1 sudo[178917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:57 compute-1 python3.9[178919]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:07:57 compute-1 systemd[1]: Reloading.
Sep 30 21:07:57 compute-1 systemd-rc-local-generator[178944]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:57 compute-1 systemd-sysv-generator[178950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:57 compute-1 sudo[178917]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:58 compute-1 python3.9[179104]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:07:58 compute-1 network[179121]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:07:58 compute-1 network[179122]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:07:58 compute-1 network[179123]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:08:00 compute-1 podman[179179]: 2025-09-30 21:08:00.362141877 +0000 UTC m=+0.067710557 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:08:03 compute-1 sudo[179418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtqpfyhihevhxapcszxctoeocflnudlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266483.0191956-2994-143919688345830/AnsiballZ_systemd_service.py'
Sep 30 21:08:03 compute-1 sudo[179418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:03 compute-1 python3.9[179420]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:03 compute-1 sudo[179418]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:03 compute-1 podman[179422]: 2025-09-30 21:08:03.7853641 +0000 UTC m=+0.056109935 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2)
Sep 30 21:08:04 compute-1 sudo[179591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boktzhyxmarossmqagelruppylctjaka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266483.9168441-2994-120680690622036/AnsiballZ_systemd_service.py'
Sep 30 21:08:04 compute-1 sudo[179591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:04 compute-1 python3.9[179593]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:04 compute-1 sudo[179591]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:04 compute-1 sudo[179744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsaovyfopnqayrrmgnolzoydxenwergl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266484.6506553-2994-60592818194027/AnsiballZ_systemd_service.py'
Sep 30 21:08:04 compute-1 sudo[179744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:05 compute-1 python3.9[179746]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:05 compute-1 sudo[179744]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:05 compute-1 sudo[179897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfzeezcunvpxnsmkiqapffjmatbyxtqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266485.4532866-2994-151725233704/AnsiballZ_systemd_service.py'
Sep 30 21:08:05 compute-1 sudo[179897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:06 compute-1 python3.9[179899]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:06 compute-1 sudo[179897]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:06 compute-1 sudo[180050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-najwypvzzjetqelipdcpprjuksxeoypb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266486.2037156-2994-27655396558841/AnsiballZ_systemd_service.py'
Sep 30 21:08:06 compute-1 sudo[180050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:06 compute-1 python3.9[180052]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:06 compute-1 sudo[180050]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:07 compute-1 sudo[180203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjmzrioxlyefdjxhzxglerhpvjamptnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266487.0200894-2994-4784226302424/AnsiballZ_systemd_service.py'
Sep 30 21:08:07 compute-1 sudo[180203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:07 compute-1 python3.9[180205]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:07 compute-1 sudo[180203]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:08 compute-1 sudo[180369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eljbhkmfahilwhqqwqfzglqgbaxphhhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266487.7561903-2994-70900438548777/AnsiballZ_systemd_service.py'
Sep 30 21:08:08 compute-1 sudo[180369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:08 compute-1 podman[180330]: 2025-09-30 21:08:08.068528818 +0000 UTC m=+0.083194306 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:08:08 compute-1 python3.9[180378]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:08 compute-1 sudo[180369]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:08 compute-1 sudo[180535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlmnuykyjarldhezkedsvtpgeieqjbjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266488.613823-2994-132773438237773/AnsiballZ_systemd_service.py'
Sep 30 21:08:08 compute-1 sudo[180535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:09 compute-1 python3.9[180537]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:09 compute-1 sudo[180535]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:10 compute-1 sudo[180688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbbmtkkfzbyqjyhxgkebwtuptkecqsyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266490.7147264-3171-77749939546601/AnsiballZ_file.py'
Sep 30 21:08:10 compute-1 sudo[180688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:11 compute-1 python3.9[180690]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:11 compute-1 sudo[180688]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:11 compute-1 sudo[180840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czegrtskbfdcajvorwpqzpkjkmnffojb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266491.2908766-3171-45123126598396/AnsiballZ_file.py'
Sep 30 21:08:11 compute-1 sudo[180840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:11 compute-1 python3.9[180842]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:11 compute-1 sudo[180840]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:12 compute-1 sudo[180992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkvmwmzrpzgkrktqnfghdeoggqxjjzgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266491.92336-3171-207591989011091/AnsiballZ_file.py'
Sep 30 21:08:12 compute-1 sudo[180992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:12 compute-1 python3.9[180994]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:12 compute-1 sudo[180992]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:12 compute-1 sudo[181144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkkninreocrwbhqtqfbmrymyrpcwojkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266492.580856-3171-85619884436841/AnsiballZ_file.py'
Sep 30 21:08:12 compute-1 sudo[181144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:13 compute-1 python3.9[181146]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:13 compute-1 sudo[181144]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:13 compute-1 sudo[181296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvxnlwcnzjpxfewkqjmzoxknwwflkhot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266493.1968904-3171-6639995259931/AnsiballZ_file.py'
Sep 30 21:08:13 compute-1 sudo[181296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:13 compute-1 python3.9[181298]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:13 compute-1 sudo[181296]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:14 compute-1 sudo[181448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcyuyjycutcruosfqkekrbxklcmumeln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266493.8817968-3171-125328043325629/AnsiballZ_file.py'
Sep 30 21:08:14 compute-1 sudo[181448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:14 compute-1 python3.9[181450]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:14 compute-1 sudo[181448]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:14 compute-1 sudo[181600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfmtemlofbaawgkozgggwyxedroftpgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266494.45118-3171-217092235920686/AnsiballZ_file.py'
Sep 30 21:08:14 compute-1 sudo[181600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:14 compute-1 python3.9[181602]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:14 compute-1 sudo[181600]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:15 compute-1 sudo[181764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohcmtigsxbisimobjssvplthqckdabhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266495.0433905-3171-48073336471384/AnsiballZ_file.py'
Sep 30 21:08:15 compute-1 sudo[181764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:15 compute-1 podman[181726]: 2025-09-30 21:08:15.320070785 +0000 UTC m=+0.046235989 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:08:15 compute-1 python3.9[181772]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:15 compute-1 sudo[181764]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:16 compute-1 sudo[181922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvlqufibjyylbgioctighhfkmimpccfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266496.731946-3342-123095674776895/AnsiballZ_file.py'
Sep 30 21:08:16 compute-1 sudo[181922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:17 compute-1 python3.9[181924]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:17 compute-1 sudo[181922]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:17 compute-1 sudo[182074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juycluylfikzxbneogayugwrekclwmkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266497.3481681-3342-64067698225055/AnsiballZ_file.py'
Sep 30 21:08:17 compute-1 sudo[182074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:17 compute-1 python3.9[182076]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:17 compute-1 sudo[182074]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:18 compute-1 sudo[182226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdhzhazkymtexgtaautzeecjlklipzcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266497.9394484-3342-227458504847565/AnsiballZ_file.py'
Sep 30 21:08:18 compute-1 sudo[182226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:18 compute-1 python3.9[182228]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:18 compute-1 sudo[182226]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:18 compute-1 sudo[182378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tajvgtfymsfmdvruubzowbfuyudzelyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266498.5535069-3342-260338304106696/AnsiballZ_file.py'
Sep 30 21:08:18 compute-1 sudo[182378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:19 compute-1 python3.9[182380]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:19 compute-1 sudo[182378]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:19 compute-1 auditd[700]: Audit daemon rotating log files
Sep 30 21:08:19 compute-1 sudo[182530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziqrgykwkycudohdgyykerrebzmpmnzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266499.297509-3342-186655843458132/AnsiballZ_file.py'
Sep 30 21:08:19 compute-1 sudo[182530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:19 compute-1 python3.9[182532]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:19 compute-1 sudo[182530]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:20 compute-1 sudo[182682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdavlwelxvbyjhdzdsqsqrpzmnkxcbqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266500.04063-3342-154154384295700/AnsiballZ_file.py'
Sep 30 21:08:20 compute-1 sudo[182682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:20 compute-1 python3.9[182684]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:20 compute-1 sudo[182682]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:21 compute-1 sudo[182834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfzhwcssxhjjrijowuhyvxrlodxuvfxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266500.705522-3342-194307712573238/AnsiballZ_file.py'
Sep 30 21:08:21 compute-1 sudo[182834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:21 compute-1 python3.9[182836]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:21 compute-1 sudo[182834]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:21 compute-1 sudo[182986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykjkkxhhmlbaeznjjtafzhjdpmvbwbij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266501.4479291-3342-61626582892881/AnsiballZ_file.py'
Sep 30 21:08:21 compute-1 sudo[182986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:22 compute-1 python3.9[182988]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:22 compute-1 sudo[182986]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:23 compute-1 sudo[183138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jonffkokirvnmiuxrtwsexkowfhbvgrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266503.0818498-3516-100994230387552/AnsiballZ_command.py'
Sep 30 21:08:23 compute-1 sudo[183138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:23 compute-1 python3.9[183140]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:23 compute-1 sudo[183138]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:24 compute-1 python3.9[183292]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:08:25 compute-1 sudo[183442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvexetvzlghpwcrkgjbmihbyihwisimr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266505.0408998-3570-280570240256347/AnsiballZ_systemd_service.py'
Sep 30 21:08:25 compute-1 sudo[183442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:25 compute-1 python3.9[183444]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:08:25 compute-1 systemd[1]: Reloading.
Sep 30 21:08:25 compute-1 systemd-rc-local-generator[183475]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:08:25 compute-1 systemd-sysv-generator[183479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:08:25 compute-1 sudo[183442]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:26 compute-1 sudo[183632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czdqajongdanqxpxiaryizpcpckhyhow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266506.4082582-3594-1260421417329/AnsiballZ_command.py'
Sep 30 21:08:26 compute-1 sudo[183632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:26 compute-1 python3.9[183634]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:27 compute-1 sudo[183632]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:27 compute-1 sudo[183785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsrpthlrrkjqafbvexblhkwetryoueua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266507.1571105-3594-90265722719602/AnsiballZ_command.py'
Sep 30 21:08:27 compute-1 sudo[183785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:27 compute-1 python3.9[183787]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:27 compute-1 sudo[183785]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:28 compute-1 sudo[183938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbepayjayemsssiccvbxicxulseufryy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266507.9614723-3594-76791973516250/AnsiballZ_command.py'
Sep 30 21:08:28 compute-1 sudo[183938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:28 compute-1 python3.9[183940]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:28 compute-1 sudo[183938]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:29 compute-1 sudo[184091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnfuqmymdbkhklynlratxgkyhnfohngk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266508.8277905-3594-50078005809697/AnsiballZ_command.py'
Sep 30 21:08:29 compute-1 sudo[184091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:29 compute-1 python3.9[184093]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:29 compute-1 sudo[184091]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:29 compute-1 sudo[184244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzsmdjbujynawhuylcmghzuhuxizvpwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266509.5356805-3594-90967997089910/AnsiballZ_command.py'
Sep 30 21:08:29 compute-1 sudo[184244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:30 compute-1 python3.9[184246]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:30 compute-1 sudo[184244]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:30 compute-1 sudo[184409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atuggbekhhqnhbayoyxqauhmfjqqinuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266510.1716921-3594-157810074623785/AnsiballZ_command.py'
Sep 30 21:08:30 compute-1 sudo[184409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:30 compute-1 podman[184371]: 2025-09-30 21:08:30.490138117 +0000 UTC m=+0.071666064 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 21:08:30 compute-1 python3.9[184414]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:30 compute-1 sudo[184409]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:31 compute-1 sudo[184571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beumqclfgkngypdhubnzjdfdvweoxrol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266510.8214185-3594-125114333646076/AnsiballZ_command.py'
Sep 30 21:08:31 compute-1 sudo[184571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:31 compute-1 python3.9[184573]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:31 compute-1 sudo[184571]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:31 compute-1 sudo[184724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cobtbtofuqlkyalhhswyfgmeuvekiqhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266511.4551113-3594-216321016769285/AnsiballZ_command.py'
Sep 30 21:08:31 compute-1 sudo[184724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:31 compute-1 python3.9[184726]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:31 compute-1 sudo[184724]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:34 compute-1 sudo[184893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drfbtmxzcreqlvsociutdgocwhftpwsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266513.9127991-3801-207392574991217/AnsiballZ_file.py'
Sep 30 21:08:34 compute-1 sudo[184893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:34 compute-1 podman[184851]: 2025-09-30 21:08:34.215137574 +0000 UTC m=+0.061601503 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:08:34 compute-1 python3.9[184900]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:34 compute-1 sudo[184893]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:34 compute-1 sudo[185050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oauiejtuiyoifmbvwbkjumxkpsceljjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266514.5940967-3801-215451712702561/AnsiballZ_file.py'
Sep 30 21:08:34 compute-1 sudo[185050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:35 compute-1 python3.9[185052]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:35 compute-1 sudo[185050]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:35 compute-1 sudo[185202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjxyiralkblhevjbwdcdjlxwpnmbovcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266515.341065-3801-240373281138520/AnsiballZ_file.py'
Sep 30 21:08:35 compute-1 sudo[185202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:35 compute-1 python3.9[185204]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:35 compute-1 sudo[185202]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:36 compute-1 sudo[185354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysxhshavvsdkekuvanjswxyskbewcuxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266516.0960321-3867-1132667549279/AnsiballZ_file.py'
Sep 30 21:08:36 compute-1 sudo[185354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:36 compute-1 python3.9[185356]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:36 compute-1 sudo[185354]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:37 compute-1 sudo[185506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibzhqufxbhsbibrtaiqhzgwgpnleavty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266516.8498554-3867-11467857924335/AnsiballZ_file.py'
Sep 30 21:08:37 compute-1 sudo[185506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:37 compute-1 python3.9[185508]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:37 compute-1 sudo[185506]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:37 compute-1 sudo[185658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpujepjxpztswdrgadjkrikghuhoiocu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266517.47152-3867-114095815939408/AnsiballZ_file.py'
Sep 30 21:08:37 compute-1 sudo[185658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:38 compute-1 python3.9[185660]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:38 compute-1 sudo[185658]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:38 compute-1 podman[185682]: 2025-09-30 21:08:38.255679998 +0000 UTC m=+0.091458848 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Sep 30 21:08:38 compute-1 sudo[185836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyozltwxamxenfvxoxwjzwldidowzlpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266518.227445-3867-172820645824358/AnsiballZ_file.py'
Sep 30 21:08:38 compute-1 sudo[185836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:08:38.668 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:08:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:08:38.668 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:08:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:08:38.668 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:08:38 compute-1 python3.9[185838]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:38 compute-1 sudo[185836]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:39 compute-1 sudo[185988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heksbwenveddxmpghufxdxcnlhliavfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266518.9413176-3867-98162167262441/AnsiballZ_file.py'
Sep 30 21:08:39 compute-1 sudo[185988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:39 compute-1 python3.9[185990]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:39 compute-1 sudo[185988]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:39 compute-1 sudo[186140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwymdojfocobactzwnxoxoxkhrddnwng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266519.5643287-3867-157844082103604/AnsiballZ_file.py'
Sep 30 21:08:39 compute-1 sudo[186140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:40 compute-1 python3.9[186142]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:40 compute-1 sudo[186140]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:40 compute-1 sudo[186292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjdefsdainvldhdcvmqoqjeiuckuftzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266520.1649506-3867-188872081931398/AnsiballZ_file.py'
Sep 30 21:08:40 compute-1 sudo[186292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:40 compute-1 python3.9[186294]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:40 compute-1 sudo[186292]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:41 compute-1 sudo[186444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liajfhhblfmhxpoailblgmdcgrmrlvaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266520.7721024-3867-132509958892131/AnsiballZ_file.py'
Sep 30 21:08:41 compute-1 sudo[186444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:41 compute-1 python3.9[186446]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:41 compute-1 sudo[186444]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:41 compute-1 sudo[186596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crdrdlxfoglobkcgyoospnqravbbjftq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266521.3570893-3867-259345856010630/AnsiballZ_file.py'
Sep 30 21:08:41 compute-1 sudo[186596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:41 compute-1 python3.9[186598]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:41 compute-1 sudo[186596]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:46 compute-1 podman[186623]: 2025-09-30 21:08:46.199260529 +0000 UTC m=+0.048298303 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:08:47 compute-1 sudo[186768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyfgksuofxhtxzcnxaawsqtirviaonrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266527.188958-4215-179191988027510/AnsiballZ_getent.py'
Sep 30 21:08:47 compute-1 sudo[186768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:47 compute-1 python3.9[186770]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Sep 30 21:08:47 compute-1 sudo[186768]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:48 compute-1 sudo[186921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oibbnclvyqsrqjhojlzqospwkmpkbdsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266528.1858366-4238-214269767749721/AnsiballZ_group.py'
Sep 30 21:08:48 compute-1 sudo[186921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:48 compute-1 python3.9[186923]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 21:08:48 compute-1 groupadd[186924]: group added to /etc/group: name=nova, GID=42436
Sep 30 21:08:48 compute-1 groupadd[186924]: group added to /etc/gshadow: name=nova
Sep 30 21:08:48 compute-1 groupadd[186924]: new group: name=nova, GID=42436
Sep 30 21:08:48 compute-1 sudo[186921]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:49 compute-1 sudo[187079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcswpwhqfjyixipwivexdtjtrjyvqamn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266529.375481-4262-58414007707294/AnsiballZ_user.py'
Sep 30 21:08:49 compute-1 sudo[187079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:50 compute-1 python3.9[187081]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 21:08:50 compute-1 useradd[187083]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Sep 30 21:08:50 compute-1 useradd[187083]: add 'nova' to group 'libvirt'
Sep 30 21:08:50 compute-1 useradd[187083]: add 'nova' to shadow group 'libvirt'
Sep 30 21:08:50 compute-1 sudo[187079]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:51 compute-1 sshd-session[187114]: Accepted publickey for zuul from 192.168.122.30 port 52556 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:08:51 compute-1 systemd-logind[793]: New session 27 of user zuul.
Sep 30 21:08:51 compute-1 systemd[1]: Started Session 27 of User zuul.
Sep 30 21:08:51 compute-1 sshd-session[187114]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:08:51 compute-1 sshd-session[187117]: Received disconnect from 192.168.122.30 port 52556:11: disconnected by user
Sep 30 21:08:51 compute-1 sshd-session[187117]: Disconnected from user zuul 192.168.122.30 port 52556
Sep 30 21:08:51 compute-1 sshd-session[187114]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:08:51 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Sep 30 21:08:51 compute-1 systemd-logind[793]: Session 27 logged out. Waiting for processes to exit.
Sep 30 21:08:51 compute-1 systemd-logind[793]: Removed session 27.
Sep 30 21:08:52 compute-1 python3.9[187267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:52 compute-1 python3.9[187388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266531.6071198-4337-94456598794303/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:53 compute-1 python3.9[187538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:53 compute-1 python3.9[187614]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:54 compute-1 python3.9[187764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:54 compute-1 python3.9[187885]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266533.8239026-4337-187912633888227/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:55 compute-1 python3.9[188035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:55 compute-1 python3.9[188156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266535.0030196-4337-141587716064670/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:56 compute-1 python3.9[188306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:57 compute-1 python3.9[188427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266536.105466-4337-67442187878033/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:58 compute-1 sudo[188577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apgsygsevphowbylplejhkuhiuifqmcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266538.3821406-4544-53160640799478/AnsiballZ_file.py'
Sep 30 21:08:58 compute-1 sudo[188577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:58 compute-1 python3.9[188579]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:58 compute-1 sudo[188577]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:59 compute-1 sudo[188729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlojvsnjffvxyrihfrordeotnbrcisty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266539.125306-4568-11732217204273/AnsiballZ_copy.py'
Sep 30 21:08:59 compute-1 sudo[188729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:59 compute-1 python3.9[188731]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:59 compute-1 sudo[188729]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:00 compute-1 sudo[188881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eudkeinaauvukmpzuxqnuipwohmpsyqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266539.9887946-4592-12032770681776/AnsiballZ_stat.py'
Sep 30 21:09:00 compute-1 sudo[188881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:00 compute-1 python3.9[188883]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:00 compute-1 sudo[188881]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:01 compute-1 podman[189007]: 2025-09-30 21:09:01.070374275 +0000 UTC m=+0.058522988 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:09:01 compute-1 sudo[189051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nryxezocntllfnixjgiyjnjnscjuevjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266540.7743251-4616-148139949270815/AnsiballZ_stat.py'
Sep 30 21:09:01 compute-1 sudo[189051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:01 compute-1 python3.9[189056]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:01 compute-1 sudo[189051]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:01 compute-1 sudo[189177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixvjgbvbbzepjiybtsrzbgugfqnrgydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266540.7743251-4616-148139949270815/AnsiballZ_copy.py'
Sep 30 21:09:01 compute-1 sudo[189177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:01 compute-1 python3.9[189179]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759266540.7743251-4616-148139949270815/.source _original_basename=.cpx0b6_h follow=False checksum=0f6a28e5cb0ca5d9d8523ed5eeec5b7a0261527e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Sep 30 21:09:01 compute-1 sudo[189177]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:02 compute-1 python3.9[189331]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:03 compute-1 python3.9[189483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:04 compute-1 python3.9[189604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266543.3605561-4694-95083968204308/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:05 compute-1 podman[189728]: 2025-09-30 21:09:05.044930244 +0000 UTC m=+0.055647911 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:09:05 compute-1 python3.9[189772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:05 compute-1 python3.9[189896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266544.7339063-4739-26686091156604/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:06 compute-1 sudo[190046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwayucpqqhafaswdqyzefwbfhdfmwtft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266546.475672-4790-67329286406389/AnsiballZ_container_config_data.py'
Sep 30 21:09:06 compute-1 sudo[190046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:07 compute-1 python3.9[190048]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Sep 30 21:09:07 compute-1 sudo[190046]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:07 compute-1 sudo[190198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdcmkhukxeiawoedvrtkxszwlojxpxbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266547.4166534-4817-69900920876783/AnsiballZ_container_config_hash.py'
Sep 30 21:09:07 compute-1 sudo[190198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:07 compute-1 python3.9[190200]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:09:07 compute-1 sudo[190198]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:08 compute-1 sudo[190362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqxgzykjzanaxonrxyahykyasxrnskzp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266548.4023817-4847-238356083766207/AnsiballZ_edpm_container_manage.py'
Sep 30 21:09:08 compute-1 sudo[190362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:08 compute-1 podman[190324]: 2025-09-30 21:09:08.706428114 +0000 UTC m=+0.074709891 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:09:08 compute-1 python3[190370]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:09:09 compute-1 podman[190414]: 2025-09-30 21:09:09.114061645 +0000 UTC m=+0.051054497 container create e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=nova_compute_init, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true)
Sep 30 21:09:09 compute-1 podman[190414]: 2025-09-30 21:09:09.085188213 +0000 UTC m=+0.022181065 image pull 613e2b735827096139e990f475c5ac5de0e55d8048941a4521c0c17a4351e975 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Sep 30 21:09:09 compute-1 python3[190370]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Sep 30 21:09:09 compute-1 sudo[190362]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:10 compute-1 sudo[190602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-corhlgskohzohmtdjynsmczneqdutnuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266549.6901326-4871-256269515550810/AnsiballZ_stat.py'
Sep 30 21:09:10 compute-1 sudo[190602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:10 compute-1 python3.9[190604]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:10 compute-1 sudo[190602]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:11 compute-1 sudo[190756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwacdsazjmavuaeqlhsbmmvvhlgxxkyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266550.8912535-4907-186930619955784/AnsiballZ_container_config_data.py'
Sep 30 21:09:11 compute-1 sudo[190756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:11 compute-1 python3.9[190758]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Sep 30 21:09:11 compute-1 sudo[190756]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:12 compute-1 sudo[190908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gczgorxbspnqlsmskhzbidqdoqqcurgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266551.8738456-4934-206262845667616/AnsiballZ_container_config_hash.py'
Sep 30 21:09:12 compute-1 sudo[190908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:12 compute-1 python3.9[190910]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:09:12 compute-1 sudo[190908]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:13 compute-1 sudo[191060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pilkbgqiqxeswxdhiesqpzschxckxuxu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266552.9282603-4964-185928872272829/AnsiballZ_edpm_container_manage.py'
Sep 30 21:09:13 compute-1 sudo[191060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:13 compute-1 python3[191062]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:09:13 compute-1 podman[191100]: 2025-09-30 21:09:13.82664652 +0000 UTC m=+0.046475765 container create e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:09:13 compute-1 podman[191100]: 2025-09-30 21:09:13.801833046 +0000 UTC m=+0.021662311 image pull 613e2b735827096139e990f475c5ac5de0e55d8048941a4521c0c17a4351e975 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Sep 30 21:09:13 compute-1 python3[191062]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Sep 30 21:09:13 compute-1 sudo[191060]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:14 compute-1 sudo[191288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szdoaatljwsemcswojijjidkdmzsicyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266554.4164486-4988-99218167377426/AnsiballZ_stat.py'
Sep 30 21:09:14 compute-1 sudo[191288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:14 compute-1 python3.9[191290]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:14 compute-1 sudo[191288]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:15 compute-1 sudo[191442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuelzvptrpeiznoyyugbuaahyxfyxdtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266555.38435-5015-120959227417651/AnsiballZ_file.py'
Sep 30 21:09:15 compute-1 sudo[191442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:15 compute-1 python3.9[191444]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:15 compute-1 sudo[191442]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:16 compute-1 podman[191567]: 2025-09-30 21:09:16.447419202 +0000 UTC m=+0.068442813 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:16 compute-1 sudo[191609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsnmxyygdhtmkkcdpnkzmrsezxjhjwfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266555.9090085-5015-176251507263080/AnsiballZ_copy.py'
Sep 30 21:09:16 compute-1 sudo[191609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:16 compute-1 python3.9[191614]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266555.9090085-5015-176251507263080/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:16 compute-1 sudo[191609]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:16 compute-1 sudo[191688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxxkfkkmpnqtwssnyqgcgrkwbnbywqnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266555.9090085-5015-176251507263080/AnsiballZ_systemd.py'
Sep 30 21:09:16 compute-1 sudo[191688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:17 compute-1 python3.9[191690]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:09:17 compute-1 systemd[1]: Reloading.
Sep 30 21:09:17 compute-1 systemd-sysv-generator[191719]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:17 compute-1 systemd-rc-local-generator[191715]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:17 compute-1 sudo[191688]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:17 compute-1 sudo[191799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rycpxepiqtmzdhdwnipgbvmbwubbyssp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266555.9090085-5015-176251507263080/AnsiballZ_systemd.py'
Sep 30 21:09:17 compute-1 sudo[191799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:18 compute-1 python3.9[191801]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:09:18 compute-1 systemd[1]: Reloading.
Sep 30 21:09:18 compute-1 systemd-sysv-generator[191831]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:18 compute-1 systemd-rc-local-generator[191828]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:18 compute-1 systemd[1]: Starting nova_compute container...
Sep 30 21:09:18 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:09:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-1 podman[191840]: 2025-09-30 21:09:18.622980888 +0000 UTC m=+0.145136617 container init e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 21:09:18 compute-1 podman[191840]: 2025-09-30 21:09:18.628884225 +0000 UTC m=+0.151039934 container start e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, managed_by=edpm_ansible)
Sep 30 21:09:18 compute-1 nova_compute[191855]: + sudo -E kolla_set_configs
Sep 30 21:09:18 compute-1 podman[191840]: nova_compute
Sep 30 21:09:18 compute-1 systemd[1]: Started nova_compute container.
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Validating config file
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying service configuration files
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Deleting /etc/ceph
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Creating directory /etc/ceph
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Writing out command to execute
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:18 compute-1 nova_compute[191855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:18 compute-1 nova_compute[191855]: ++ cat /run_command
Sep 30 21:09:18 compute-1 nova_compute[191855]: + CMD=nova-compute
Sep 30 21:09:18 compute-1 nova_compute[191855]: + ARGS=
Sep 30 21:09:18 compute-1 nova_compute[191855]: + sudo kolla_copy_cacerts
Sep 30 21:09:18 compute-1 sudo[191799]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:18 compute-1 nova_compute[191855]: + [[ ! -n '' ]]
Sep 30 21:09:18 compute-1 nova_compute[191855]: + . kolla_extend_start
Sep 30 21:09:18 compute-1 nova_compute[191855]: Running command: 'nova-compute'
Sep 30 21:09:18 compute-1 nova_compute[191855]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 21:09:18 compute-1 nova_compute[191855]: + umask 0022
Sep 30 21:09:18 compute-1 nova_compute[191855]: + exec nova-compute
Sep 30 21:09:20 compute-1 python3.9[192017]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:20 compute-1 nova_compute[191855]: 2025-09-30 21:09:20.694 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:20 compute-1 nova_compute[191855]: 2025-09-30 21:09:20.695 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:20 compute-1 nova_compute[191855]: 2025-09-30 21:09:20.695 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:20 compute-1 nova_compute[191855]: 2025-09-30 21:09:20.695 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 21:09:20 compute-1 nova_compute[191855]: 2025-09-30 21:09:20.821 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:09:20 compute-1 nova_compute[191855]: 2025-09-30 21:09:20.846 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.508 2 INFO nova.virt.driver [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 21:09:21 compute-1 python3.9[192171]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.616 2 INFO nova.compute.provider_config [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.632 2 DEBUG oslo_concurrency.lockutils [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.632 2 DEBUG oslo_concurrency.lockutils [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.632 2 DEBUG oslo_concurrency.lockutils [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.633 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.633 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.633 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.633 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.633 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.633 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.634 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.634 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.634 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.634 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.634 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.634 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.634 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.635 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.635 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.635 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.635 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.635 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.635 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.635 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.636 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.636 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.636 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.636 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.636 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.636 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.637 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.637 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.637 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.637 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.637 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.637 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.637 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.638 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.638 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.638 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.638 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.638 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.638 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.638 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.639 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.639 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.639 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.639 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.639 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.639 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.640 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.640 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.640 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.640 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.640 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.640 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.641 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.641 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.641 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.641 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.641 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.642 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.642 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.642 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.642 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.642 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.642 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.642 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.643 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.643 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.643 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.643 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.643 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.643 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.644 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.644 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.644 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.644 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.644 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.644 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.645 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.645 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.645 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.645 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.645 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.645 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.645 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.646 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.646 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.646 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.646 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.646 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.646 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.646 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.647 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.647 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.647 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.647 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.647 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.647 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.647 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.648 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.648 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.648 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.648 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.648 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.648 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.648 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.649 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.650 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.650 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.650 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.650 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.650 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.650 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.650 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.651 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.651 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.651 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.651 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.651 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.651 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.651 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.652 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.652 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.652 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.652 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.652 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.652 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.652 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.653 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.653 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.653 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.653 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.653 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.653 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.653 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.654 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.654 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.654 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.654 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.654 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.654 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.654 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.655 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.655 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.655 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.655 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.655 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.655 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.656 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.656 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.656 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.656 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.656 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.656 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.656 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.657 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.657 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.657 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.657 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.657 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.657 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.657 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.658 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.658 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.658 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.658 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.658 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.658 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.658 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.659 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.659 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.659 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.659 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.659 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.659 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.659 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.660 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.660 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.660 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.660 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.660 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.660 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.660 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.661 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.661 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.661 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.661 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.661 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.661 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.661 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.662 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.662 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.662 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.662 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.662 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.662 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.662 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.663 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.663 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.663 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.663 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.663 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.663 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.663 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.664 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.664 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.664 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.664 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.664 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.664 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.664 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.665 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.665 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.665 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.665 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.665 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.665 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.665 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.666 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.666 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.666 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.666 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.666 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.666 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.666 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.705 2 WARNING oslo_config.cfg [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 21:09:21 compute-1 nova_compute[191855]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 21:09:21 compute-1 nova_compute[191855]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 21:09:21 compute-1 nova_compute[191855]: and ``live_migration_inbound_addr`` respectively.
Sep 30 21:09:21 compute-1 nova_compute[191855]: ).  Its value may be silently ignored in the future.
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.773 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.808 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.809 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.809 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.809 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Sep 30 21:09:21 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 21:09:21 compute-1 systemd[1]: Started libvirt QEMU daemon.
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.877 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f07f9d84790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.881 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f07f9d84790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.882 2 INFO nova.virt.libvirt.driver [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Connection event '1' reason 'None'
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.924 2 WARNING nova.virt.libvirt.driver [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Sep 30 21:09:21 compute-1 nova_compute[191855]: 2025-09-30 21:09:21.926 2 DEBUG nova.virt.libvirt.volume.mount [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 21:09:22 compute-1 python3.9[192373]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.698 2 INFO nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Libvirt host capabilities <capabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]: 
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <host>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <uuid>1cca553b-9dfa-4866-8e03-ea5bf0299bf0</uuid>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <arch>x86_64</arch>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <microcode version='16777317'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <signature family='23' model='49' stepping='0'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <maxphysaddr mode='emulate' bits='40'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='x2apic'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='tsc-deadline'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='osxsave'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='hypervisor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='tsc_adjust'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='spec-ctrl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='stibp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='arch-capabilities'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='cmp_legacy'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='topoext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='virt-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='lbrv'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='tsc-scale'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='vmcb-clean'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='pause-filter'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='pfthreshold'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='svme-addr-chk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='rdctl-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='mds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature name='pschange-mc-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <pages unit='KiB' size='4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <pages unit='KiB' size='2048'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <pages unit='KiB' size='1048576'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <power_management>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <suspend_mem/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <suspend_disk/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <suspend_hybrid/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </power_management>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <iommu support='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <migration_features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <live/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <uri_transports>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <uri_transport>tcp</uri_transport>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <uri_transport>rdma</uri_transport>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </uri_transports>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </migration_features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <topology>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <cells num='1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <cell id='0'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           <memory unit='KiB'>7864116</memory>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           <pages unit='KiB' size='4'>1966029</pages>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           <pages unit='KiB' size='2048'>0</pages>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           <pages unit='KiB' size='1048576'>0</pages>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           <distances>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <sibling id='0' value='10'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           </distances>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           <cpus num='8'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:           </cpus>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         </cell>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </cells>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </topology>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <cache>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </cache>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <secmodel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model>selinux</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <doi>0</doi>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </secmodel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <secmodel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model>dac</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <doi>0</doi>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <baselabel type='kvm'>+107:+107</baselabel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <baselabel type='qemu'>+107:+107</baselabel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </secmodel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </host>
Sep 30 21:09:22 compute-1 nova_compute[191855]: 
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <guest>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <os_type>hvm</os_type>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <arch name='i686'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <wordsize>32</wordsize>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <domain type='qemu'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <domain type='kvm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </arch>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <pae/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <nonpae/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <apic default='on' toggle='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <cpuselection/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <deviceboot/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <externalSnapshot/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </guest>
Sep 30 21:09:22 compute-1 nova_compute[191855]: 
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <guest>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <os_type>hvm</os_type>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <arch name='x86_64'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <wordsize>64</wordsize>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <domain type='qemu'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <domain type='kvm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </arch>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <apic default='on' toggle='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <cpuselection/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <deviceboot/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <externalSnapshot/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </guest>
Sep 30 21:09:22 compute-1 nova_compute[191855]: 
Sep 30 21:09:22 compute-1 nova_compute[191855]: </capabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]: 
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.706 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.725 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Sep 30 21:09:22 compute-1 nova_compute[191855]: <domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <arch>i686</arch>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <vcpu max='240'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <os supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='firmware'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <loader supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>rom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pflash</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='readonly'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>yes</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='secure'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </loader>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </os>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <memoryBacking supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='sourceType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>file</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>anonymous</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>memfd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </memoryBacking>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <disk supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='diskDevice'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>disk</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cdrom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>floppy</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>lun</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ide</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>fdc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>sata</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </disk>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <graphics supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vnc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egl-headless</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>dbus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </graphics>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <video supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='modelType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vga</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cirrus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>none</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>bochs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ramfb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </video>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hostdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='mode'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>subsystem</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='startupPolicy'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>mandatory</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>requisite</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>optional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='subsysType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pci</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='capsType'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='pciBackend'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hostdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <rng supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>random</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </rng>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <filesystem supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='driverType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>path</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>handle</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtiofs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </filesystem>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <tpm supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-tis</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-crb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emulator</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>external</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendVersion'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>2.0</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </tpm>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <redirdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </redirdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <channel supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pty</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>unix</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </channel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <crypto supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>qemu</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </crypto>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <interface supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>passt</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </interface>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <panic supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>isa</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>hyperv</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </panic>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <gic supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <genid supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backup supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <async-teardown supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <ps2 supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sev supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sgx supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hyperv supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='features'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>relaxed</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vapic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>spinlocks</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vpindex</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>runtime</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>synic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>stimer</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reset</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vendor_id</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>frequencies</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reenlightenment</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tlbflush</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ipi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>avic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emsr_bitmap</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>xmm_input</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hyperv>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <launchSecurity supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </features>
Sep 30 21:09:22 compute-1 nova_compute[191855]: </domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.730 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Sep 30 21:09:22 compute-1 nova_compute[191855]: <domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <arch>i686</arch>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <vcpu max='4096'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <os supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='firmware'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <loader supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>rom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pflash</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='readonly'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>yes</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='secure'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </loader>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </os>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <memoryBacking supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='sourceType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>file</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>anonymous</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>memfd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </memoryBacking>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <disk supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='diskDevice'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>disk</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cdrom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>floppy</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>lun</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>fdc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>sata</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </disk>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <graphics supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vnc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egl-headless</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>dbus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </graphics>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <video supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='modelType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vga</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cirrus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>none</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>bochs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ramfb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </video>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hostdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='mode'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>subsystem</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='startupPolicy'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>mandatory</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>requisite</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>optional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='subsysType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pci</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='capsType'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='pciBackend'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hostdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <rng supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>random</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </rng>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <filesystem supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='driverType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>path</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>handle</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtiofs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </filesystem>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <tpm supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-tis</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-crb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emulator</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>external</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendVersion'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>2.0</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </tpm>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <redirdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </redirdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <channel supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pty</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>unix</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </channel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <crypto supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>qemu</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </crypto>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <interface supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>passt</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </interface>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <panic supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>isa</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>hyperv</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </panic>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <gic supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <genid supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backup supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <async-teardown supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <ps2 supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sev supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sgx supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hyperv supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='features'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>relaxed</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vapic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>spinlocks</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vpindex</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>runtime</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>synic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>stimer</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reset</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vendor_id</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>frequencies</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reenlightenment</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tlbflush</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ipi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>avic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emsr_bitmap</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>xmm_input</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hyperv>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <launchSecurity supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </features>
Sep 30 21:09:22 compute-1 nova_compute[191855]: </domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.780 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.784 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Sep 30 21:09:22 compute-1 nova_compute[191855]: <domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <arch>x86_64</arch>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <vcpu max='240'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <os supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='firmware'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <loader supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>rom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pflash</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='readonly'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>yes</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='secure'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </loader>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </os>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <memoryBacking supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='sourceType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>file</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>anonymous</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>memfd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </memoryBacking>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <disk supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='diskDevice'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>disk</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cdrom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>floppy</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>lun</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ide</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>fdc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>sata</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </disk>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <graphics supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vnc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egl-headless</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>dbus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </graphics>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <video supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='modelType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vga</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cirrus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>none</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>bochs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ramfb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </video>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hostdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='mode'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>subsystem</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='startupPolicy'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>mandatory</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>requisite</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>optional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='subsysType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pci</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='capsType'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='pciBackend'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hostdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <rng supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>random</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </rng>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <filesystem supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='driverType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>path</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>handle</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtiofs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </filesystem>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <tpm supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-tis</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-crb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emulator</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>external</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendVersion'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>2.0</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </tpm>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <redirdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </redirdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <channel supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pty</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>unix</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </channel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <crypto supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>qemu</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </crypto>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <interface supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>passt</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </interface>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <panic supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>isa</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>hyperv</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </panic>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <gic supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <genid supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backup supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <async-teardown supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <ps2 supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sev supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sgx supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hyperv supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='features'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>relaxed</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vapic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>spinlocks</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vpindex</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>runtime</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>synic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>stimer</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reset</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vendor_id</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>frequencies</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reenlightenment</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tlbflush</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ipi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>avic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emsr_bitmap</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>xmm_input</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hyperv>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <launchSecurity supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </features>
Sep 30 21:09:22 compute-1 nova_compute[191855]: </domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.853 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Sep 30 21:09:22 compute-1 nova_compute[191855]: <domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <arch>x86_64</arch>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <vcpu max='4096'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <os supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='firmware'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>efi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <loader supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>rom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pflash</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='readonly'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>yes</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='secure'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>yes</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>no</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </loader>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </os>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>on</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>off</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xop'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='la57'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='hle'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='pku'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='erms'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='athlon-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='core2duo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='coreduo-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='n270-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='ss'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <blockers model='phenom-v1'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </blockers>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </mode>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <memoryBacking supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <enum name='sourceType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>file</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>anonymous</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <value>memfd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </memoryBacking>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <disk supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='diskDevice'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>disk</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cdrom</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>floppy</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>lun</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>fdc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>sata</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </disk>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <graphics supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vnc</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egl-headless</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>dbus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </graphics>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <video supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='modelType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vga</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>cirrus</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>none</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>bochs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ramfb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </video>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hostdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='mode'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>subsystem</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='startupPolicy'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>mandatory</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>requisite</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>optional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='subsysType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pci</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>scsi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='capsType'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='pciBackend'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hostdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <rng supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>random</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>egd</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </rng>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <filesystem supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='driverType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>path</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>handle</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>virtiofs</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </filesystem>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <tpm supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-tis</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tpm-crb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emulator</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>external</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendVersion'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>2.0</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </tpm>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <redirdev supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='bus'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>usb</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </redirdev>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <channel supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>pty</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>unix</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </channel>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <crypto supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='type'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>qemu</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>builtin</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </crypto>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <interface supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='backendType'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>default</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>passt</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </interface>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <panic supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='model'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>isa</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>hyperv</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </panic>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </devices>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <features>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <gic supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <genid supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <backup supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <async-teardown supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <ps2 supported='yes'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sev supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <sgx supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <hyperv supported='yes'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       <enum name='features'>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>relaxed</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vapic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>spinlocks</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vpindex</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>runtime</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>synic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>stimer</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reset</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>vendor_id</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>frequencies</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>reenlightenment</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>tlbflush</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>ipi</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>avic</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>emsr_bitmap</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:         <value>xmm_input</value>
Sep 30 21:09:22 compute-1 nova_compute[191855]:       </enum>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     </hyperv>
Sep 30 21:09:22 compute-1 nova_compute[191855]:     <launchSecurity supported='no'/>
Sep 30 21:09:22 compute-1 nova_compute[191855]:   </features>
Sep 30 21:09:22 compute-1 nova_compute[191855]: </domainCapabilities>
Sep 30 21:09:22 compute-1 nova_compute[191855]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.913 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.914 2 DEBUG nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.914 2 INFO nova.virt.libvirt.host [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Secure Boot support detected
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.915 2 INFO nova.virt.libvirt.driver [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.926 2 DEBUG nova.virt.libvirt.driver [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] cpu compare xml: <cpu match="exact">
Sep 30 21:09:22 compute-1 nova_compute[191855]:   <model>Nehalem</model>
Sep 30 21:09:22 compute-1 nova_compute[191855]: </cpu>
Sep 30 21:09:22 compute-1 nova_compute[191855]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.929 2 DEBUG nova.virt.libvirt.driver [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Sep 30 21:09:22 compute-1 nova_compute[191855]: 2025-09-30 21:09:22.963 2 INFO nova.virt.node [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Determined node identity e551d5b4-e9f6-409e-b2a1-508a20c11333 from /var/lib/nova/compute_id
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.009 2 WARNING nova.compute.manager [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Compute nodes ['e551d5b4-e9f6-409e-b2a1-508a20c11333'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.054 2 INFO nova.compute.manager [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.156 2 WARNING nova.compute.manager [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.156 2 DEBUG oslo_concurrency.lockutils [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.156 2 DEBUG oslo_concurrency.lockutils [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.156 2 DEBUG oslo_concurrency.lockutils [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.157 2 DEBUG nova.compute.resource_tracker [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:09:23 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 21:09:23 compute-1 sudo[192535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rubetsjvrxpdtbeebssobvfwtxscsnyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266562.9169178-5195-16311410215223/AnsiballZ_podman_container.py'
Sep 30 21:09:23 compute-1 sudo[192535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:23 compute-1 systemd[1]: Started libvirt nodedev daemon.
Sep 30 21:09:23 compute-1 python3.9[192538]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.450 2 WARNING nova.virt.libvirt.driver [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.451 2 DEBUG nova.compute.resource_tracker [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6186MB free_disk=73.67033767700195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:09:23 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.451 2 DEBUG oslo_concurrency.lockutils [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.452 2 DEBUG oslo_concurrency.lockutils [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.465 2 WARNING nova.compute.resource_tracker [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] No compute node record for compute-1.ctlplane.example.com:e551d5b4-e9f6-409e-b2a1-508a20c11333: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host e551d5b4-e9f6-409e-b2a1-508a20c11333 could not be found.
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.492 2 INFO nova.compute.resource_tracker [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: e551d5b4-e9f6-409e-b2a1-508a20c11333
Sep 30 21:09:23 compute-1 sudo[192535]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.597 2 DEBUG nova.compute.resource_tracker [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:09:23 compute-1 nova_compute[191855]: 2025-09-30 21:09:23.597 2 DEBUG nova.compute.resource_tracker [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:09:24 compute-1 sudo[192731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpflorwpmhtffsduknpdtfpynmgabbnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266563.8928478-5219-115803145264122/AnsiballZ_systemd.py'
Sep 30 21:09:24 compute-1 sudo[192731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:24 compute-1 python3.9[192733]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:09:24 compute-1 systemd[1]: Stopping nova_compute container...
Sep 30 21:09:24 compute-1 nova_compute[191855]: 2025-09-30 21:09:24.707 2 DEBUG oslo_concurrency.lockutils [None req-26e5302b-7b02-4397-8080-608a16580faf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:24 compute-1 nova_compute[191855]: 2025-09-30 21:09:24.708 2 DEBUG oslo_concurrency.lockutils [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:09:24 compute-1 nova_compute[191855]: 2025-09-30 21:09:24.708 2 DEBUG oslo_concurrency.lockutils [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:09:24 compute-1 nova_compute[191855]: 2025-09-30 21:09:24.708 2 DEBUG oslo_concurrency.lockutils [None req-00ec43f6-4c67-4f2f-9832-bb6a19a7426e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:09:25 compute-1 virtqemud[192217]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Sep 30 21:09:25 compute-1 virtqemud[192217]: hostname: compute-1
Sep 30 21:09:25 compute-1 virtqemud[192217]: End of file while reading data: Input/output error
Sep 30 21:09:25 compute-1 systemd[1]: libpod-e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c.scope: Deactivated successfully.
Sep 30 21:09:25 compute-1 podman[192737]: 2025-09-30 21:09:25.116662758 +0000 UTC m=+0.478580841 container died e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute)
Sep 30 21:09:25 compute-1 systemd[1]: libpod-e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c.scope: Consumed 3.135s CPU time.
Sep 30 21:09:25 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:09:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc-merged.mount: Deactivated successfully.
Sep 30 21:09:25 compute-1 podman[192737]: 2025-09-30 21:09:25.208079895 +0000 UTC m=+0.569997988 container cleanup e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:09:25 compute-1 podman[192737]: nova_compute
Sep 30 21:09:25 compute-1 podman[192766]: nova_compute
Sep 30 21:09:25 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Sep 30 21:09:25 compute-1 systemd[1]: Stopped nova_compute container.
Sep 30 21:09:25 compute-1 systemd[1]: Starting nova_compute container...
Sep 30 21:09:25 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:09:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/679d02d7035a81ac2165568bd580463e046b482411e6c34730e885dbe67564fc/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-1 podman[192779]: 2025-09-30 21:09:25.458496408 +0000 UTC m=+0.124599176 container init e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:09:25 compute-1 podman[192779]: 2025-09-30 21:09:25.465448644 +0000 UTC m=+0.131551362 container start e26f20e64de19ff50b42a4e750f6c7388abbe13affcce931b8e6a21c9dd8008c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:09:25 compute-1 nova_compute[192795]: + sudo -E kolla_set_configs
Sep 30 21:09:25 compute-1 podman[192779]: nova_compute
Sep 30 21:09:25 compute-1 systemd[1]: Started nova_compute container.
Sep 30 21:09:25 compute-1 sudo[192731]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Validating config file
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying service configuration files
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /etc/ceph
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Creating directory /etc/ceph
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Writing out command to execute
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-1 nova_compute[192795]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-1 nova_compute[192795]: ++ cat /run_command
Sep 30 21:09:25 compute-1 nova_compute[192795]: + CMD=nova-compute
Sep 30 21:09:25 compute-1 nova_compute[192795]: + ARGS=
Sep 30 21:09:25 compute-1 nova_compute[192795]: + sudo kolla_copy_cacerts
Sep 30 21:09:25 compute-1 nova_compute[192795]: + [[ ! -n '' ]]
Sep 30 21:09:25 compute-1 nova_compute[192795]: + . kolla_extend_start
Sep 30 21:09:25 compute-1 nova_compute[192795]: Running command: 'nova-compute'
Sep 30 21:09:25 compute-1 nova_compute[192795]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 21:09:25 compute-1 nova_compute[192795]: + umask 0022
Sep 30 21:09:25 compute-1 nova_compute[192795]: + exec nova-compute
Sep 30 21:09:26 compute-1 sudo[192956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuooqfgyuwniovreckfebbuydmextbax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266565.85781-5246-113697302573047/AnsiballZ_podman_container.py'
Sep 30 21:09:26 compute-1 sudo[192956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:26 compute-1 python3.9[192958]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 21:09:26 compute-1 systemd[1]: Started libpod-conmon-e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329.scope.
Sep 30 21:09:26 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:09:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1831a61e2cf5e398ea94563ff46fe630802347892b551ad4b82f3982e3b3fe4d/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1831a61e2cf5e398ea94563ff46fe630802347892b551ad4b82f3982e3b3fe4d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1831a61e2cf5e398ea94563ff46fe630802347892b551ad4b82f3982e3b3fe4d/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:26 compute-1 podman[192984]: 2025-09-30 21:09:26.813497719 +0000 UTC m=+0.137703168 container init e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Sep 30 21:09:26 compute-1 podman[192984]: 2025-09-30 21:09:26.82402148 +0000 UTC m=+0.148226919 container start e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:09:26 compute-1 python3.9[192958]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Applying nova statedir ownership
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Sep 30 21:09:26 compute-1 nova_compute_init[193004]: INFO:nova_statedir:Nova statedir ownership complete
Sep 30 21:09:26 compute-1 systemd[1]: libpod-e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329.scope: Deactivated successfully.
Sep 30 21:09:26 compute-1 conmon[192998]: conmon e0f36e6c68726fb936d3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329.scope/container/memory.events
Sep 30 21:09:26 compute-1 podman[193018]: 2025-09-30 21:09:26.922178687 +0000 UTC m=+0.025411171 container died e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:26 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329-userdata-shm.mount: Deactivated successfully.
Sep 30 21:09:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-1831a61e2cf5e398ea94563ff46fe630802347892b551ad4b82f3982e3b3fe4d-merged.mount: Deactivated successfully.
Sep 30 21:09:26 compute-1 podman[193018]: 2025-09-30 21:09:26.965382324 +0000 UTC m=+0.068614788 container cleanup e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:26 compute-1 systemd[1]: libpod-conmon-e0f36e6c68726fb936d3ab99f28c3aa2d25250205d57cb5e06ab0a9ccf62b329.scope: Deactivated successfully.
Sep 30 21:09:26 compute-1 sudo[192956]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:27 compute-1 sshd-session[158351]: Connection closed by 192.168.122.30 port 42280
Sep 30 21:09:27 compute-1 sshd-session[158348]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:09:27 compute-1 nova_compute[192795]: 2025-09-30 21:09:27.530 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:27 compute-1 nova_compute[192795]: 2025-09-30 21:09:27.530 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:27 compute-1 nova_compute[192795]: 2025-09-30 21:09:27.530 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:27 compute-1 nova_compute[192795]: 2025-09-30 21:09:27.530 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 21:09:27 compute-1 systemd-logind[793]: Session 25 logged out. Waiting for processes to exit.
Sep 30 21:09:27 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Sep 30 21:09:27 compute-1 systemd[1]: session-25.scope: Consumed 2min 16.104s CPU time.
Sep 30 21:09:27 compute-1 systemd-logind[793]: Removed session 25.
Sep 30 21:09:27 compute-1 nova_compute[192795]: 2025-09-30 21:09:27.671 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:09:27 compute-1 nova_compute[192795]: 2025-09-30 21:09:27.683 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.118 2 INFO nova.virt.driver [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.210 2 INFO nova.compute.provider_config [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.220 2 DEBUG oslo_concurrency.lockutils [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.220 2 DEBUG oslo_concurrency.lockutils [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.220 2 DEBUG oslo_concurrency.lockutils [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.220 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.221 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.221 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.221 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.221 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.221 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.221 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.221 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.222 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.222 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.222 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.222 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.222 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.222 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.222 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.223 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.223 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.223 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.223 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.223 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.223 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.223 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.224 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.224 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.224 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.224 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.224 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.224 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.225 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.225 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.225 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.225 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.225 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.226 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.226 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.226 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.226 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.226 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.226 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.226 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.227 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.227 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.227 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.227 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.227 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.227 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.228 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.228 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.228 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.228 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.228 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.228 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.229 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.229 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.229 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.229 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.229 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.229 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.229 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.230 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.231 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.231 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.231 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.231 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.231 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.231 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.231 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.232 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.232 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.232 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.232 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.232 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.232 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.232 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.233 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.233 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.233 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.233 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.233 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.233 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.233 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.234 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.234 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.234 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.234 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.234 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.234 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.234 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.235 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.236 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.236 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.236 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.236 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.236 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.236 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.236 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.237 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.237 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.237 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.237 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.237 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.237 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.237 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.238 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.238 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.238 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.238 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.238 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.238 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.238 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.239 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.239 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.239 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.239 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.239 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.239 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.239 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.240 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.241 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.241 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.241 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.241 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.241 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.241 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.242 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.242 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.242 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.242 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.242 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.242 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.243 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.243 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.243 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.243 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.243 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.243 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.243 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.244 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.244 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.244 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.244 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.244 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.244 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.244 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.245 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.245 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.245 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.245 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.245 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.245 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.245 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.246 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.246 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.246 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.246 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.246 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.246 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.247 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.247 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.247 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.247 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.247 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.247 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.247 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.248 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.249 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.249 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.249 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.249 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.249 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.249 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.249 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.250 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.250 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.250 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.250 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.250 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.250 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.250 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.251 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.251 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.251 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.251 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.251 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.251 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.251 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.252 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.252 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.252 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.252 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.252 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.252 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.253 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.253 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.253 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.253 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.253 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.253 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.253 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.254 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.254 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.254 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.254 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.254 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.254 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.254 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.255 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.255 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.255 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.255 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.255 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.255 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.256 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.257 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.257 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.257 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.257 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.257 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.257 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.257 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.258 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.258 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.258 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.258 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.258 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.258 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.258 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.259 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.259 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.259 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.259 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.259 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.259 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.259 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.260 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.260 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.260 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.260 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.260 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.260 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.260 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.261 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.261 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.261 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.261 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.261 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.261 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.262 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.262 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.262 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.262 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.262 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.262 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.262 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.263 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.263 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.263 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.263 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.263 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.263 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.264 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.265 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.265 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.265 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.265 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.265 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.265 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.265 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.266 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.266 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.266 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.266 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.266 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.266 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.266 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.267 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.267 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.267 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.267 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.267 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.267 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.267 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.268 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.269 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.269 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.269 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.269 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.269 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.269 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.270 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.270 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.270 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.270 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.270 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.270 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.270 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.271 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.271 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.271 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.271 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.271 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.271 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.271 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.272 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.272 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.272 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.272 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.272 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.273 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.273 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.273 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.273 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.273 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.273 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.273 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.274 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.274 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.274 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.274 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.274 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.274 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.275 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.275 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.275 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.275 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.275 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.275 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.275 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.276 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.276 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.276 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.276 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.276 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.276 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.277 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.277 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.277 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.277 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.277 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.278 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.278 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.278 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.278 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.278 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.279 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.279 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.279 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.279 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.279 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.280 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.280 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.280 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.280 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.280 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.280 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.281 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.281 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.281 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.281 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.281 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.281 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.281 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.282 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.282 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.282 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.282 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.282 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.282 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.282 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.283 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.284 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.284 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.284 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.284 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.284 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.284 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.284 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.285 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.285 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.285 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.285 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.285 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.285 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.286 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.286 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.286 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.286 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.286 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.286 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.287 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.287 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.287 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.287 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.287 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.288 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.288 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.288 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.288 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.288 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.289 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.289 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.289 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.289 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.289 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.290 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.290 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.290 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.290 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.290 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.290 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.291 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.291 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.291 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.291 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.291 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.292 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.292 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.292 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.292 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.292 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.293 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.293 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.293 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.293 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.293 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.293 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.293 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.294 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.294 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.294 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.294 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.294 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.295 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.295 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.295 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.295 2 WARNING oslo_config.cfg [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 21:09:28 compute-1 nova_compute[192795]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 21:09:28 compute-1 nova_compute[192795]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 21:09:28 compute-1 nova_compute[192795]: and ``live_migration_inbound_addr`` respectively.
Sep 30 21:09:28 compute-1 nova_compute[192795]: ).  Its value may be silently ignored in the future.
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.296 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.296 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.296 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.296 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.296 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.296 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.297 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.297 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.297 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.297 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.297 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.297 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.297 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.298 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.298 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.298 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.298 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.298 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.298 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.298 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.299 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.299 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.299 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.299 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.299 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.299 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.299 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.300 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.300 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.300 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.300 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.300 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.301 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.301 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.301 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.301 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.301 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.301 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.301 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.302 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.303 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.303 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.303 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.303 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.303 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.303 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.303 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.304 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.304 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.304 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.304 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.304 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.304 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.304 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.305 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.305 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.305 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.305 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.305 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.305 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.305 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.306 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.306 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.306 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.306 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.306 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.306 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.306 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.307 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.307 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.307 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.307 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.307 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.307 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.307 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.308 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.308 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.308 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.308 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.308 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.308 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.309 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.309 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.309 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.309 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.309 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.310 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.310 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.310 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.310 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.310 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.310 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.310 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.311 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.311 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.311 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.311 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.311 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.311 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.311 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.312 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.312 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.312 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.312 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.312 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.312 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.313 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.313 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.313 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.313 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.313 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.313 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.314 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.314 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.314 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.314 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.314 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.315 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.315 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.315 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.315 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.315 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.316 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.316 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.316 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.316 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.316 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.317 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.317 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.317 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.317 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.317 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.318 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.318 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.318 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.318 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.318 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.319 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.319 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.319 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.319 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.319 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.319 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.319 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.320 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.320 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.320 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.320 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.320 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.320 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.320 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.321 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.321 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.321 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.321 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.321 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.321 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.322 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.322 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.322 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.322 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.322 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.322 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.322 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.323 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.323 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.323 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.323 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.323 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.323 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.323 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.324 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.324 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.324 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.324 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.324 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.324 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.325 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.325 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.325 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.325 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.325 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.325 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.325 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.326 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.326 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.326 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.326 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.326 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.326 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.327 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.328 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.328 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.328 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.328 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.328 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.328 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.328 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.329 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.329 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.329 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.329 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.329 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.329 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.329 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.330 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.330 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.330 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.330 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.330 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.330 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.330 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.331 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.332 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.332 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.332 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.332 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.332 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.332 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.332 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.333 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.333 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.333 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.333 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.333 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.333 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.333 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.334 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.334 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.334 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.334 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.334 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.334 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.335 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.336 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.336 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.336 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.336 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.336 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.336 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.336 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.337 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.337 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.337 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.337 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.337 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.337 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.337 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.338 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.338 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.338 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.338 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.338 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.338 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.338 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.339 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.339 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.339 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.339 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.339 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.339 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.339 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.340 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.340 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.340 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.340 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.340 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.340 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.340 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.341 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.341 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.341 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.341 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.341 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.341 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.341 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.342 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.342 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.342 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.342 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.342 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.342 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.342 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.343 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.343 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.343 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.343 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.343 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.343 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.343 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.344 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.344 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.344 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.344 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.344 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.344 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.344 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.345 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.345 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.345 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.345 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.345 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.345 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.345 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.346 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.346 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.346 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.346 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.346 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.346 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.347 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.347 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.347 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.347 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.347 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.347 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.347 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.348 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.348 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.348 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.348 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.348 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.348 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.348 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.349 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.350 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.350 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.350 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.350 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.350 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.350 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.350 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.351 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.351 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.351 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.351 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.351 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.351 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.351 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.352 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.353 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.353 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.353 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.353 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.353 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.353 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.353 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.354 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.354 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.354 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.354 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.354 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.354 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.354 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.355 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.355 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.355 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.355 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.355 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.355 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.355 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.356 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.356 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.356 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.356 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.356 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.356 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.356 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.357 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.357 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.357 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.357 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.357 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.357 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.357 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.358 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.358 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.358 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.358 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.358 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.358 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.358 2 DEBUG oslo_service.service [None req-3ad675c7-2eca-4025-a105-d6116c7a982b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.359 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.371 2 INFO nova.virt.node [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Determined node identity e551d5b4-e9f6-409e-b2a1-508a20c11333 from /var/lib/nova/compute_id
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.372 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.372 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.372 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.373 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.385 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9e8f942250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.388 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9e8f942250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.388 2 INFO nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Connection event '1' reason 'None'
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.397 2 INFO nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Libvirt host capabilities <capabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]: 
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <host>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <uuid>1cca553b-9dfa-4866-8e03-ea5bf0299bf0</uuid>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <arch>x86_64</arch>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <microcode version='16777317'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <signature family='23' model='49' stepping='0'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <maxphysaddr mode='emulate' bits='40'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='x2apic'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='tsc-deadline'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='osxsave'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='hypervisor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='tsc_adjust'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='spec-ctrl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='stibp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='arch-capabilities'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='cmp_legacy'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='topoext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='virt-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='lbrv'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='tsc-scale'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='vmcb-clean'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='pause-filter'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='pfthreshold'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='svme-addr-chk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='rdctl-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='mds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature name='pschange-mc-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <pages unit='KiB' size='4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <pages unit='KiB' size='2048'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <pages unit='KiB' size='1048576'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <power_management>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <suspend_mem/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <suspend_disk/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <suspend_hybrid/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </power_management>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <iommu support='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <migration_features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <live/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <uri_transports>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <uri_transport>tcp</uri_transport>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <uri_transport>rdma</uri_transport>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </uri_transports>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </migration_features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <topology>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <cells num='1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <cell id='0'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           <memory unit='KiB'>7864116</memory>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           <pages unit='KiB' size='4'>1966029</pages>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           <pages unit='KiB' size='2048'>0</pages>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           <pages unit='KiB' size='1048576'>0</pages>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           <distances>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <sibling id='0' value='10'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           </distances>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           <cpus num='8'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:           </cpus>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         </cell>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </cells>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </topology>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <cache>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </cache>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <secmodel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model>selinux</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <doi>0</doi>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </secmodel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <secmodel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model>dac</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <doi>0</doi>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <baselabel type='kvm'>+107:+107</baselabel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <baselabel type='qemu'>+107:+107</baselabel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </secmodel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </host>
Sep 30 21:09:28 compute-1 nova_compute[192795]: 
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <guest>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <os_type>hvm</os_type>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <arch name='i686'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <wordsize>32</wordsize>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <domain type='qemu'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <domain type='kvm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </arch>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <pae/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <nonpae/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <apic default='on' toggle='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <cpuselection/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <deviceboot/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <externalSnapshot/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </guest>
Sep 30 21:09:28 compute-1 nova_compute[192795]: 
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <guest>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <os_type>hvm</os_type>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <arch name='x86_64'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <wordsize>64</wordsize>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <domain type='qemu'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <domain type='kvm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </arch>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <apic default='on' toggle='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <cpuselection/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <deviceboot/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <externalSnapshot/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </guest>
Sep 30 21:09:28 compute-1 nova_compute[192795]: 
Sep 30 21:09:28 compute-1 nova_compute[192795]: </capabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]: 
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.401 2 DEBUG nova.virt.libvirt.volume.mount [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.403 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.409 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Sep 30 21:09:28 compute-1 nova_compute[192795]: <domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <arch>i686</arch>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <vcpu max='4096'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <os supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='firmware'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <loader supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>rom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pflash</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='readonly'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>yes</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='secure'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </loader>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </os>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>file</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>anonymous</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>memfd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </memoryBacking>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <disk supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>disk</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cdrom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>floppy</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>lun</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>fdc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>sata</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vnc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>dbus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </graphics>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <video supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='modelType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vga</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cirrus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>none</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>bochs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ramfb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </video>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='mode'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>subsystem</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>mandatory</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>requisite</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>optional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pci</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hostdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <rng supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>random</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='driverType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>path</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>handle</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </filesystem>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emulator</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>external</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>2.0</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </tpm>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </redirdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <channel supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pty</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>unix</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </channel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>qemu</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </crypto>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <interface supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>passt</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <panic supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>isa</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>hyperv</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </panic>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <gic supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sev supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='features'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>relaxed</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vapic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vpindex</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>runtime</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>synic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>stimer</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reset</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>frequencies</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ipi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>avic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hyperv>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </features>
Sep 30 21:09:28 compute-1 nova_compute[192795]: </domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.416 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Sep 30 21:09:28 compute-1 nova_compute[192795]: <domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <arch>i686</arch>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <vcpu max='240'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <os supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='firmware'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <loader supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>rom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pflash</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='readonly'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>yes</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='secure'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </loader>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </os>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>file</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>anonymous</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>memfd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </memoryBacking>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <disk supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>disk</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cdrom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>floppy</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>lun</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ide</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>fdc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>sata</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vnc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>dbus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </graphics>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <video supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='modelType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vga</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cirrus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>none</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>bochs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ramfb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </video>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='mode'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>subsystem</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>mandatory</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>requisite</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>optional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pci</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hostdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <rng supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>random</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='driverType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>path</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>handle</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </filesystem>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emulator</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>external</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>2.0</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </tpm>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </redirdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <channel supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pty</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>unix</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </channel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>qemu</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </crypto>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <interface supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>passt</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <panic supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>isa</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>hyperv</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </panic>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <gic supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sev supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='features'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>relaxed</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vapic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vpindex</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>runtime</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>synic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>stimer</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reset</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>frequencies</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ipi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>avic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hyperv>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </features>
Sep 30 21:09:28 compute-1 nova_compute[192795]: </domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.458 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.461 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Sep 30 21:09:28 compute-1 nova_compute[192795]: <domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <arch>x86_64</arch>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <vcpu max='4096'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <os supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='firmware'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>efi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <loader supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>rom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pflash</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='readonly'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>yes</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='secure'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>yes</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </loader>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </os>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>file</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>anonymous</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>memfd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </memoryBacking>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <disk supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>disk</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cdrom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>floppy</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>lun</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>fdc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>sata</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vnc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>dbus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </graphics>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <video supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='modelType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vga</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cirrus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>none</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>bochs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ramfb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </video>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='mode'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>subsystem</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>mandatory</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>requisite</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>optional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pci</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hostdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <rng supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>random</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='driverType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>path</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>handle</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </filesystem>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emulator</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>external</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>2.0</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </tpm>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </redirdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <channel supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pty</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>unix</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </channel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>qemu</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </crypto>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <interface supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>passt</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <panic supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>isa</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>hyperv</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </panic>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <gic supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sev supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='features'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>relaxed</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vapic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vpindex</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>runtime</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>synic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>stimer</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reset</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>frequencies</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ipi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>avic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hyperv>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </features>
Sep 30 21:09:28 compute-1 nova_compute[192795]: </domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.539 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Sep 30 21:09:28 compute-1 nova_compute[192795]: <domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <arch>x86_64</arch>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <vcpu max='240'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <os supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='firmware'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <loader supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>rom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pflash</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='readonly'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>yes</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='secure'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>no</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </loader>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </os>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>on</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>off</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xop'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='la57'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='hle'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='pku'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='erms'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='ss'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </blockers>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </mode>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>file</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>anonymous</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <value>memfd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </memoryBacking>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <disk supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>disk</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cdrom</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>floppy</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>lun</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ide</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>fdc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>sata</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vnc</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>dbus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </graphics>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <video supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='modelType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vga</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>cirrus</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>none</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>bochs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ramfb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </video>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='mode'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>subsystem</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>mandatory</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>requisite</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>optional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pci</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>scsi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hostdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <rng supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>random</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>egd</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='driverType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>path</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>handle</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </filesystem>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emulator</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>external</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>2.0</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </tpm>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='bus'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>usb</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </redirdev>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <channel supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>pty</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>unix</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </channel>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='type'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>qemu</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>builtin</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </crypto>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <interface supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='backendType'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>default</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>passt</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <panic supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='model'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>isa</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>hyperv</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </panic>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <features>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <gic supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sev supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       <enum name='features'>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>relaxed</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vapic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vpindex</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>runtime</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>synic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>stimer</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reset</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>frequencies</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>ipi</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>avic</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-1 nova_compute[192795]:       </enum>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     </hyperv>
Sep 30 21:09:28 compute-1 nova_compute[192795]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-1 nova_compute[192795]:   </features>
Sep 30 21:09:28 compute-1 nova_compute[192795]: </domainCapabilities>
Sep 30 21:09:28 compute-1 nova_compute[192795]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.608 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.609 2 INFO nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Secure Boot support detected
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.611 2 INFO nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.622 2 DEBUG nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] cpu compare xml: <cpu match="exact">
Sep 30 21:09:28 compute-1 nova_compute[192795]:   <model>Nehalem</model>
Sep 30 21:09:28 compute-1 nova_compute[192795]: </cpu>
Sep 30 21:09:28 compute-1 nova_compute[192795]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.625 2 DEBUG nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.662 2 INFO nova.virt.node [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Determined node identity e551d5b4-e9f6-409e-b2a1-508a20c11333 from /var/lib/nova/compute_id
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.719 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Verified node e551d5b4-e9f6-409e-b2a1-508a20c11333 matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.746 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.849 2 DEBUG oslo_concurrency.lockutils [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.849 2 DEBUG oslo_concurrency.lockutils [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.849 2 DEBUG oslo_concurrency.lockutils [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:28 compute-1 nova_compute[192795]: 2025-09-30 21:09:28.849 2 DEBUG nova.compute.resource_tracker [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.030 2 WARNING nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.031 2 DEBUG nova.compute.resource_tracker [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6155MB free_disk=73.66912460327148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.031 2 DEBUG oslo_concurrency.lockutils [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.032 2 DEBUG oslo_concurrency.lockutils [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.179 2 DEBUG nova.compute.resource_tracker [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.179 2 DEBUG nova.compute.resource_tracker [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.243 2 DEBUG nova.scheduler.client.report [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.254 2 DEBUG nova.scheduler.client.report [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.254 2 DEBUG nova.compute.provider_tree [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.268 2 DEBUG nova.scheduler.client.report [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.309 2 DEBUG nova.scheduler.client.report [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.336 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Sep 30 21:09:29 compute-1 nova_compute[192795]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.336 2 INFO nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] kernel doesn't support AMD SEV
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.337 2 DEBUG nova.compute.provider_tree [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.337 2 DEBUG nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.339 2 DEBUG nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Libvirt baseline CPU <cpu>
Sep 30 21:09:29 compute-1 nova_compute[192795]:   <arch>x86_64</arch>
Sep 30 21:09:29 compute-1 nova_compute[192795]:   <model>Nehalem</model>
Sep 30 21:09:29 compute-1 nova_compute[192795]:   <vendor>AMD</vendor>
Sep 30 21:09:29 compute-1 nova_compute[192795]:   <topology sockets="8" cores="1" threads="1"/>
Sep 30 21:09:29 compute-1 nova_compute[192795]: </cpu>
Sep 30 21:09:29 compute-1 nova_compute[192795]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.397 2 DEBUG nova.scheduler.client.report [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Updated inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.397 2 DEBUG nova.compute.provider_tree [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Updating resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.397 2 DEBUG nova.compute.provider_tree [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.498 2 DEBUG nova.compute.provider_tree [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Updating resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.536 2 DEBUG nova.compute.resource_tracker [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.536 2 DEBUG oslo_concurrency.lockutils [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.536 2 DEBUG nova.service [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.602 2 DEBUG nova.service [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Sep 30 21:09:29 compute-1 nova_compute[192795]: 2025-09-30 21:09:29.602 2 DEBUG nova.servicegroup.drivers.db [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Sep 30 21:09:31 compute-1 podman[193096]: 2025-09-30 21:09:31.22189309 +0000 UTC m=+0.067224820 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:09:33 compute-1 sshd-session[193116]: Accepted publickey for zuul from 192.168.122.30 port 38168 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:09:33 compute-1 systemd-logind[793]: New session 28 of user zuul.
Sep 30 21:09:33 compute-1 systemd[1]: Started Session 28 of User zuul.
Sep 30 21:09:33 compute-1 sshd-session[193116]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:09:35 compute-1 python3.9[193269]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:09:35 compute-1 podman[193274]: 2025-09-30 21:09:35.258637605 +0000 UTC m=+0.089284421 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:09:36 compute-1 sudo[193443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsyxkzycsksmwepmeogeqjiibgyymyse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266575.7665374-74-196305736554877/AnsiballZ_systemd_service.py'
Sep 30 21:09:36 compute-1 sudo[193443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:36 compute-1 python3.9[193445]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:09:36 compute-1 systemd[1]: Reloading.
Sep 30 21:09:36 compute-1 systemd-rc-local-generator[193470]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:36 compute-1 systemd-sysv-generator[193475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:37 compute-1 sudo[193443]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:38 compute-1 python3.9[193629]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:09:38 compute-1 network[193646]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:09:38 compute-1 network[193647]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:09:38 compute-1 network[193648]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:09:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:09:38.669 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:09:38.670 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:09:38.670 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:39 compute-1 podman[193655]: 2025-09-30 21:09:39.16010158 +0000 UTC m=+0.134182833 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:09:43 compute-1 sudo[193949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqpybszuffflxmhrcpdfqyfgagdirdzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266583.633819-131-200609164517502/AnsiballZ_systemd_service.py'
Sep 30 21:09:43 compute-1 sudo[193949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:44 compute-1 python3.9[193951]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:09:44 compute-1 sudo[193949]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:45 compute-1 sudo[194102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyztcgqkdyzxglyzrmptmphqmxkxrjrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266584.8509274-161-231843074610997/AnsiballZ_file.py'
Sep 30 21:09:45 compute-1 sudo[194102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:45 compute-1 python3.9[194104]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:45 compute-1 sudo[194102]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:45 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:09:46 compute-1 sudo[194255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nydosisiwbhcwtvtnrivznxbyrumjpac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266585.7892776-185-280879223592600/AnsiballZ_file.py'
Sep 30 21:09:46 compute-1 sudo[194255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:46 compute-1 python3.9[194257]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:46 compute-1 sudo[194255]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:47 compute-1 podman[194381]: 2025-09-30 21:09:47.169688526 +0000 UTC m=+0.046811721 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:09:47 compute-1 sudo[194424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltypyirigoufmqguilfmqofbwrisljdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266586.7050452-212-232370414685299/AnsiballZ_command.py'
Sep 30 21:09:47 compute-1 sudo[194424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:47 compute-1 python3.9[194428]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:09:47 compute-1 sudo[194424]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:48 compute-1 python3.9[194580]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:09:48 compute-1 sudo[194730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzrbaryuuskbdolllejvzdfwhndcslsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266588.6847804-266-151040959950516/AnsiballZ_systemd_service.py'
Sep 30 21:09:48 compute-1 sudo[194730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:49 compute-1 python3.9[194732]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:09:49 compute-1 systemd[1]: Reloading.
Sep 30 21:09:49 compute-1 systemd-rc-local-generator[194759]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:49 compute-1 systemd-sysv-generator[194762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:49 compute-1 sudo[194730]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:50 compute-1 sudo[194916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqeqiifsgizvkyxgtcdlqzoudkgolvgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266589.9842863-290-32885684347526/AnsiballZ_command.py'
Sep 30 21:09:50 compute-1 sudo[194916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:50 compute-1 python3.9[194918]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:09:50 compute-1 sudo[194916]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:51 compute-1 sudo[195069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gojsuqddouprrnnkmnrwhhysdairivsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266590.9524186-317-260459472022391/AnsiballZ_file.py'
Sep 30 21:09:51 compute-1 sudo[195069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:51 compute-1 python3.9[195071]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:51 compute-1 sudo[195069]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:52 compute-1 python3.9[195221]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:53 compute-1 python3.9[195373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:53 compute-1 python3.9[195494]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266592.6527598-365-212034697875130/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:54 compute-1 sudo[195644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndqxxffivbbspfodinrdgwbvsnbvicix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266594.2475262-411-74807457831074/AnsiballZ_group.py'
Sep 30 21:09:54 compute-1 sudo[195644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:54 compute-1 python3.9[195646]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Sep 30 21:09:54 compute-1 sudo[195644]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:55 compute-1 sudo[195796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-regebfbkqbzwpajrshyuvwaeejbzenok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266595.3814247-443-256631005907024/AnsiballZ_getent.py'
Sep 30 21:09:55 compute-1 sudo[195796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:55 compute-1 python3.9[195798]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Sep 30 21:09:56 compute-1 sudo[195796]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:56 compute-1 sudo[195949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilqcyjoseadxblobhximiffqhnhjvrho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266596.303975-467-127938320733160/AnsiballZ_group.py'
Sep 30 21:09:56 compute-1 sudo[195949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:56 compute-1 python3.9[195951]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 21:09:56 compute-1 groupadd[195952]: group added to /etc/group: name=ceilometer, GID=42405
Sep 30 21:09:56 compute-1 groupadd[195952]: group added to /etc/gshadow: name=ceilometer
Sep 30 21:09:56 compute-1 groupadd[195952]: new group: name=ceilometer, GID=42405
Sep 30 21:09:56 compute-1 sudo[195949]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:57 compute-1 sudo[196107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxnwgosueeolvthuesljtzkslcygwqrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266597.2055707-491-179215772829628/AnsiballZ_user.py'
Sep 30 21:09:57 compute-1 sudo[196107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:57 compute-1 python3.9[196109]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 21:09:57 compute-1 useradd[196111]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 21:09:57 compute-1 useradd[196111]: add 'ceilometer' to group 'libvirt'
Sep 30 21:09:57 compute-1 useradd[196111]: add 'ceilometer' to shadow group 'libvirt'
Sep 30 21:09:58 compute-1 sudo[196107]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:59 compute-1 python3.9[196267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:00 compute-1 python3.9[196388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759266599.0351384-569-184581219825077/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:00 compute-1 python3.9[196538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:01 compute-1 python3.9[196659]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759266600.375922-569-98605932966892/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:01 compute-1 podman[196660]: 2025-09-30 21:10:01.420373471 +0000 UTC m=+0.084444944 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:10:01 compute-1 python3.9[196829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:02 compute-1 python3.9[196950]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759266601.4666896-569-168047679361047/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:03 compute-1 python3.9[197100]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:04 compute-1 python3.9[197252]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:05 compute-1 python3.9[197404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:05 compute-1 podman[197499]: 2025-09-30 21:10:05.512037114 +0000 UTC m=+0.066447040 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:10:05 compute-1 python3.9[197533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266604.798361-746-201325959030260/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:06 compute-1 python3.9[197697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:06 compute-1 python3.9[197773]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:07 compute-1 python3.9[197923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:07 compute-1 python3.9[198044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266606.8551085-746-254743576843066/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:08 compute-1 python3.9[198194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:08 compute-1 python3.9[198315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266607.9093587-746-229165593910319/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:09 compute-1 podman[198439]: 2025-09-30 21:10:09.415584352 +0000 UTC m=+0.083748485 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:10:09 compute-1 python3.9[198475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:10 compute-1 python3.9[198610]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266609.0868683-746-256076560356982/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:10 compute-1 python3.9[198760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:11 compute-1 python3.9[198881]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266610.19374-746-69225927015321/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:11 compute-1 nova_compute[192795]: 2025-09-30 21:10:11.603 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:11 compute-1 nova_compute[192795]: 2025-09-30 21:10:11.637 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:11 compute-1 python3.9[199031]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:12 compute-1 python3.9[199152]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266611.4118645-746-203516735953895/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:12 compute-1 python3.9[199302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:13 compute-1 python3.9[199423]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266612.4867496-746-162925262979431/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:13 compute-1 python3.9[199573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:14 compute-1 python3.9[199694]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266613.55691-746-43943645976052/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:15 compute-1 python3.9[199844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:15 compute-1 python3.9[199965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266614.7000601-746-206780446181227/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:16 compute-1 python3.9[200115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:16 compute-1 python3.9[200236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266615.7761168-746-144127852683098/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:18 compute-1 podman[200261]: 2025-09-30 21:10:18.199261319 +0000 UTC m=+0.049301248 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:10:20 compute-1 python3.9[200405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:21 compute-1 python3.9[200481]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:22 compute-1 python3.9[200631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:22 compute-1 python3.9[200707]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:23 compute-1 python3.9[200857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:24 compute-1 python3.9[200933]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:24 compute-1 sudo[201083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efuglycjrllekzdetknginwbprxeadmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266624.3835099-1313-172809829879540/AnsiballZ_file.py'
Sep 30 21:10:24 compute-1 sudo[201083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:24 compute-1 python3.9[201085]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:24 compute-1 sudo[201083]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:25 compute-1 sudo[201235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeinsfdxegmqqbjobdrlrcducbyqxyqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266625.1995585-1337-165637567750175/AnsiballZ_file.py'
Sep 30 21:10:25 compute-1 sudo[201235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:25 compute-1 python3.9[201237]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:25 compute-1 sudo[201235]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:26 compute-1 sudo[201387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndpsajqrspixyucaipnfzrdnnvdfxjfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266625.9888833-1361-149876317727823/AnsiballZ_file.py'
Sep 30 21:10:26 compute-1 sudo[201387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:26 compute-1 python3.9[201389]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:26 compute-1 sudo[201387]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:27 compute-1 sudo[201539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzlhkszzuoklqgtwqjyigejyvumowswy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266626.8503463-1385-54725291025213/AnsiballZ_systemd_service.py'
Sep 30 21:10:27 compute-1 sudo[201539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:27 compute-1 python3.9[201541]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:10:27 compute-1 systemd[1]: Reloading.
Sep 30 21:10:27 compute-1 systemd-rc-local-generator[201569]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:27 compute-1 systemd-sysv-generator[201574]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.735 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.735 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.735 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.736 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.736 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.736 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.736 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.737 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.737 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.770 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.770 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.770 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.770 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:10:27 compute-1 systemd[1]: Listening on Podman API Socket.
Sep 30 21:10:27 compute-1 sudo[201539]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.961 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.962 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6136MB free_disk=73.67160034179688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.963 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:10:27 compute-1 nova_compute[192795]: 2025-09-30 21:10:27.963 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:10:28 compute-1 nova_compute[192795]: 2025-09-30 21:10:28.052 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:10:28 compute-1 nova_compute[192795]: 2025-09-30 21:10:28.053 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:10:28 compute-1 nova_compute[192795]: 2025-09-30 21:10:28.071 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:10:28 compute-1 nova_compute[192795]: 2025-09-30 21:10:28.088 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:10:28 compute-1 nova_compute[192795]: 2025-09-30 21:10:28.089 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:10:28 compute-1 nova_compute[192795]: 2025-09-30 21:10:28.090 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:10:29 compute-1 sudo[201730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylszwpolxtmspevqjgyjvrbaadlfcaqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.439653-1412-144463121551539/AnsiballZ_stat.py'
Sep 30 21:10:29 compute-1 sudo[201730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:29 compute-1 python3.9[201732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:29 compute-1 sudo[201730]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:30 compute-1 sudo[201853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikcewjyoczcvgfehgmzodpmmgyignluh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.439653-1412-144463121551539/AnsiballZ_copy.py'
Sep 30 21:10:30 compute-1 sudo[201853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:30 compute-1 python3.9[201855]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266629.439653-1412-144463121551539/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:30 compute-1 sudo[201853]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:30 compute-1 sudo[201929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbwynjjxbndszvopyqcjfcaakutrkkrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.439653-1412-144463121551539/AnsiballZ_stat.py'
Sep 30 21:10:30 compute-1 sudo[201929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:30 compute-1 python3.9[201931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:30 compute-1 sudo[201929]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:31 compute-1 sudo[202052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqmmxhbwymvgjhzbzfhyvmsvwnzisoxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.439653-1412-144463121551539/AnsiballZ_copy.py'
Sep 30 21:10:31 compute-1 sudo[202052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:31 compute-1 python3.9[202054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266629.439653-1412-144463121551539/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:31 compute-1 sudo[202052]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:31 compute-1 podman[202055]: 2025-09-30 21:10:31.799043642 +0000 UTC m=+0.105704086 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.build-date=20250923)
Sep 30 21:10:32 compute-1 sudo[202224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysaqgeopwfffrllbiskumdizcfsdgxgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266632.3494837-1496-251278356967874/AnsiballZ_container_config_data.py'
Sep 30 21:10:32 compute-1 sudo[202224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:33 compute-1 python3.9[202226]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Sep 30 21:10:33 compute-1 sudo[202224]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:33 compute-1 sudo[202376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlsdplcoeequnyngvnjgknclgikmbfma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266633.394364-1523-157942160527569/AnsiballZ_container_config_hash.py'
Sep 30 21:10:33 compute-1 sudo[202376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:34 compute-1 python3.9[202378]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:10:34 compute-1 sudo[202376]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:34 compute-1 sudo[202528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jslgkxnkekskoijvdktetkjogbqulvuq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266634.5227134-1553-252004189872986/AnsiballZ_edpm_container_manage.py'
Sep 30 21:10:34 compute-1 sudo[202528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:35 compute-1 python3[202530]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:10:35 compute-1 podman[202566]: 2025-09-30 21:10:35.477559621 +0000 UTC m=+0.024793117 image pull c1fbb3a9fe801a81492a24a592ec5927cb36487bb102738c2047084bd3d79886 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Sep 30 21:10:35 compute-1 podman[202566]: 2025-09-30 21:10:35.605480355 +0000 UTC m=+0.152713821 container create 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Sep 30 21:10:35 compute-1 python3[202530]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Sep 30 21:10:35 compute-1 sudo[202528]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:36 compute-1 podman[202705]: 2025-09-30 21:10:36.216199487 +0000 UTC m=+0.062891494 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.vendor=CentOS)
Sep 30 21:10:36 compute-1 sudo[202775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjqzailmzwegoiacnmksrehnudinobty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266635.9965055-1577-161530924871576/AnsiballZ_stat.py'
Sep 30 21:10:36 compute-1 sudo[202775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:36 compute-1 python3.9[202777]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:36 compute-1 sudo[202775]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:37 compute-1 sudo[202929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jofsslhvjiasmoxahjvjjxmgneeuziqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.023385-1604-116806122849981/AnsiballZ_file.py'
Sep 30 21:10:37 compute-1 sudo[202929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:37 compute-1 python3.9[202931]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:37 compute-1 sudo[202929]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:38 compute-1 sudo[203080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csfydijhgmwnqtozptekndnjfrjagjuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.7048073-1604-34870687905023/AnsiballZ_copy.py'
Sep 30 21:10:38 compute-1 sudo[203080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:38 compute-1 python3.9[203082]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266637.7048073-1604-34870687905023/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:38 compute-1 sudo[203080]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:10:38.670 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:10:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:10:38.671 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:10:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:10:38.671 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:10:39 compute-1 sudo[203156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlejrimvkxlhudgsqxdycuawnvrfbnci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.7048073-1604-34870687905023/AnsiballZ_systemd.py'
Sep 30 21:10:39 compute-1 sudo[203156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:39 compute-1 python3.9[203158]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:10:39 compute-1 systemd[1]: Reloading.
Sep 30 21:10:39 compute-1 systemd-rc-local-generator[203184]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:39 compute-1 systemd-sysv-generator[203189]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:39 compute-1 sudo[203156]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:39 compute-1 podman[203194]: 2025-09-30 21:10:39.804791646 +0000 UTC m=+0.072889303 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:10:39 compute-1 sudo[203293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjlozjemdiserusgssxmrcyzxhrocmvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.7048073-1604-34870687905023/AnsiballZ_systemd.py'
Sep 30 21:10:39 compute-1 sudo[203293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:40 compute-1 python3.9[203295]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:10:40 compute-1 systemd[1]: Reloading.
Sep 30 21:10:40 compute-1 systemd-rc-local-generator[203325]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:40 compute-1 systemd-sysv-generator[203328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:40 compute-1 systemd[1]: Starting ceilometer_agent_compute container...
Sep 30 21:10:40 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:10:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:40 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.
Sep 30 21:10:40 compute-1 podman[203335]: 2025-09-30 21:10:40.744655488 +0000 UTC m=+0.128490250 container init 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + sudo -E kolla_set_configs
Sep 30 21:10:40 compute-1 podman[203335]: 2025-09-30 21:10:40.766603059 +0000 UTC m=+0.150437791 container start 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, org.label-schema.build-date=20250923, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:10:40 compute-1 sudo[203356]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:40 compute-1 podman[203335]: ceilometer_agent_compute
Sep 30 21:10:40 compute-1 sudo[203356]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:40 compute-1 sudo[203356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:40 compute-1 systemd[1]: Started ceilometer_agent_compute container.
Sep 30 21:10:40 compute-1 sudo[203293]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Validating config file
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Copying service configuration files
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: INFO:__main__:Writing out command to execute
Sep 30 21:10:40 compute-1 sudo[203356]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: ++ cat /run_command
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + ARGS=
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + sudo kolla_copy_cacerts
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:40 compute-1 sudo[203379]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:10:40 compute-1 sudo[203379]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:40 compute-1 sudo[203379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:40 compute-1 sudo[203379]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + [[ ! -n '' ]]
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + . kolla_extend_start
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + umask 0022
Sep 30 21:10:40 compute-1 ceilometer_agent_compute[203350]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Sep 30 21:10:40 compute-1 podman[203357]: 2025-09-30 21:10:40.853564 +0000 UTC m=+0.077056865 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:10:40 compute-1 systemd[1]: 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-7b13ecb9caffdfb6.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:10:40 compute-1 systemd[1]: 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-7b13ecb9caffdfb6.service: Failed with result 'exit-code'.
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.662 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.663 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.664 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.665 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.666 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.667 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.668 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.669 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.670 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.671 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.672 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.673 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.674 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.675 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.676 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.677 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.678 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.697 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.699 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.700 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:41 compute-1 sudo[203534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ximxflnwjszdjkecsogtiyyqmgdzadib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266641.3559432-1676-190103018292746/AnsiballZ_systemd.py'
Sep 30 21:10:41 compute-1 sudo[203534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.835 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.916 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.916 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.916 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.917 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.918 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.919 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.920 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.921 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.922 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.923 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.926 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.927 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.928 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.929 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.938 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.938 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.938 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.938 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.938 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.941 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.946 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:41 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:41.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:42 compute-1 python3.9[203537]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:10:42 compute-1 systemd[1]: Stopping ceilometer_agent_compute container...
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:42.183 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:42.284 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:42.284 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:42.284 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203350]: 2025-09-30 21:10:42.294 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Sep 30 21:10:42 compute-1 virtqemud[192217]: End of file while reading data: Input/output error
Sep 30 21:10:42 compute-1 virtqemud[192217]: End of file while reading data: Input/output error
Sep 30 21:10:42 compute-1 systemd[1]: libpod-5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.scope: Deactivated successfully.
Sep 30 21:10:42 compute-1 systemd[1]: libpod-5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.scope: Consumed 1.366s CPU time.
Sep 30 21:10:42 compute-1 podman[203544]: 2025-09-30 21:10:42.46010997 +0000 UTC m=+0.320637713 container died 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:10:42 compute-1 systemd[1]: 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-7b13ecb9caffdfb6.timer: Deactivated successfully.
Sep 30 21:10:42 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.
Sep 30 21:10:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-userdata-shm.mount: Deactivated successfully.
Sep 30 21:10:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e-merged.mount: Deactivated successfully.
Sep 30 21:10:42 compute-1 podman[203544]: 2025-09-30 21:10:42.510904878 +0000 UTC m=+0.371432621 container cleanup 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:10:42 compute-1 podman[203544]: ceilometer_agent_compute
Sep 30 21:10:42 compute-1 podman[203573]: ceilometer_agent_compute
Sep 30 21:10:42 compute-1 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Sep 30 21:10:42 compute-1 systemd[1]: Stopped ceilometer_agent_compute container.
Sep 30 21:10:42 compute-1 systemd[1]: Starting ceilometer_agent_compute container...
Sep 30 21:10:42 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:10:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587bd42bb178bec95864c3234368f2d36082a851ba32d807561efa7590923c5e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.
Sep 30 21:10:42 compute-1 podman[203587]: 2025-09-30 21:10:42.748331029 +0000 UTC m=+0.130629747 container init 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + sudo -E kolla_set_configs
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:42 compute-1 sudo[203609]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:10:42 compute-1 sudo[203609]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:42 compute-1 sudo[203609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:42 compute-1 podman[203587]: 2025-09-30 21:10:42.779006425 +0000 UTC m=+0.161305113 container start 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:10:42 compute-1 podman[203587]: ceilometer_agent_compute
Sep 30 21:10:42 compute-1 systemd[1]: Started ceilometer_agent_compute container.
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Validating config file
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Copying service configuration files
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Sep 30 21:10:42 compute-1 sudo[203534]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: INFO:__main__:Writing out command to execute
Sep 30 21:10:42 compute-1 sudo[203609]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: ++ cat /run_command
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + ARGS=
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + sudo kolla_copy_cacerts
Sep 30 21:10:42 compute-1 podman[203610]: 2025-09-30 21:10:42.849229656 +0000 UTC m=+0.061235310 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:10:42 compute-1 systemd[1]: 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-60200c873018c4a.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:10:42 compute-1 systemd[1]: 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-60200c873018c4a.service: Failed with result 'exit-code'.
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:42 compute-1 sudo[203632]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:10:42 compute-1 sudo[203632]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:42 compute-1 sudo[203632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:42 compute-1 sudo[203632]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + [[ ! -n '' ]]
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + . kolla_extend_start
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + umask 0022
Sep 30 21:10:42 compute-1 ceilometer_agent_compute[203603]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Sep 30 21:10:43 compute-1 sudo[203784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmmitrdlngbkvczdpdzglpspavfnnalx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266643.1480744-1701-94047838210323/AnsiballZ_stat.py'
Sep 30 21:10:43 compute-1 sudo[203784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:43 compute-1 python3.9[203786]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:43 compute-1 sudo[203784]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.793 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.793 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.793 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.793 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.794 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.795 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.796 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.797 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.798 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.799 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.800 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.801 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.802 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.803 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.804 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.804 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.804 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.804 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.804 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.804 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.805 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.806 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.807 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.808 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.809 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.810 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.811 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.828 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.829 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.830 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.843 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.978 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.978 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.978 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.978 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.979 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.980 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.981 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.982 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.983 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.984 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.985 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.986 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.987 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.988 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.989 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.990 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.991 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.992 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.993 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.994 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.995 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.996 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.997 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.998 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:43 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:43.999 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.001 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.006 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:10:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-1 sudo[203913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuwihnktutgxievsvemmytyabmvoilqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266643.1480744-1701-94047838210323/AnsiballZ_copy.py'
Sep 30 21:10:44 compute-1 sudo[203913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:44 compute-1 python3.9[203915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266643.1480744-1701-94047838210323/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:44 compute-1 sudo[203913]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:45 compute-1 sudo[204065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjwbdbcotlmfclgqcolldgqtormitzsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266644.8259523-1751-172367964907627/AnsiballZ_container_config_data.py'
Sep 30 21:10:45 compute-1 sudo[204065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:45 compute-1 python3.9[204067]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Sep 30 21:10:45 compute-1 sudo[204065]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:46 compute-1 sudo[204217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhnsqybntotzquitsqegmcscxqwuzymi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266645.8151271-1778-182187495754827/AnsiballZ_container_config_hash.py'
Sep 30 21:10:46 compute-1 sudo[204217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:46 compute-1 python3.9[204219]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:10:46 compute-1 sudo[204217]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:47 compute-1 sudo[204369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uynywtcjogjunreywscshtuospzwcydv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266646.9463413-1808-89861561888491/AnsiballZ_edpm_container_manage.py'
Sep 30 21:10:47 compute-1 sudo[204369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:47 compute-1 python3[204371]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:10:47 compute-1 podman[204405]: 2025-09-30 21:10:47.791598609 +0000 UTC m=+0.052860297 container create 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:47 compute-1 podman[204405]: 2025-09-30 21:10:47.763358913 +0000 UTC m=+0.024620611 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Sep 30 21:10:47 compute-1 python3[204371]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Sep 30 21:10:47 compute-1 sudo[204369]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:48 compute-1 sudo[204605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srfgfnshmvijsjsxgxycrqrasqxybwbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266648.2234619-1832-33784435393882/AnsiballZ_stat.py'
Sep 30 21:10:48 compute-1 sudo[204605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:48 compute-1 podman[204566]: 2025-09-30 21:10:48.561144059 +0000 UTC m=+0.054918643 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 21:10:48 compute-1 python3.9[204613]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:48 compute-1 sudo[204605]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:49 compute-1 sudo[204766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imffeosdjixwhzawsrgfeafnduuusxfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.2157865-1859-206768821685939/AnsiballZ_file.py'
Sep 30 21:10:49 compute-1 sudo[204766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:49 compute-1 python3.9[204768]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:49 compute-1 sudo[204766]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:50 compute-1 sudo[204917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exrsusbovqbbmjjhwaclonwimvkckndv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.7957926-1859-117945176094580/AnsiballZ_copy.py'
Sep 30 21:10:50 compute-1 sudo[204917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:50 compute-1 python3.9[204919]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266649.7957926-1859-117945176094580/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:50 compute-1 sudo[204917]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:50 compute-1 sudo[204993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbicdxlbvarxophknxwjgyybnkumbeni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.7957926-1859-117945176094580/AnsiballZ_systemd.py'
Sep 30 21:10:50 compute-1 sudo[204993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:50 compute-1 python3.9[204995]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:10:50 compute-1 systemd[1]: Reloading.
Sep 30 21:10:51 compute-1 systemd-rc-local-generator[205022]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:51 compute-1 systemd-sysv-generator[205026]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:51 compute-1 sudo[204993]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:51 compute-1 sudo[205104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmdzuynyweuxwdzwqhpnupjxzqcxbdvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.7957926-1859-117945176094580/AnsiballZ_systemd.py'
Sep 30 21:10:51 compute-1 sudo[205104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:51 compute-1 python3.9[205106]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:10:51 compute-1 systemd[1]: Reloading.
Sep 30 21:10:51 compute-1 systemd-rc-local-generator[205137]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:51 compute-1 systemd-sysv-generator[205140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:52 compute-1 systemd[1]: Starting node_exporter container...
Sep 30 21:10:52 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:10:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82ee68a1e522b5868a081d15b59ed53564211b5c0ce00f9ff29a8c1fe8e30692/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82ee68a1e522b5868a081d15b59ed53564211b5c0ce00f9ff29a8c1fe8e30692/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:52 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.
Sep 30 21:10:52 compute-1 podman[205147]: 2025-09-30 21:10:52.426523348 +0000 UTC m=+0.245739215 container init 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.439Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.439Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.439Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.440Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.440Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.440Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.440Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.440Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.440Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=arp
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=bcache
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=bonding
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=btrfs
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=conntrack
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=cpu
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=cpufreq
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=diskstats
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=edac
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=fibrechannel
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=filefd
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=filesystem
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=infiniband
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=ipvs
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=loadavg
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=mdadm
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=meminfo
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=netclass
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=netdev
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=netstat
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=nfs
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=nfsd
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=nvme
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=schedstat
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=sockstat
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=softnet
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=systemd
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=tapestats
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=udp_queues
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=vmstat
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=xfs
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.441Z caller=node_exporter.go:117 level=info collector=zfs
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.442Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Sep 30 21:10:52 compute-1 node_exporter[205162]: ts=2025-09-30T21:10:52.442Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Sep 30 21:10:52 compute-1 podman[205147]: 2025-09-30 21:10:52.44932039 +0000 UTC m=+0.268536237 container start 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:10:52 compute-1 podman[205147]: node_exporter
Sep 30 21:10:52 compute-1 systemd[1]: Started node_exporter container.
Sep 30 21:10:52 compute-1 sudo[205104]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:52 compute-1 podman[205171]: 2025-09-30 21:10:52.546211216 +0000 UTC m=+0.085606565 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:10:53 compute-1 sudo[205344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waktvgahzfvapdrmzvnfebbhtugjzsxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266653.1730351-1931-96332595338856/AnsiballZ_systemd.py'
Sep 30 21:10:53 compute-1 sudo[205344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:53 compute-1 python3.9[205346]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:10:53 compute-1 systemd[1]: Stopping node_exporter container...
Sep 30 21:10:53 compute-1 systemd[1]: libpod-77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.scope: Deactivated successfully.
Sep 30 21:10:53 compute-1 conmon[205162]: conmon 77f8342058597df4b970 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.scope/container/memory.events
Sep 30 21:10:53 compute-1 podman[205350]: 2025-09-30 21:10:53.90451595 +0000 UTC m=+0.080679063 container died 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:10:53 compute-1 systemd[1]: 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b-7d05c3efcf0c13b6.timer: Deactivated successfully.
Sep 30 21:10:53 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.
Sep 30 21:10:53 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:10:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-82ee68a1e522b5868a081d15b59ed53564211b5c0ce00f9ff29a8c1fe8e30692-merged.mount: Deactivated successfully.
Sep 30 21:10:54 compute-1 podman[205350]: 2025-09-30 21:10:54.054761934 +0000 UTC m=+0.230925047 container cleanup 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:10:54 compute-1 podman[205350]: node_exporter
Sep 30 21:10:54 compute-1 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 21:10:54 compute-1 podman[205377]: node_exporter
Sep 30 21:10:54 compute-1 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Sep 30 21:10:54 compute-1 systemd[1]: Stopped node_exporter container.
Sep 30 21:10:54 compute-1 systemd[1]: Starting node_exporter container...
Sep 30 21:10:54 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:10:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82ee68a1e522b5868a081d15b59ed53564211b5c0ce00f9ff29a8c1fe8e30692/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82ee68a1e522b5868a081d15b59ed53564211b5c0ce00f9ff29a8c1fe8e30692/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:54 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.
Sep 30 21:10:54 compute-1 podman[205390]: 2025-09-30 21:10:54.492890664 +0000 UTC m=+0.348201640 container init 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.504Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.504Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.504Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=arp
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=bcache
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=bonding
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=btrfs
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=conntrack
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=cpu
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=cpufreq
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=diskstats
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=edac
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=fibrechannel
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=filefd
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=filesystem
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=infiniband
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=ipvs
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=loadavg
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=mdadm
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=meminfo
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.505Z caller=node_exporter.go:117 level=info collector=netclass
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=netdev
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=netstat
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=nfs
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=nfsd
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=nvme
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=schedstat
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=sockstat
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=softnet
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=systemd
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=tapestats
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=udp_queues
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=vmstat
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=xfs
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=node_exporter.go:117 level=info collector=zfs
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.506Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Sep 30 21:10:54 compute-1 node_exporter[205405]: ts=2025-09-30T21:10:54.507Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Sep 30 21:10:54 compute-1 podman[205390]: 2025-09-30 21:10:54.527580624 +0000 UTC m=+0.382891490 container start 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:54 compute-1 podman[205390]: node_exporter
Sep 30 21:10:54 compute-1 systemd[1]: Started node_exporter container.
Sep 30 21:10:54 compute-1 sudo[205344]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:54 compute-1 podman[205414]: 2025-09-30 21:10:54.616973429 +0000 UTC m=+0.071406874 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:10:55 compute-1 sudo[205585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvbldpwevhisavdgrpvymfbmuuzjruyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266654.883834-1956-228060617460517/AnsiballZ_stat.py'
Sep 30 21:10:55 compute-1 sudo[205585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:55 compute-1 python3.9[205587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:55 compute-1 sudo[205585]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:55 compute-1 sudo[205708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciaycyomppvfpcmrfyupuqdbhhmdowoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266654.883834-1956-228060617460517/AnsiballZ_copy.py'
Sep 30 21:10:55 compute-1 sudo[205708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:56 compute-1 python3.9[205710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266654.883834-1956-228060617460517/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:56 compute-1 sudo[205708]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:56 compute-1 sudo[205860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrdjcndkzqriporxyesyrfwriycswwfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266656.561316-2006-149789611148433/AnsiballZ_container_config_data.py'
Sep 30 21:10:56 compute-1 sudo[205860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:57 compute-1 python3.9[205862]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Sep 30 21:10:57 compute-1 sudo[205860]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:57 compute-1 sudo[206012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysoterfbjkxmeonhogztvmwxzlwflats ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266657.4876678-2033-103298879969454/AnsiballZ_container_config_hash.py'
Sep 30 21:10:57 compute-1 sudo[206012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:57 compute-1 python3.9[206014]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:10:58 compute-1 sudo[206012]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:58 compute-1 sudo[206164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omwjgpjunsfvlkfmjzdztdiwifhilrev ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266658.4001627-2063-21840067853719/AnsiballZ_edpm_container_manage.py'
Sep 30 21:10:58 compute-1 sudo[206164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:58 compute-1 python3[206166]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:11:00 compute-1 podman[206179]: 2025-09-30 21:11:00.780468294 +0000 UTC m=+1.723869650 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 21:11:00 compute-1 podman[206278]: 2025-09-30 21:11:00.883218507 +0000 UTC m=+0.019167874 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 21:11:01 compute-1 podman[206278]: 2025-09-30 21:11:01.149398559 +0000 UTC m=+0.285347896 container create a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Sep 30 21:11:01 compute-1 python3[206166]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Sep 30 21:11:01 compute-1 sudo[206164]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:01 compute-1 sudo[206464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acnhxlcbtwkakujyilsjptmcfcjfyhjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266661.5171554-2087-178813611299239/AnsiballZ_stat.py'
Sep 30 21:11:01 compute-1 sudo[206464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:01 compute-1 podman[206466]: 2025-09-30 21:11:01.94027582 +0000 UTC m=+0.063129433 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid)
Sep 30 21:11:02 compute-1 python3.9[206467]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:11:02 compute-1 sudo[206464]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:02 compute-1 sudo[206639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktqousgmvopkctfagrglnzpjoihxykti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266662.5474434-2114-70915918935061/AnsiballZ_file.py'
Sep 30 21:11:02 compute-1 sudo[206639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:03 compute-1 python3.9[206641]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:03 compute-1 sudo[206639]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:03 compute-1 sudo[206790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhnvwosonyctcrymcxhcdjftgdtdglcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266663.1745534-2114-139403448068/AnsiballZ_copy.py'
Sep 30 21:11:03 compute-1 sudo[206790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:03 compute-1 python3.9[206792]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266663.1745534-2114-139403448068/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:03 compute-1 sudo[206790]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:04 compute-1 sudo[206867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghymqecvyaslfxochrjkchoeizflptnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266663.1745534-2114-139403448068/AnsiballZ_systemd.py'
Sep 30 21:11:04 compute-1 sudo[206867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:04 compute-1 python3.9[206869]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:11:04 compute-1 systemd[1]: Reloading.
Sep 30 21:11:04 compute-1 systemd-rc-local-generator[206895]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:04 compute-1 systemd-sysv-generator[206900]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:04 compute-1 sudo[206867]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:05 compute-1 sudo[206979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aptlhvjzbswiujcalvfinsmaelvcewvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266663.1745534-2114-139403448068/AnsiballZ_systemd.py'
Sep 30 21:11:05 compute-1 sudo[206979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:05 compute-1 python3.9[206981]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:11:05 compute-1 systemd[1]: Reloading.
Sep 30 21:11:05 compute-1 systemd-rc-local-generator[207011]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:05 compute-1 systemd-sysv-generator[207014]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:05 compute-1 systemd[1]: Starting podman_exporter container...
Sep 30 21:11:05 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:11:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1996c617872902a69913d31ef1fb55543f8690d065c5f570b769fd7327dabff/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1996c617872902a69913d31ef1fb55543f8690d065c5f570b769fd7327dabff/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:05 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.
Sep 30 21:11:05 compute-1 podman[207021]: 2025-09-30 21:11:05.883888076 +0000 UTC m=+0.110280646 container init a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:11:05 compute-1 podman_exporter[207037]: ts=2025-09-30T21:11:05.900Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 21:11:05 compute-1 podman_exporter[207037]: ts=2025-09-30T21:11:05.900Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 21:11:05 compute-1 podman_exporter[207037]: ts=2025-09-30T21:11:05.900Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 21:11:05 compute-1 podman_exporter[207037]: ts=2025-09-30T21:11:05.900Z caller=handler.go:105 level=info collector=container
Sep 30 21:11:05 compute-1 podman[207021]: 2025-09-30 21:11:05.915434561 +0000 UTC m=+0.141827101 container start a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:11:05 compute-1 podman[207021]: podman_exporter
Sep 30 21:11:05 compute-1 systemd[1]: Starting Podman API Service...
Sep 30 21:11:05 compute-1 systemd[1]: Started Podman API Service.
Sep 30 21:11:05 compute-1 systemd[1]: Started podman_exporter container.
Sep 30 21:11:05 compute-1 podman[207048]: time="2025-09-30T21:11:05Z" level=info msg="/usr/bin/podman filtering at log level info"
Sep 30 21:11:05 compute-1 podman[207048]: time="2025-09-30T21:11:05Z" level=info msg="Setting parallel job count to 25"
Sep 30 21:11:05 compute-1 podman[207048]: time="2025-09-30T21:11:05Z" level=info msg="Using sqlite as database backend"
Sep 30 21:11:05 compute-1 podman[207048]: time="2025-09-30T21:11:05Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Sep 30 21:11:05 compute-1 podman[207048]: time="2025-09-30T21:11:05Z" level=info msg="Using systemd socket activation to determine API endpoint"
Sep 30 21:11:05 compute-1 podman[207048]: time="2025-09-30T21:11:05Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Sep 30 21:11:05 compute-1 sudo[206979]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:05 compute-1 podman[207048]: @ - - [30/Sep/2025:21:11:05 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 21:11:05 compute-1 podman[207048]: time="2025-09-30T21:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 21:11:05 compute-1 podman[207046]: 2025-09-30 21:11:05.987158574 +0000 UTC m=+0.058409547 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:05 compute-1 systemd[1]: a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc-2dcd5907c546d3f5.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:11:05 compute-1 systemd[1]: a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc-2dcd5907c546d3f5.service: Failed with result 'exit-code'.
Sep 30 21:11:06 compute-1 podman[207048]: @ - - [30/Sep/2025:21:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22058 "" "Go-http-client/1.1"
Sep 30 21:11:06 compute-1 podman_exporter[207037]: ts=2025-09-30T21:11:06.006Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 21:11:06 compute-1 podman_exporter[207037]: ts=2025-09-30T21:11:06.008Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 21:11:06 compute-1 podman_exporter[207037]: ts=2025-09-30T21:11:06.009Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 21:11:07 compute-1 sudo[207244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcyxooyarfifvbljqiryzhojqjqmmmek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266666.6772072-2186-237649189943938/AnsiballZ_systemd.py'
Sep 30 21:11:07 compute-1 sudo[207244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:07 compute-1 podman[207207]: 2025-09-30 21:11:07.103842593 +0000 UTC m=+0.065301820 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:11:07 compute-1 python3.9[207251]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:11:07 compute-1 systemd[1]: Stopping podman_exporter container...
Sep 30 21:11:07 compute-1 podman[207048]: @ - - [30/Sep/2025:21:11:05 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3549 "" "Go-http-client/1.1"
Sep 30 21:11:07 compute-1 systemd[1]: libpod-a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.scope: Deactivated successfully.
Sep 30 21:11:07 compute-1 podman[207260]: 2025-09-30 21:11:07.469222804 +0000 UTC m=+0.046098637 container died a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:07 compute-1 systemd[1]: a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc-2dcd5907c546d3f5.timer: Deactivated successfully.
Sep 30 21:11:07 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.
Sep 30 21:11:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc-userdata-shm.mount: Deactivated successfully.
Sep 30 21:11:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-d1996c617872902a69913d31ef1fb55543f8690d065c5f570b769fd7327dabff-merged.mount: Deactivated successfully.
Sep 30 21:11:07 compute-1 podman[207260]: 2025-09-30 21:11:07.675766478 +0000 UTC m=+0.252642311 container cleanup a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:07 compute-1 podman[207260]: podman_exporter
Sep 30 21:11:07 compute-1 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 21:11:07 compute-1 podman[207288]: podman_exporter
Sep 30 21:11:07 compute-1 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Sep 30 21:11:07 compute-1 systemd[1]: Stopped podman_exporter container.
Sep 30 21:11:07 compute-1 systemd[1]: Starting podman_exporter container...
Sep 30 21:11:07 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:11:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1996c617872902a69913d31ef1fb55543f8690d065c5f570b769fd7327dabff/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1996c617872902a69913d31ef1fb55543f8690d065c5f570b769fd7327dabff/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:07 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.
Sep 30 21:11:07 compute-1 podman[207301]: 2025-09-30 21:11:07.903209832 +0000 UTC m=+0.117031337 container init a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:07 compute-1 podman_exporter[207317]: ts=2025-09-30T21:11:07.921Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 21:11:07 compute-1 podman_exporter[207317]: ts=2025-09-30T21:11:07.921Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 21:11:07 compute-1 podman_exporter[207317]: ts=2025-09-30T21:11:07.921Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 21:11:07 compute-1 podman_exporter[207317]: ts=2025-09-30T21:11:07.921Z caller=handler.go:105 level=info collector=container
Sep 30 21:11:07 compute-1 podman[207048]: @ - - [30/Sep/2025:21:11:07 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 21:11:07 compute-1 podman[207048]: time="2025-09-30T21:11:07Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 21:11:07 compute-1 podman[207301]: 2025-09-30 21:11:07.947109538 +0000 UTC m=+0.160931023 container start a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:07 compute-1 podman[207301]: podman_exporter
Sep 30 21:11:07 compute-1 podman[207048]: @ - - [30/Sep/2025:21:11:07 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22060 "" "Go-http-client/1.1"
Sep 30 21:11:07 compute-1 podman_exporter[207317]: ts=2025-09-30T21:11:07.954Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 21:11:07 compute-1 podman_exporter[207317]: ts=2025-09-30T21:11:07.955Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 21:11:07 compute-1 podman_exporter[207317]: ts=2025-09-30T21:11:07.955Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 21:11:07 compute-1 systemd[1]: Started podman_exporter container.
Sep 30 21:11:08 compute-1 sudo[207244]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:08 compute-1 podman[207327]: 2025-09-30 21:11:08.016497408 +0000 UTC m=+0.062618700 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:11:08 compute-1 sudo[207502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlhhlufyaivrpcvacnitfkwpendsfujj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266668.361912-2210-206296428897186/AnsiballZ_stat.py'
Sep 30 21:11:08 compute-1 sudo[207502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:08 compute-1 python3.9[207504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:11:08 compute-1 sudo[207502]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:09 compute-1 sudo[207625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpgbjmjvlxkmlxqhviornjonpacnluou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266668.361912-2210-206296428897186/AnsiballZ_copy.py'
Sep 30 21:11:09 compute-1 sudo[207625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:09 compute-1 python3.9[207627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266668.361912-2210-206296428897186/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:11:09 compute-1 sudo[207625]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:10 compute-1 podman[207652]: 2025-09-30 21:11:10.230279794 +0000 UTC m=+0.070729356 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Sep 30 21:11:10 compute-1 sudo[207803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmmfxhhtgrhzjjrggqrmnozfxxmfgujl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266670.363499-2261-158829267763203/AnsiballZ_container_config_data.py'
Sep 30 21:11:10 compute-1 sudo[207803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:10 compute-1 python3.9[207805]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Sep 30 21:11:10 compute-1 sudo[207803]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:11 compute-1 sudo[207955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aptkzejdpkstwamjioqqfaivvdbvuzqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266671.2693477-2288-97989137818823/AnsiballZ_container_config_hash.py'
Sep 30 21:11:11 compute-1 sudo[207955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:11 compute-1 python3.9[207957]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:11:11 compute-1 sudo[207955]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:12 compute-1 sudo[208107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyzsajmbwgpmhcmdmdmgosbwkzposchn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266672.2678227-2318-156843656807854/AnsiballZ_edpm_container_manage.py'
Sep 30 21:11:12 compute-1 sudo[208107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:12 compute-1 python3[208109]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:11:13 compute-1 podman[208134]: 2025-09-30 21:11:13.230207545 +0000 UTC m=+0.069284048 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm)
Sep 30 21:11:13 compute-1 systemd[1]: 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-60200c873018c4a.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:11:13 compute-1 systemd[1]: 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4-60200c873018c4a.service: Failed with result 'exit-code'.
Sep 30 21:11:15 compute-1 podman[208121]: 2025-09-30 21:11:15.609018414 +0000 UTC m=+2.715308877 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 21:11:15 compute-1 podman[208235]: 2025-09-30 21:11:15.782403359 +0000 UTC m=+0.064372565 container create 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, name=ubi9-minimal, vcs-type=git, config_id=edpm, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 21:11:15 compute-1 podman[208235]: 2025-09-30 21:11:15.744231066 +0000 UTC m=+0.026200342 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 21:11:15 compute-1 python3[208109]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 21:11:15 compute-1 sudo[208107]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:16 compute-1 sudo[208424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypljsrtsfsoxuuvnrkddeiukqcuxtsfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266676.1559286-2342-133863608935282/AnsiballZ_stat.py'
Sep 30 21:11:16 compute-1 sudo[208424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:16 compute-1 python3.9[208426]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:11:16 compute-1 sudo[208424]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:17 compute-1 sudo[208578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fojnoayogvpnpcvcnfzksipjmhmlwivi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.074726-2369-188418334060430/AnsiballZ_file.py'
Sep 30 21:11:17 compute-1 sudo[208578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:17 compute-1 python3.9[208580]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:17 compute-1 sudo[208578]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:18 compute-1 sudo[208729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbafevdovnzgounlvcqvwvcygctcxjrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.7380235-2369-34021229446590/AnsiballZ_copy.py'
Sep 30 21:11:18 compute-1 sudo[208729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:18 compute-1 python3.9[208731]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266677.7380235-2369-34021229446590/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:18 compute-1 sudo[208729]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:18 compute-1 sudo[208815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kupxznmpzaghtdtpmuumqyosgbnrwbpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.7380235-2369-34021229446590/AnsiballZ_systemd.py'
Sep 30 21:11:18 compute-1 sudo[208815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:18 compute-1 podman[208779]: 2025-09-30 21:11:18.742363239 +0000 UTC m=+0.073259314 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:11:18 compute-1 python3.9[208822]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:11:19 compute-1 systemd[1]: Reloading.
Sep 30 21:11:19 compute-1 systemd-sysv-generator[208857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:19 compute-1 systemd-rc-local-generator[208853]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:19 compute-1 sudo[208815]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:19 compute-1 sudo[208935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msgduvzxaaiacviatrvsqckpztknbrxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.7380235-2369-34021229446590/AnsiballZ_systemd.py'
Sep 30 21:11:19 compute-1 sudo[208935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:19 compute-1 python3.9[208937]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:11:20 compute-1 systemd[1]: Reloading.
Sep 30 21:11:20 compute-1 systemd-sysv-generator[208969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:20 compute-1 systemd-rc-local-generator[208964]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:20 compute-1 systemd[1]: Starting openstack_network_exporter container...
Sep 30 21:11:20 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:11:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0246e54b19dd12e91e4c8fa1643620736b20b07e911a2af122f46523ac1c79dc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0246e54b19dd12e91e4c8fa1643620736b20b07e911a2af122f46523ac1c79dc/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0246e54b19dd12e91e4c8fa1643620736b20b07e911a2af122f46523ac1c79dc/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:20 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.
Sep 30 21:11:20 compute-1 podman[208977]: 2025-09-30 21:11:20.493856758 +0000 UTC m=+0.140103454 container init 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *bridge.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *coverage.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *datapath.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *iface.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *memory.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *ovnnorthd.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *ovn.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *ovsdbserver.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *pmd_perf.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *pmd_rxq.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: INFO    21:11:20 main.go:48: registering *vswitch.Collector
Sep 30 21:11:20 compute-1 openstack_network_exporter[208993]: NOTICE  21:11:20 main.go:76: listening on https://:9105/metrics
Sep 30 21:11:20 compute-1 podman[208977]: 2025-09-30 21:11:20.525520747 +0000 UTC m=+0.171767363 container start 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:11:20 compute-1 podman[208977]: openstack_network_exporter
Sep 30 21:11:20 compute-1 systemd[1]: Started openstack_network_exporter container.
Sep 30 21:11:20 compute-1 sudo[208935]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:20 compute-1 podman[209003]: 2025-09-30 21:11:20.614488311 +0000 UTC m=+0.078670449 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Sep 30 21:11:21 compute-1 sudo[209176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbxjizodibtqsgfmzsfkatrqsmanoerw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266681.1304474-2441-240249471907951/AnsiballZ_systemd.py'
Sep 30 21:11:21 compute-1 sudo[209176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:21 compute-1 python3.9[209178]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:11:21 compute-1 systemd[1]: Stopping openstack_network_exporter container...
Sep 30 21:11:21 compute-1 systemd[1]: libpod-432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.scope: Deactivated successfully.
Sep 30 21:11:21 compute-1 podman[209182]: 2025-09-30 21:11:21.92105341 +0000 UTC m=+0.048820259 container died 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Sep 30 21:11:21 compute-1 systemd[1]: 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49-17fd886e972fbb89.timer: Deactivated successfully.
Sep 30 21:11:21 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.
Sep 30 21:11:21 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49-userdata-shm.mount: Deactivated successfully.
Sep 30 21:11:21 compute-1 systemd[1]: var-lib-containers-storage-overlay-0246e54b19dd12e91e4c8fa1643620736b20b07e911a2af122f46523ac1c79dc-merged.mount: Deactivated successfully.
Sep 30 21:11:22 compute-1 podman[209182]: 2025-09-30 21:11:22.547459843 +0000 UTC m=+0.675226692 container cleanup 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:11:22 compute-1 podman[209182]: openstack_network_exporter
Sep 30 21:11:22 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 21:11:22 compute-1 podman[209210]: openstack_network_exporter
Sep 30 21:11:22 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Sep 30 21:11:22 compute-1 systemd[1]: Stopped openstack_network_exporter container.
Sep 30 21:11:22 compute-1 systemd[1]: Starting openstack_network_exporter container...
Sep 30 21:11:22 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:11:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0246e54b19dd12e91e4c8fa1643620736b20b07e911a2af122f46523ac1c79dc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0246e54b19dd12e91e4c8fa1643620736b20b07e911a2af122f46523ac1c79dc/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0246e54b19dd12e91e4c8fa1643620736b20b07e911a2af122f46523ac1c79dc/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:22 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.
Sep 30 21:11:22 compute-1 podman[209223]: 2025-09-30 21:11:22.737448094 +0000 UTC m=+0.102736574 container init 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *bridge.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *coverage.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *datapath.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *iface.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *memory.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *ovnnorthd.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *ovn.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *ovsdbserver.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *pmd_perf.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *pmd_rxq.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: INFO    21:11:22 main.go:48: registering *vswitch.Collector
Sep 30 21:11:22 compute-1 openstack_network_exporter[209240]: NOTICE  21:11:22 main.go:76: listening on https://:9105/metrics
Sep 30 21:11:22 compute-1 podman[209223]: 2025-09-30 21:11:22.758361165 +0000 UTC m=+0.123649615 container start 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, vcs-type=git, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:11:22 compute-1 podman[209223]: openstack_network_exporter
Sep 30 21:11:22 compute-1 systemd[1]: Started openstack_network_exporter container.
Sep 30 21:11:22 compute-1 sudo[209176]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:22 compute-1 podman[209250]: 2025-09-30 21:11:22.82536888 +0000 UTC m=+0.056423783 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Sep 30 21:11:23 compute-1 sudo[209420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtwwzvdvyiynfkeqpcvutsprgxhghldi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266683.3066409-2465-94351181815982/AnsiballZ_find.py'
Sep 30 21:11:23 compute-1 sudo[209420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:23 compute-1 python3.9[209422]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:11:23 compute-1 sudo[209420]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:24 compute-1 sudo[209585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrtlmgbzvcbfpltgdgicacwlhylbghkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266684.452506-2492-68617249703688/AnsiballZ_podman_container_info.py'
Sep 30 21:11:24 compute-1 sudo[209585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:24 compute-1 podman[209546]: 2025-09-30 21:11:24.92099084 +0000 UTC m=+0.046611069 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:11:25 compute-1 python3.9[209596]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Sep 30 21:11:25 compute-1 sudo[209585]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:25 compute-1 sudo[209761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bomcbtszlerxfpkqndckopqswsfkblga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266685.3593206-2500-96551259786844/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:25 compute-1 sudo[209761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:26 compute-1 python3.9[209763]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:26 compute-1 systemd[1]: Started libpod-conmon-4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489.scope.
Sep 30 21:11:26 compute-1 podman[209764]: 2025-09-30 21:11:26.114721676 +0000 UTC m=+0.083718194 container exec 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Sep 30 21:11:26 compute-1 podman[209764]: 2025-09-30 21:11:26.151645305 +0000 UTC m=+0.120641783 container exec_died 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:11:26 compute-1 systemd[1]: libpod-conmon-4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489.scope: Deactivated successfully.
Sep 30 21:11:26 compute-1 sudo[209761]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:26 compute-1 sudo[209945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfmyqpyjkqqrvuayinjclhaytblcvncp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266686.337909-2508-29084257485739/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:26 compute-1 sudo[209945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:26 compute-1 python3.9[209947]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:26 compute-1 systemd[1]: Started libpod-conmon-4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489.scope.
Sep 30 21:11:26 compute-1 podman[209948]: 2025-09-30 21:11:26.965957974 +0000 UTC m=+0.081635098 container exec 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:11:26 compute-1 podman[209948]: 2025-09-30 21:11:26.998437935 +0000 UTC m=+0.114115059 container exec_died 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 21:11:27 compute-1 systemd[1]: libpod-conmon-4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489.scope: Deactivated successfully.
Sep 30 21:11:27 compute-1 sudo[209945]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:27 compute-1 sudo[210128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psaiuftsrsqcfvtzdgyjcpdgtlhohkrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266687.2218614-2516-217110043732259/AnsiballZ_file.py'
Sep 30 21:11:27 compute-1 sudo[210128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:27 compute-1 python3.9[210130]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:27 compute-1 sudo[210128]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.083 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.102 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.168 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.168 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.169 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.169 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:11:28 compute-1 sudo[210280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhoeoatjutazotgdcnnftfepatawkruu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266687.981046-2525-157696337375769/AnsiballZ_podman_container_info.py'
Sep 30 21:11:28 compute-1 sudo[210280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.328 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.329 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5885MB free_disk=73.5025749206543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.329 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.329 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.397 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.398 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.423 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.463 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.464 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:11:28 compute-1 nova_compute[192795]: 2025-09-30 21:11:28.464 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:11:28 compute-1 python3.9[210282]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Sep 30 21:11:28 compute-1 sudo[210280]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:28 compute-1 sudo[210445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnwmdfzryzfkhbigsikqcbmekodsoyth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266688.7164338-2533-247964948140445/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:28 compute-1 sudo[210445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.056 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.057 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.057 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.080 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.081 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.082 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.082 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.082 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.083 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.083 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:11:29 compute-1 python3.9[210447]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:29 compute-1 systemd[1]: Started libpod-conmon-ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89.scope.
Sep 30 21:11:29 compute-1 podman[210448]: 2025-09-30 21:11:29.300325832 +0000 UTC m=+0.105304202 container exec ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:11:29 compute-1 podman[210468]: 2025-09-30 21:11:29.401502622 +0000 UTC m=+0.087347711 container exec_died ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:11:29 compute-1 podman[210448]: 2025-09-30 21:11:29.44211374 +0000 UTC m=+0.247092100 container exec_died ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:11:29 compute-1 systemd[1]: libpod-conmon-ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89.scope: Deactivated successfully.
Sep 30 21:11:29 compute-1 sudo[210445]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 nova_compute[192795]: 2025-09-30 21:11:29.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-1 sudo[210630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azoverlkfnifshzinggbhlmmysugqvak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266689.6866443-2541-24957551447088/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:29 compute-1 sudo[210630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:30 compute-1 python3.9[210632]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:30 compute-1 systemd[1]: Started libpod-conmon-ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89.scope.
Sep 30 21:11:30 compute-1 podman[210633]: 2025-09-30 21:11:30.3814821 +0000 UTC m=+0.227125536 container exec ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:11:30 compute-1 podman[210652]: 2025-09-30 21:11:30.497472008 +0000 UTC m=+0.104119440 container exec_died ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:11:30 compute-1 podman[210633]: 2025-09-30 21:11:30.542219736 +0000 UTC m=+0.387863152 container exec_died ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:11:30 compute-1 systemd[1]: libpod-conmon-ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89.scope: Deactivated successfully.
Sep 30 21:11:30 compute-1 sudo[210630]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:31 compute-1 sudo[210815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulryrgsiixlzmxcvulxxsemvstezpwxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266690.860188-2549-107201779492213/AnsiballZ_file.py'
Sep 30 21:11:31 compute-1 sudo[210815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:31 compute-1 python3.9[210817]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:31 compute-1 sudo[210815]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:31 compute-1 sudo[210967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxfnifxrpluumxfycqnhkoxlddjuuslo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266691.5188441-2558-260798074550235/AnsiballZ_podman_container_info.py'
Sep 30 21:11:31 compute-1 sudo[210967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:31 compute-1 python3.9[210969]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Sep 30 21:11:32 compute-1 sudo[210967]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:32 compute-1 podman[211007]: 2025-09-30 21:11:32.227738331 +0000 UTC m=+0.063954425 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid)
Sep 30 21:11:32 compute-1 sudo[211151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnkstxusrnqnpwmakordjofsdirfhguw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266692.2366881-2566-197355557502312/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:32 compute-1 sudo[211151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:32 compute-1 python3.9[211153]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:32 compute-1 systemd[1]: Started libpod-conmon-bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa.scope.
Sep 30 21:11:32 compute-1 podman[211154]: 2025-09-30 21:11:32.809382685 +0000 UTC m=+0.075835922 container exec bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:11:32 compute-1 podman[211154]: 2025-09-30 21:11:32.845096253 +0000 UTC m=+0.111549530 container exec_died bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:11:32 compute-1 sudo[211151]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:32 compute-1 systemd[1]: libpod-conmon-bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa.scope: Deactivated successfully.
Sep 30 21:11:33 compute-1 sudo[211337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmffkjxqgxmptsvyhrgxwxjtqbpgrtrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266693.0534341-2574-54609741914656/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:33 compute-1 sudo[211337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:33 compute-1 python3.9[211339]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:33 compute-1 systemd[1]: Started libpod-conmon-bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa.scope.
Sep 30 21:11:33 compute-1 podman[211340]: 2025-09-30 21:11:33.618819793 +0000 UTC m=+0.073658724 container exec bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:11:33 compute-1 podman[211340]: 2025-09-30 21:11:33.649595858 +0000 UTC m=+0.104434799 container exec_died bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:11:33 compute-1 systemd[1]: libpod-conmon-bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa.scope: Deactivated successfully.
Sep 30 21:11:33 compute-1 sudo[211337]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:34 compute-1 sudo[211521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmtsvwsusvezdipvzgugfhbflxjvlbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266693.856347-2582-85879980442796/AnsiballZ_file.py'
Sep 30 21:11:34 compute-1 sudo[211521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:34 compute-1 python3.9[211523]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:34 compute-1 sudo[211521]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:34 compute-1 sudo[211673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeiurkdqkwfxdvupaqryrkssxcxuakxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266694.6172311-2591-65332391272273/AnsiballZ_podman_container_info.py'
Sep 30 21:11:34 compute-1 sudo[211673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:35 compute-1 python3.9[211675]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Sep 30 21:11:35 compute-1 sudo[211673]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:35 compute-1 sudo[211838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkuglzbomhurdznicosyjyybpgmrvmgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266695.44543-2599-151424503223844/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:35 compute-1 sudo[211838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:35 compute-1 python3.9[211840]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:35 compute-1 systemd[1]: Started libpod-conmon-37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.scope.
Sep 30 21:11:35 compute-1 podman[211841]: 2025-09-30 21:11:35.973914884 +0000 UTC m=+0.065690931 container exec 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Sep 30 21:11:36 compute-1 podman[211860]: 2025-09-30 21:11:36.037485427 +0000 UTC m=+0.050838962 container exec_died 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:11:36 compute-1 podman[211841]: 2025-09-30 21:11:36.043699323 +0000 UTC m=+0.135475370 container exec_died 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:11:36 compute-1 systemd[1]: libpod-conmon-37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.scope: Deactivated successfully.
Sep 30 21:11:36 compute-1 sudo[211838]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:36 compute-1 sudo[212023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbjhbtncaqyiouynxsfszcoighoxzpiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266696.2392523-2607-172944038485456/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:36 compute-1 sudo[212023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:36 compute-1 python3.9[212025]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:36 compute-1 systemd[1]: Started libpod-conmon-37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.scope.
Sep 30 21:11:36 compute-1 podman[212026]: 2025-09-30 21:11:36.771762651 +0000 UTC m=+0.065596559 container exec 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:11:36 compute-1 podman[212026]: 2025-09-30 21:11:36.807668643 +0000 UTC m=+0.101502531 container exec_died 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:11:36 compute-1 sudo[212023]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:36 compute-1 systemd[1]: libpod-conmon-37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804.scope: Deactivated successfully.
Sep 30 21:11:37 compute-1 podman[212156]: 2025-09-30 21:11:37.230915434 +0000 UTC m=+0.067646724 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 21:11:37 compute-1 sudo[212226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvbxxebirdxswouewtappdnzqxldalxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266697.0174015-2615-269236805507469/AnsiballZ_file.py'
Sep 30 21:11:37 compute-1 sudo[212226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:37 compute-1 python3.9[212228]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:37 compute-1 sudo[212226]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:37 compute-1 sudo[212378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycgrxbawfszbipaekkvtejxbxvrqawpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266697.697976-2624-173964425876342/AnsiballZ_podman_container_info.py'
Sep 30 21:11:37 compute-1 sudo[212378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:38 compute-1 python3.9[212380]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Sep 30 21:11:38 compute-1 podman[212381]: 2025-09-30 21:11:38.225989217 +0000 UTC m=+0.072233617 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:38 compute-1 sudo[212378]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:11:38.672 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:11:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:11:38.673 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:11:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:11:38.673 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:11:38 compute-1 sudo[212567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhnduzkklwodfebjqzsmqifwpqrojmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266698.4783106-2632-252937566551940/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:38 compute-1 sudo[212567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:39 compute-1 python3.9[212569]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:39 compute-1 systemd[1]: Started libpod-conmon-5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.scope.
Sep 30 21:11:39 compute-1 podman[212570]: 2025-09-30 21:11:39.141724913 +0000 UTC m=+0.080532509 container exec 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:11:39 compute-1 podman[212570]: 2025-09-30 21:11:39.152661546 +0000 UTC m=+0.091469142 container exec_died 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:11:39 compute-1 systemd[1]: libpod-conmon-5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.scope: Deactivated successfully.
Sep 30 21:11:39 compute-1 sudo[212567]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:39 compute-1 sudo[212753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tijnewmqngxjoidqzideewwkmqwuvntr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266699.3943174-2640-182751389579007/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:39 compute-1 sudo[212753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:39 compute-1 python3.9[212755]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:40 compute-1 systemd[1]: Started libpod-conmon-5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.scope.
Sep 30 21:11:40 compute-1 podman[212756]: 2025-09-30 21:11:40.024883857 +0000 UTC m=+0.072513224 container exec 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 21:11:40 compute-1 podman[212756]: 2025-09-30 21:11:40.056924635 +0000 UTC m=+0.104554022 container exec_died 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:11:40 compute-1 systemd[1]: libpod-conmon-5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4.scope: Deactivated successfully.
Sep 30 21:11:40 compute-1 sudo[212753]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:40 compute-1 sudo[212956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsrzpdksisndsbyjvdvsruminervjszx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266700.3107414-2648-277047131674821/AnsiballZ_file.py'
Sep 30 21:11:40 compute-1 sudo[212956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:40 compute-1 podman[212912]: 2025-09-30 21:11:40.707280311 +0000 UTC m=+0.104270175 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:11:40 compute-1 python3.9[212963]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:40 compute-1 sudo[212956]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:41 compute-1 sudo[213116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umhlaklyskxxxyggatrijkpdraaybrvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266701.1672685-2657-65782349206859/AnsiballZ_podman_container_info.py'
Sep 30 21:11:41 compute-1 sudo[213116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:41 compute-1 python3.9[213118]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Sep 30 21:11:41 compute-1 sudo[213116]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:42 compute-1 sudo[213282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lknnmwstcwidlpwpsthetaznoqwdmkzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266701.8775704-2665-62102499832230/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:42 compute-1 sudo[213282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:42 compute-1 python3.9[213284]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:42 compute-1 systemd[1]: Started libpod-conmon-77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.scope.
Sep 30 21:11:42 compute-1 podman[213285]: 2025-09-30 21:11:42.538177278 +0000 UTC m=+0.088146362 container exec 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:11:42 compute-1 podman[213285]: 2025-09-30 21:11:42.570157165 +0000 UTC m=+0.120126269 container exec_died 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:11:42 compute-1 systemd[1]: libpod-conmon-77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.scope: Deactivated successfully.
Sep 30 21:11:42 compute-1 sudo[213282]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:43 compute-1 sudo[213467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhvzuikginxohfndvkpxpdwktaobhpue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266702.849534-2673-215082419474720/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:43 compute-1 sudo[213467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:43 compute-1 python3.9[213469]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:43 compute-1 systemd[1]: Started libpod-conmon-77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.scope.
Sep 30 21:11:43 compute-1 podman[213470]: 2025-09-30 21:11:43.541179183 +0000 UTC m=+0.086698525 container exec 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:11:43 compute-1 podman[213489]: 2025-09-30 21:11:43.688415488 +0000 UTC m=+0.132415829 container exec_died 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:11:43 compute-1 podman[213470]: 2025-09-30 21:11:43.730429893 +0000 UTC m=+0.275949235 container exec_died 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:11:43 compute-1 podman[213486]: 2025-09-30 21:11:43.729266872 +0000 UTC m=+0.178001560 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:11:43 compute-1 systemd[1]: libpod-conmon-77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b.scope: Deactivated successfully.
Sep 30 21:11:43 compute-1 sudo[213467]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:44 compute-1 sudo[213671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvlcphhhbzrexwqdxxylczgjgvvfzpma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266703.9652205-2681-271427433525669/AnsiballZ_file.py'
Sep 30 21:11:44 compute-1 sudo[213671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:44 compute-1 python3.9[213673]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:44 compute-1 sudo[213671]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:45 compute-1 sudo[213823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixdpfkzhzxlnlqplypunypptxvpzkohz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266704.7397437-2690-31570381951669/AnsiballZ_podman_container_info.py'
Sep 30 21:11:45 compute-1 sudo[213823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:45 compute-1 python3.9[213825]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Sep 30 21:11:45 compute-1 sudo[213823]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:45 compute-1 sudo[213987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-danvnafnmacagylmfmbvcgtkwsnlkmgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266705.5291703-2698-280273191984085/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:45 compute-1 sudo[213987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:46 compute-1 python3.9[213989]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:46 compute-1 systemd[1]: Started libpod-conmon-a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.scope.
Sep 30 21:11:46 compute-1 podman[213990]: 2025-09-30 21:11:46.199500651 +0000 UTC m=+0.087566738 container exec a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:46 compute-1 podman[213990]: 2025-09-30 21:11:46.233952263 +0000 UTC m=+0.122018350 container exec_died a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:11:46 compute-1 systemd[1]: libpod-conmon-a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.scope: Deactivated successfully.
Sep 30 21:11:46 compute-1 sudo[213987]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:46 compute-1 sudo[214170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llteznzzgdyiuzumvfgzauvtmedeqjyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266706.4858255-2706-162254257252237/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:46 compute-1 sudo[214170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:47 compute-1 python3.9[214172]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:47 compute-1 systemd[1]: Started libpod-conmon-a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.scope.
Sep 30 21:11:47 compute-1 podman[214173]: 2025-09-30 21:11:47.187206935 +0000 UTC m=+0.092879939 container exec a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:11:47 compute-1 podman[214173]: 2025-09-30 21:11:47.224942416 +0000 UTC m=+0.130615360 container exec_died a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:11:47 compute-1 systemd[1]: libpod-conmon-a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc.scope: Deactivated successfully.
Sep 30 21:11:47 compute-1 sudo[214170]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:47 compute-1 sudo[214354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwxlorrgjpvlidrqvrsutcmsjdlebtei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266707.4745471-2714-10020678865645/AnsiballZ_file.py'
Sep 30 21:11:47 compute-1 sudo[214354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:48 compute-1 python3.9[214356]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:48 compute-1 sudo[214354]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:48 compute-1 sudo[214506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hicgbitcwdcqkehlphnjpnfdndcfstqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266708.3503556-2723-116931699905856/AnsiballZ_podman_container_info.py'
Sep 30 21:11:48 compute-1 sudo[214506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:48 compute-1 python3.9[214508]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Sep 30 21:11:49 compute-1 sudo[214506]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:49 compute-1 podman[214543]: 2025-09-30 21:11:49.274730312 +0000 UTC m=+0.092532257 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:11:49 compute-1 sudo[214692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lczucqhcwdnslwwwxektcpxdybcnnrfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266709.476338-2731-85969292536067/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:49 compute-1 sudo[214692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:50 compute-1 python3.9[214694]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:50 compute-1 systemd[1]: Started libpod-conmon-432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.scope.
Sep 30 21:11:50 compute-1 podman[214695]: 2025-09-30 21:11:50.188532746 +0000 UTC m=+0.106337702 container exec 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Sep 30 21:11:50 compute-1 podman[214695]: 2025-09-30 21:11:50.222679364 +0000 UTC m=+0.140484350 container exec_died 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:11:50 compute-1 systemd[1]: libpod-conmon-432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.scope: Deactivated successfully.
Sep 30 21:11:50 compute-1 sudo[214692]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:50 compute-1 sudo[214880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbbmjrtjbkeltpncesvaaqbuvzxcjmqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266710.4989626-2739-111308669948639/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:50 compute-1 sudo[214880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:51 compute-1 python3.9[214882]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:51 compute-1 systemd[1]: Started libpod-conmon-432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.scope.
Sep 30 21:11:51 compute-1 podman[214883]: 2025-09-30 21:11:51.204818836 +0000 UTC m=+0.101992584 container exec 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm)
Sep 30 21:11:51 compute-1 podman[214883]: 2025-09-30 21:11:51.237680639 +0000 UTC m=+0.134854367 container exec_died 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, architecture=x86_64)
Sep 30 21:11:51 compute-1 systemd[1]: libpod-conmon-432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49.scope: Deactivated successfully.
Sep 30 21:11:51 compute-1 sudo[214880]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:51 compute-1 sudo[215063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxwjeuxolgnpkezaxtzdmrrdxmxsyedh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266711.4846628-2747-149568569497723/AnsiballZ_file.py'
Sep 30 21:11:51 compute-1 sudo[215063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:52 compute-1 python3.9[215065]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:52 compute-1 sudo[215063]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:53 compute-1 podman[215090]: 2025-09-30 21:11:53.232873713 +0000 UTC m=+0.073783797 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:11:55 compute-1 podman[215111]: 2025-09-30 21:11:55.248120873 +0000 UTC m=+0.073154660 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:12:03 compute-1 podman[215136]: 2025-09-30 21:12:03.211127776 +0000 UTC m=+0.059351645 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:12:07 compute-1 sshd-session[215155]: Connection closed by 167.71.248.239 port 47296
Sep 30 21:12:08 compute-1 podman[215156]: 2025-09-30 21:12:08.22722059 +0000 UTC m=+0.074405593 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:12:09 compute-1 podman[215176]: 2025-09-30 21:12:09.221462041 +0000 UTC m=+0.064437353 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:12:11 compute-1 podman[215200]: 2025-09-30 21:12:11.27236392 +0000 UTC m=+0.112815498 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:12:14 compute-1 podman[215228]: 2025-09-30 21:12:14.217122631 +0000 UTC m=+0.065409438 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:12:20 compute-1 podman[215248]: 2025-09-30 21:12:20.226352957 +0000 UTC m=+0.063135208 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent)
Sep 30 21:12:24 compute-1 podman[215267]: 2025-09-30 21:12:24.214621108 +0000 UTC m=+0.061869054 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, vcs-type=git)
Sep 30 21:12:26 compute-1 podman[215288]: 2025-09-30 21:12:26.194135855 +0000 UTC m=+0.042212049 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:12:28 compute-1 nova_compute[192795]: 2025-09-30 21:12:28.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.733 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.733 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.734 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.734 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.875 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.875 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5998MB free_disk=73.50275802612305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.876 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.876 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.962 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.962 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:12:29 compute-1 nova_compute[192795]: 2025-09-30 21:12:29.985 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:12:30 compute-1 nova_compute[192795]: 2025-09-30 21:12:30.007 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:12:30 compute-1 nova_compute[192795]: 2025-09-30 21:12:30.009 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:12:30 compute-1 nova_compute[192795]: 2025-09-30 21:12:30.009 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.008 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.009 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.009 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.027 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.028 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.028 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.028 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:12:31 compute-1 nova_compute[192795]: 2025-09-30 21:12:31.708 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:34 compute-1 podman[215312]: 2025-09-30 21:12:34.20610118 +0000 UTC m=+0.053312730 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:12:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:12:38.673 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:12:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:12:38.674 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:12:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:12:38.674 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:12:39 compute-1 podman[215333]: 2025-09-30 21:12:39.239192136 +0000 UTC m=+0.078743561 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:12:39 compute-1 podman[215355]: 2025-09-30 21:12:39.322027458 +0000 UTC m=+0.053077073 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:12:42 compute-1 podman[215379]: 2025-09-30 21:12:42.254398872 +0000 UTC m=+0.084336344 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller)
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.007 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:12:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:45 compute-1 podman[215406]: 2025-09-30 21:12:45.208109445 +0000 UTC m=+0.053337091 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:12:51 compute-1 podman[215426]: 2025-09-30 21:12:51.203051962 +0000 UTC m=+0.049131097 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:12:52 compute-1 sudo[215570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sexwemkuodgpluymagniidvuqgswhcql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266772.4941652-3287-40389796957113/AnsiballZ_file.py'
Sep 30 21:12:52 compute-1 sudo[215570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:53 compute-1 python3.9[215572]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:53 compute-1 sudo[215570]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:53 compute-1 sudo[215722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvyrivplnvtgswotrkeblvrqcdbnhsaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266773.2446895-3311-113348690095780/AnsiballZ_stat.py'
Sep 30 21:12:53 compute-1 sudo[215722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:53 compute-1 python3.9[215724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:53 compute-1 sudo[215722]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:54 compute-1 sudo[215845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saaesleropqgrrnekecxaruaouzlgaff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266773.2446895-3311-113348690095780/AnsiballZ_copy.py'
Sep 30 21:12:54 compute-1 sudo[215845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:54 compute-1 python3.9[215847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266773.2446895-3311-113348690095780/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:54 compute-1 sudo[215845]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:54 compute-1 PackageKit[130706]: daemon quit
Sep 30 21:12:54 compute-1 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 21:12:54 compute-1 podman[215848]: 2025-09-30 21:12:54.364466513 +0000 UTC m=+0.054215145 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Sep 30 21:12:55 compute-1 sudo[216018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcyjpzdasczpgdsvxngkoarpxcgplffq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266774.8912845-3359-126177244432632/AnsiballZ_file.py'
Sep 30 21:12:55 compute-1 sudo[216018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:55 compute-1 python3.9[216020]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:55 compute-1 sudo[216018]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:56 compute-1 sudo[216170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cseiqetgotsborlwinjuqrrbirdrmhqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266775.7455213-3383-268819675006282/AnsiballZ_stat.py'
Sep 30 21:12:56 compute-1 sudo[216170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:56 compute-1 python3.9[216172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:56 compute-1 sudo[216170]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:56 compute-1 sudo[216261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmteaqyseijfcezloodxinntsgslimjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266775.7455213-3383-268819675006282/AnsiballZ_file.py'
Sep 30 21:12:56 compute-1 sudo[216261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:56 compute-1 podman[216222]: 2025-09-30 21:12:56.457060749 +0000 UTC m=+0.041716568 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:12:56 compute-1 python3.9[216274]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:56 compute-1 sudo[216261]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:57 compute-1 sudo[216424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szpnpctpqoauvfcgobizxtnhurgvsvkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266777.1551628-3419-89003297301213/AnsiballZ_stat.py'
Sep 30 21:12:57 compute-1 sudo[216424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:57 compute-1 python3.9[216426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:57 compute-1 sudo[216424]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:57 compute-1 sudo[216502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiwtnzilfbjtiiqqullsycckzgzyaxsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266777.1551628-3419-89003297301213/AnsiballZ_file.py'
Sep 30 21:12:57 compute-1 sudo[216502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:58 compute-1 python3.9[216504]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._owtpv3i recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:58 compute-1 sudo[216502]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:58 compute-1 sudo[216654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbnpxzausxlpngdkepzciozvjwzpyebd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266778.4127574-3455-32073321127260/AnsiballZ_stat.py'
Sep 30 21:12:58 compute-1 sudo[216654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:58 compute-1 python3.9[216656]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:58 compute-1 sudo[216654]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:59 compute-1 sudo[216732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kflrommjttjewpkmypchuoqneoiptzyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266778.4127574-3455-32073321127260/AnsiballZ_file.py'
Sep 30 21:12:59 compute-1 sudo[216732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:59 compute-1 python3.9[216734]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:59 compute-1 sudo[216732]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:00 compute-1 sudo[216884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izedmdbwnlvyjcmdwnnmbibponotbqic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266779.86784-3494-17134702421049/AnsiballZ_command.py'
Sep 30 21:13:00 compute-1 sudo[216884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:00 compute-1 python3.9[216886]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:00 compute-1 sudo[216884]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:01 compute-1 sudo[217037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuweaacuyfjejuxtqolrpotznfrirwbf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266780.7264287-3518-64772013960292/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 21:13:01 compute-1 sudo[217037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:01 compute-1 python3[217039]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 21:13:01 compute-1 sudo[217037]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:02 compute-1 sudo[217189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yirirrloafrkuytbscwobejuprwwojuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266781.819322-3542-109297377468478/AnsiballZ_stat.py'
Sep 30 21:13:02 compute-1 sudo[217189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:02 compute-1 python3.9[217191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:02 compute-1 sudo[217189]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:02 compute-1 sudo[217267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pilnintglsqdjxnmukuhaezjmlaawseb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266781.819322-3542-109297377468478/AnsiballZ_file.py'
Sep 30 21:13:02 compute-1 sudo[217267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:02 compute-1 python3.9[217269]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:02 compute-1 sudo[217267]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:03 compute-1 sudo[217419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cetgtwbnyfwbqzrglgpcffarduljexqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266783.1743975-3578-177200707447922/AnsiballZ_stat.py'
Sep 30 21:13:03 compute-1 sudo[217419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:03 compute-1 python3.9[217421]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:03 compute-1 sudo[217419]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:03 compute-1 sudo[217497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhftyiyaddmqwrqudepitvfmwvejhkkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266783.1743975-3578-177200707447922/AnsiballZ_file.py'
Sep 30 21:13:03 compute-1 sudo[217497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:04 compute-1 python3.9[217499]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:04 compute-1 sudo[217497]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:04 compute-1 sudo[217661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymzuwcrdfoibtndlelpqvocltbllmfgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266784.6133838-3614-65385630626315/AnsiballZ_stat.py'
Sep 30 21:13:04 compute-1 sudo[217661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:05 compute-1 podman[217623]: 2025-09-30 21:13:05.025850807 +0000 UTC m=+0.081187053 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Sep 30 21:13:05 compute-1 python3.9[217669]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:05 compute-1 sudo[217661]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:05 compute-1 sudo[217746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mevksuoksnjotthyyzbtnwnhcygxuexq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266784.6133838-3614-65385630626315/AnsiballZ_file.py'
Sep 30 21:13:05 compute-1 sudo[217746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:05 compute-1 python3.9[217748]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:05 compute-1 sudo[217746]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:06 compute-1 sudo[217898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mddlkwoytksrbvivuilhpmiyeqgmglzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266786.0164812-3650-56070361640277/AnsiballZ_stat.py'
Sep 30 21:13:06 compute-1 sudo[217898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:06 compute-1 python3.9[217900]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:06 compute-1 sudo[217898]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:06 compute-1 sudo[217976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvdqwbclroldggonmotzzzmrgiepiike ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266786.0164812-3650-56070361640277/AnsiballZ_file.py'
Sep 30 21:13:06 compute-1 sudo[217976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:06 compute-1 python3.9[217978]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:06 compute-1 sudo[217976]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:07 compute-1 sudo[218128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxbkmfblynpeyajkqjfyuynlckangxat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266787.3947384-3686-112719628325637/AnsiballZ_stat.py'
Sep 30 21:13:07 compute-1 sudo[218128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:07 compute-1 python3.9[218130]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:07 compute-1 sudo[218128]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:08 compute-1 sudo[218253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rosjxwsifhecgtcsnmjlcckrmifmwbap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266787.3947384-3686-112719628325637/AnsiballZ_copy.py'
Sep 30 21:13:08 compute-1 sudo[218253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:08 compute-1 python3.9[218255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266787.3947384-3686-112719628325637/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:08 compute-1 sudo[218253]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:09 compute-1 podman[218379]: 2025-09-30 21:13:09.498305285 +0000 UTC m=+0.054807821 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 21:13:09 compute-1 podman[218380]: 2025-09-30 21:13:09.512099007 +0000 UTC m=+0.060329070 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:13:09 compute-1 sudo[218441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywcgoegxbkkmdyesesotunpwhbgowoug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266789.172155-3731-32100832170889/AnsiballZ_file.py'
Sep 30 21:13:09 compute-1 sudo[218441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:09 compute-1 python3.9[218448]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:09 compute-1 sudo[218441]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:10 compute-1 sudo[218598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezwxstxpvegidrkstnfrhjnaqmqupsxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266789.9397764-3756-52838928344876/AnsiballZ_command.py'
Sep 30 21:13:10 compute-1 sudo[218598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:10 compute-1 python3.9[218600]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:10 compute-1 sudo[218598]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:11 compute-1 sudo[218753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyxybskchcpnccogroiyxlzcrtjtudlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266790.7673364-3779-146731170127462/AnsiballZ_blockinfile.py'
Sep 30 21:13:11 compute-1 sudo[218753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:11 compute-1 python3.9[218755]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:11 compute-1 sudo[218753]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:12 compute-1 sudo[218905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nstuffajljhlywvrdnoxzrnvadyvkbqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266791.9063478-3806-52924643098001/AnsiballZ_command.py'
Sep 30 21:13:12 compute-1 sudo[218905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:12 compute-1 python3.9[218907]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:12 compute-1 sudo[218905]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:13 compute-1 sudo[219067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxoyyinrohoycavrkpmusqhhpzirapxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266792.7449954-3830-275496635803040/AnsiballZ_stat.py'
Sep 30 21:13:13 compute-1 sudo[219067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:13 compute-1 podman[219032]: 2025-09-30 21:13:13.115368157 +0000 UTC m=+0.116034423 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:13:13 compute-1 python3.9[219073]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:13:13 compute-1 sudo[219067]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:14 compute-1 sudo[219239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nigokaaeedredihfwivwhruftoftzoga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266793.706401-3854-105714321796466/AnsiballZ_command.py'
Sep 30 21:13:14 compute-1 sudo[219239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:14 compute-1 python3.9[219241]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:14 compute-1 sudo[219239]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:14 compute-1 sudo[219394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obiflbunltwdtpxnyesktzvjluzsyeuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266794.572159-3878-253109845886241/AnsiballZ_file.py'
Sep 30 21:13:14 compute-1 sudo[219394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:15 compute-1 python3.9[219396]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:15 compute-1 sudo[219394]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:15 compute-1 sshd-session[193119]: Connection closed by 192.168.122.30 port 38168
Sep 30 21:13:15 compute-1 sshd-session[193116]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:13:15 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Sep 30 21:13:15 compute-1 systemd[1]: session-28.scope: Consumed 1min 41.667s CPU time.
Sep 30 21:13:15 compute-1 systemd-logind[793]: Session 28 logged out. Waiting for processes to exit.
Sep 30 21:13:15 compute-1 systemd-logind[793]: Removed session 28.
Sep 30 21:13:15 compute-1 podman[219421]: 2025-09-30 21:13:15.825411283 +0000 UTC m=+0.064358128 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:13:22 compute-1 podman[219441]: 2025-09-30 21:13:22.209143392 +0000 UTC m=+0.055814238 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, tcib_managed=true)
Sep 30 21:13:25 compute-1 podman[219460]: 2025-09-30 21:13:25.209036712 +0000 UTC m=+0.055843198 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, 
com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:13:27 compute-1 podman[219481]: 2025-09-30 21:13:27.241042052 +0000 UTC m=+0.077675549 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:13:28 compute-1 nova_compute[192795]: 2025-09-30 21:13:28.689 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:29 compute-1 nova_compute[192795]: 2025-09-30 21:13:29.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:29 compute-1 nova_compute[192795]: 2025-09-30 21:13:29.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:29 compute-1 nova_compute[192795]: 2025-09-30 21:13:29.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:30 compute-1 nova_compute[192795]: 2025-09-30 21:13:30.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:30 compute-1 nova_compute[192795]: 2025-09-30 21:13:30.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:13:30 compute-1 nova_compute[192795]: 2025-09-30 21:13:30.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:13:30 compute-1 nova_compute[192795]: 2025-09-30 21:13:30.712 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:13:30 compute-1 nova_compute[192795]: 2025-09-30 21:13:30.712 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:31 compute-1 nova_compute[192795]: 2025-09-30 21:13:31.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:31 compute-1 nova_compute[192795]: 2025-09-30 21:13:31.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:31 compute-1 nova_compute[192795]: 2025-09-30 21:13:31.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:13:31 compute-1 nova_compute[192795]: 2025-09-30 21:13:31.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:32 compute-1 nova_compute[192795]: 2025-09-30 21:13:32.971 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:13:32 compute-1 nova_compute[192795]: 2025-09-30 21:13:32.971 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:13:32 compute-1 nova_compute[192795]: 2025-09-30 21:13:32.972 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:13:32 compute-1 nova_compute[192795]: 2025-09-30 21:13:32.972 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.115 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.116 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6015MB free_disk=73.50273513793945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.116 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.116 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.243 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.243 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.261 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.274 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.275 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:13:33 compute-1 nova_compute[192795]: 2025-09-30 21:13:33.275 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:13:34 compute-1 nova_compute[192795]: 2025-09-30 21:13:34.274 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:35 compute-1 podman[219505]: 2025-09-30 21:13:35.23680706 +0000 UTC m=+0.079247060 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:13:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:13:38.674 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:13:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:13:38.675 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:13:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:13:38.675 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:13:40 compute-1 podman[219526]: 2025-09-30 21:13:40.205727291 +0000 UTC m=+0.047010910 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:13:40 compute-1 podman[219525]: 2025-09-30 21:13:40.225064743 +0000 UTC m=+0.073949388 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:13:43 compute-1 podman[219570]: 2025-09-30 21:13:43.226155457 +0000 UTC m=+0.074052180 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:13:46 compute-1 podman[219596]: 2025-09-30 21:13:46.207037534 +0000 UTC m=+0.050679400 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:13:53 compute-1 podman[219616]: 2025-09-30 21:13:53.19705698 +0000 UTC m=+0.044979965 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:13:56 compute-1 podman[219635]: 2025-09-30 21:13:56.236372645 +0000 UTC m=+0.075150060 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Sep 30 21:13:58 compute-1 podman[219657]: 2025-09-30 21:13:58.221689285 +0000 UTC m=+0.070839884 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:13:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:13:58.235 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:13:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:13:58.236 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:13:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:13:58.237 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:14:00 compute-1 unix_chkpwd[219683]: password check failed for user (root)
Sep 30 21:14:00 compute-1 sshd-session[219681]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 21:14:03 compute-1 sshd-session[219681]: Failed password for root from 185.156.73.233 port 36900 ssh2
Sep 30 21:14:03 compute-1 sshd-session[219681]: Connection closed by authenticating user root 185.156.73.233 port 36900 [preauth]
Sep 30 21:14:06 compute-1 podman[219684]: 2025-09-30 21:14:06.210170842 +0000 UTC m=+0.050263137 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:14:11 compute-1 podman[219705]: 2025-09-30 21:14:11.22621704 +0000 UTC m=+0.052579159 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:14:11 compute-1 podman[219704]: 2025-09-30 21:14:11.232352034 +0000 UTC m=+0.061125278 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:14:14 compute-1 podman[219749]: 2025-09-30 21:14:14.219189665 +0000 UTC m=+0.066559792 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_controller)
Sep 30 21:14:17 compute-1 podman[219775]: 2025-09-30 21:14:17.203269394 +0000 UTC m=+0.047044590 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:14:24 compute-1 podman[219797]: 2025-09-30 21:14:24.202263639 +0000 UTC m=+0.049645880 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:14:27 compute-1 podman[219817]: 2025-09-30 21:14:27.222242389 +0000 UTC m=+0.060969003 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:14:27 compute-1 nova_compute[192795]: 2025-09-30 21:14:27.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:27 compute-1 nova_compute[192795]: 2025-09-30 21:14:27.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:14:28 compute-1 nova_compute[192795]: 2025-09-30 21:14:28.016 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:14:28 compute-1 nova_compute[192795]: 2025-09-30 21:14:28.016 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:28 compute-1 nova_compute[192795]: 2025-09-30 21:14:28.016 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:14:28 compute-1 nova_compute[192795]: 2025-09-30 21:14:28.050 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:29 compute-1 podman[219838]: 2025-09-30 21:14:29.231256384 +0000 UTC m=+0.075402440 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:14:30 compute-1 nova_compute[192795]: 2025-09-30 21:14:30.060 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:30 compute-1 nova_compute[192795]: 2025-09-30 21:14:30.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:31 compute-1 nova_compute[192795]: 2025-09-30 21:14:31.688 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:31 compute-1 nova_compute[192795]: 2025-09-30 21:14:31.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.715 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.750 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.750 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.750 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.751 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.883 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.884 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6071MB free_disk=73.50275421142578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.884 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:14:32 compute-1 nova_compute[192795]: 2025-09-30 21:14:32.885 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.004 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.004 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.069 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.178 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.178 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.198 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.231 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.252 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.370 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.372 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:14:33 compute-1 nova_compute[192795]: 2025-09-30 21:14:33.372 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.487s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:14:34 compute-1 nova_compute[192795]: 2025-09-30 21:14:34.350 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:34 compute-1 nova_compute[192795]: 2025-09-30 21:14:34.350 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:14:37 compute-1 podman[219862]: 2025-09-30 21:14:37.214076647 +0000 UTC m=+0.063411648 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:14:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:14:38.675 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:14:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:14:38.676 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:14:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:14:38.676 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:14:42 compute-1 podman[219882]: 2025-09-30 21:14:42.205201817 +0000 UTC m=+0.050059181 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:14:42 compute-1 podman[219883]: 2025-09-30 21:14:42.218636138 +0000 UTC m=+0.056880285 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:14:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:45 compute-1 podman[219929]: 2025-09-30 21:14:45.243191631 +0000 UTC m=+0.085553701 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:14:48 compute-1 podman[219956]: 2025-09-30 21:14:48.204273675 +0000 UTC m=+0.052567599 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:14:55 compute-1 podman[219976]: 2025-09-30 21:14:55.210492752 +0000 UTC m=+0.047701108 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:14:58 compute-1 podman[219995]: 2025-09-30 21:14:58.234407837 +0000 UTC m=+0.078634287 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:15:00 compute-1 podman[220017]: 2025-09-30 21:15:00.21836956 +0000 UTC m=+0.067749706 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:15:08 compute-1 podman[220042]: 2025-09-30 21:15:08.204282049 +0000 UTC m=+0.050406655 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:15:13 compute-1 podman[220062]: 2025-09-30 21:15:13.203751624 +0000 UTC m=+0.052013383 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd)
Sep 30 21:15:13 compute-1 podman[220063]: 2025-09-30 21:15:13.2039453 +0000 UTC m=+0.048231591 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:15:16 compute-1 podman[220106]: 2025-09-30 21:15:16.232621536 +0000 UTC m=+0.078235004 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:15:19 compute-1 podman[220134]: 2025-09-30 21:15:19.202265615 +0000 UTC m=+0.047750787 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:15:26 compute-1 podman[220154]: 2025-09-30 21:15:26.238189792 +0000 UTC m=+0.077262805 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 21:15:29 compute-1 podman[220173]: 2025-09-30 21:15:29.201305156 +0000 UTC m=+0.050262791 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc.)
Sep 30 21:15:31 compute-1 podman[220194]: 2025-09-30 21:15:31.233793682 +0000 UTC m=+0.073846925 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:15:31 compute-1 nova_compute[192795]: 2025-09-30 21:15:31.688 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:31 compute-1 nova_compute[192795]: 2025-09-30 21:15:31.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:31 compute-1 nova_compute[192795]: 2025-09-30 21:15:31.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.730 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.730 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.895 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.895 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6092MB free_disk=73.50273513793945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.896 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.896 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.978 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:15:32 compute-1 nova_compute[192795]: 2025-09-30 21:15:32.979 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:15:33 compute-1 nova_compute[192795]: 2025-09-30 21:15:33.006 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:15:33 compute-1 nova_compute[192795]: 2025-09-30 21:15:33.022 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:15:33 compute-1 nova_compute[192795]: 2025-09-30 21:15:33.024 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:15:33 compute-1 nova_compute[192795]: 2025-09-30 21:15:33.024 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:34 compute-1 nova_compute[192795]: 2025-09-30 21:15:34.018 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:34 compute-1 nova_compute[192795]: 2025-09-30 21:15:34.039 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:34 compute-1 nova_compute[192795]: 2025-09-30 21:15:34.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:34 compute-1 nova_compute[192795]: 2025-09-30 21:15:34.692 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:15:34 compute-1 nova_compute[192795]: 2025-09-30 21:15:34.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:15:34 compute-1 nova_compute[192795]: 2025-09-30 21:15:34.734 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:15:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:15:34.903 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:15:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:15:34.904 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:15:35 compute-1 nova_compute[192795]: 2025-09-30 21:15:35.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:35 compute-1 nova_compute[192795]: 2025-09-30 21:15:35.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:15:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:15:38.676 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:15:38.677 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:15:38.677 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:39 compute-1 podman[220218]: 2025-09-30 21:15:39.201055762 +0000 UTC m=+0.047129448 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:15:44 compute-1 podman[220240]: 2025-09-30 21:15:44.216283542 +0000 UTC m=+0.052569508 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:15:44 compute-1 podman[220239]: 2025-09-30 21:15:44.216272231 +0000 UTC m=+0.057363599 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:15:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:15:44.906 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:15:47 compute-1 podman[220282]: 2025-09-30 21:15:47.225151735 +0000 UTC m=+0.073623749 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.478 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.478 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.500 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.647 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.648 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.652 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.653 2 INFO nova.compute.claims [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.773 2 DEBUG nova.compute.provider_tree [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.784 2 DEBUG nova.scheduler.client.report [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.808 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.809 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.863 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.864 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.882 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:15:47 compute-1 nova_compute[192795]: 2025-09-30 21:15:47.911 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.150 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.151 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.152 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Creating image(s)
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.152 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "/var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.153 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "/var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.153 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "/var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.154 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.154 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:48 compute-1 nova_compute[192795]: 2025-09-30 21:15:48.918 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Automatically allocating a network for project 9ee15beb428a4c5e9726b65c80920da9. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.175 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:50 compute-1 podman[220308]: 2025-09-30 21:15:50.21505211 +0000 UTC m=+0.057937836 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.229 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.230 2 DEBUG nova.virt.images [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] 86b6907c-d747-4e98-8897-42105915831d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.232 2 DEBUG nova.privsep.utils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.232 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.396 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.400 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.450 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.453 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:50 compute-1 nova_compute[192795]: 2025-09-30 21:15:50.465 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpe0gw4y8d/privsep.sock']
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.211 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Spawned new privsep daemon via rootwrap
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.024 54 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.029 54 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.032 54 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.032 54 INFO oslo.privsep.daemon [-] privsep daemon running as pid 54
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.289 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.339 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.340 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.341 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.352 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.402 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.404 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.444 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.445 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.446 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.498 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.499 2 DEBUG nova.virt.disk.api [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Checking if we can resize image /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.500 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.586 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.587 2 DEBUG nova.virt.disk.api [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Cannot resize image /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.588 2 DEBUG nova.objects.instance [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lazy-loading 'migration_context' on Instance uuid 059d7f34-3cb7-4d78-aeee-184c964e6787 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.606 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.607 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Ensure instance console log exists: /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.608 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.608 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:51 compute-1 nova_compute[192795]: 2025-09-30 21:15:51.609 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:57 compute-1 podman[220363]: 2025-09-30 21:15:57.218363367 +0000 UTC m=+0.064518741 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:16:00 compute-1 podman[220383]: 2025-09-30 21:16:00.243250703 +0000 UTC m=+0.086459006 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9)
Sep 30 21:16:02 compute-1 podman[220405]: 2025-09-30 21:16:02.208503728 +0000 UTC m=+0.053706012 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:16:08 compute-1 nova_compute[192795]: 2025-09-30 21:16:08.865 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Automatically allocated network: {'id': '776029b5-6115-4a10-b114-56157e8d42e8', 'name': 'auto_allocated_network', 'tenant_id': '9ee15beb428a4c5e9726b65c80920da9', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['20d80525-eb42-4d6a-9761-1ab47a242279', 'af065b13-5619-478f-8481-1ded808231da'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-09-30T21:15:49Z', 'updated_at': '2025-09-30T21:16:07Z', 'revision_number': 4, 'project_id': '9ee15beb428a4c5e9726b65c80920da9'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Sep 30 21:16:08 compute-1 nova_compute[192795]: 2025-09-30 21:16:08.885 2 WARNING oslo_policy.policy [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Sep 30 21:16:08 compute-1 nova_compute[192795]: 2025-09-30 21:16:08.887 2 WARNING oslo_policy.policy [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Sep 30 21:16:08 compute-1 nova_compute[192795]: 2025-09-30 21:16:08.891 2 DEBUG nova.policy [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b0c6f6dda88549aabf6a4b039406af8e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ee15beb428a4c5e9726b65c80920da9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:16:10 compute-1 podman[220430]: 2025-09-30 21:16:10.213078373 +0000 UTC m=+0.050555729 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:16:11 compute-1 nova_compute[192795]: 2025-09-30 21:16:11.053 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Successfully created port: f3d16df7-d485-4027-b3dc-8671f00a8485 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.321 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.321 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.350 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.469 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.470 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.476 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.477 2 INFO nova.compute.claims [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.656 2 DEBUG nova.compute.provider_tree [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.706 2 ERROR nova.scheduler.client.report [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [req-db64c01b-9a97-490b-abdd-a0cc31c2ac62] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID e551d5b4-e9f6-409e-b2a1-508a20c11333.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-db64c01b-9a97-490b-abdd-a0cc31c2ac62"}]}
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.731 2 DEBUG nova.scheduler.client.report [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.759 2 DEBUG nova.scheduler.client.report [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.759 2 DEBUG nova.compute.provider_tree [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.778 2 DEBUG nova.scheduler.client.report [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.801 2 DEBUG nova.scheduler.client.report [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.859 2 DEBUG nova.compute.provider_tree [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.934 2 DEBUG nova.scheduler.client.report [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Updated inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with generation 8 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.935 2 DEBUG nova.compute.provider_tree [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Updating resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 generation from 8 to 9 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.935 2 DEBUG nova.compute.provider_tree [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.959 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:12 compute-1 nova_compute[192795]: 2025-09-30 21:16:12.959 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.043 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.043 2 DEBUG nova.network.neutron [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.063 2 INFO nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.082 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.200 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.201 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.202 2 INFO nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating image(s)
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.202 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.202 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.203 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.215 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.285 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.286 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.287 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.297 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.379 2 DEBUG nova.policy [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96460712956e4f038121397afa979163', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.383 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.384 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.419 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.420 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.420 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.468 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.469 2 DEBUG nova.virt.disk.api [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Checking if we can resize image /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.469 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.518 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.519 2 DEBUG nova.virt.disk.api [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Cannot resize image /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.520 2 DEBUG nova.objects.instance [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lazy-loading 'migration_context' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.536 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.537 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Ensure instance console log exists: /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.537 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.537 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:13 compute-1 nova_compute[192795]: 2025-09-30 21:16:13.537 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.004 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Successfully updated port: f3d16df7-d485-4027-b3dc-8671f00a8485 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.029 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "refresh_cache-059d7f34-3cb7-4d78-aeee-184c964e6787" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.029 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquired lock "refresh_cache-059d7f34-3cb7-4d78-aeee-184c964e6787" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.029 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.241 2 DEBUG nova.network.neutron [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Successfully created port: 70b5da71-314a-4c92-9db2-fb08b57a6736 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.364 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.848 2 DEBUG nova.compute.manager [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-changed-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.849 2 DEBUG nova.compute.manager [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Refreshing instance network info cache due to event network-changed-f3d16df7-d485-4027-b3dc-8671f00a8485. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:16:14 compute-1 nova_compute[192795]: 2025-09-30 21:16:14.849 2 DEBUG oslo_concurrency.lockutils [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-059d7f34-3cb7-4d78-aeee-184c964e6787" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:15 compute-1 podman[220467]: 2025-09-30 21:16:15.303309011 +0000 UTC m=+0.134923549 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:16:15 compute-1 podman[220468]: 2025-09-30 21:16:15.312178141 +0000 UTC m=+0.124499528 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:16:15 compute-1 nova_compute[192795]: 2025-09-30 21:16:15.530 2 DEBUG nova.network.neutron [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Successfully updated port: 70b5da71-314a-4c92-9db2-fb08b57a6736 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:16:15 compute-1 nova_compute[192795]: 2025-09-30 21:16:15.551 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:15 compute-1 nova_compute[192795]: 2025-09-30 21:16:15.551 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:15 compute-1 nova_compute[192795]: 2025-09-30 21:16:15.551 2 DEBUG nova.network.neutron [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:16:16 compute-1 nova_compute[192795]: 2025-09-30 21:16:16.288 2 DEBUG nova.compute.manager [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-changed-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:16 compute-1 nova_compute[192795]: 2025-09-30 21:16:16.288 2 DEBUG nova.compute.manager [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Refreshing instance network info cache due to event network-changed-70b5da71-314a-4c92-9db2-fb08b57a6736. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:16:16 compute-1 nova_compute[192795]: 2025-09-30 21:16:16.289 2 DEBUG oslo_concurrency.lockutils [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:16 compute-1 nova_compute[192795]: 2025-09-30 21:16:16.315 2 DEBUG nova.network.neutron [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.100 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Updating instance_info_cache with network_info: [{"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.141 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Releasing lock "refresh_cache-059d7f34-3cb7-4d78-aeee-184c964e6787" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.141 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Instance network_info: |[{"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.142 2 DEBUG oslo_concurrency.lockutils [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-059d7f34-3cb7-4d78-aeee-184c964e6787" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.142 2 DEBUG nova.network.neutron [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Refreshing network info cache for port f3d16df7-d485-4027-b3dc-8671f00a8485 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.144 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Start _get_guest_xml network_info=[{"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.149 2 WARNING nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.153 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.153 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.158 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.158 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.159 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.160 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.160 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.160 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.160 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.161 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.161 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.161 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.161 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.162 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.162 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.162 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.165 2 DEBUG nova.privsep.utils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.166 2 DEBUG nova.virt.libvirt.vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-423040831-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-423040831-1',id=2,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ee15beb428a4c5e9726b65c80920da9',ramdisk_id='',reservation_id='r-vbxtb26e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-649897189',owner_user_name='tempest-AutoAllocateNetworkTest-649897189-project-member'},tags=TagList,
task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:15:47Z,user_data=None,user_id='b0c6f6dda88549aabf6a4b039406af8e',uuid=059d7f34-3cb7-4d78-aeee-184c964e6787,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.167 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converting VIF {"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.167 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:bc:74,bridge_name='br-int',has_traffic_filtering=True,id=f3d16df7-d485-4027-b3dc-8671f00a8485,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3d16df7-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.169 2 DEBUG nova.objects.instance [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 059d7f34-3cb7-4d78-aeee-184c964e6787 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.198 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <uuid>059d7f34-3cb7-4d78-aeee-184c964e6787</uuid>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <name>instance-00000002</name>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:name>tempest-tempest.common.compute-instance-423040831-1</nova:name>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:16:17</nova:creationTime>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:user uuid="b0c6f6dda88549aabf6a4b039406af8e">tempest-AutoAllocateNetworkTest-649897189-project-member</nova:user>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:project uuid="9ee15beb428a4c5e9726b65c80920da9">tempest-AutoAllocateNetworkTest-649897189</nova:project>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:port uuid="f3d16df7-d485-4027-b3dc-8671f00a8485">
Sep 30 21:16:17 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="fdfe:381f:8400::15f" ipVersion="6"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.1.0.4" ipVersion="4"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <system>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="serial">059d7f34-3cb7-4d78-aeee-184c964e6787</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="uuid">059d7f34-3cb7-4d78-aeee-184c964e6787</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </system>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <os>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </os>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <features>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </features>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk.config"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:7e:bc:74"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <target dev="tapf3d16df7-d4"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/console.log" append="off"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <video>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </video>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:16:17 compute-1 nova_compute[192795]: </domain>
Sep 30 21:16:17 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.199 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Preparing to wait for external event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.199 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.200 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.200 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.200 2 DEBUG nova.virt.libvirt.vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-423040831-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-423040831-1',id=2,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ee15beb428a4c5e9726b65c80920da9',ramdisk_id='',reservation_id='r-vbxtb26e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-649897189',owner_user_name='tempest-AutoAllocateNetworkTest-649897189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:15:47Z,user_data=None,user_id='b0c6f6dda88549aabf6a4b039406af8e',uuid=059d7f34-3cb7-4d78-aeee-184c964e6787,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.200 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converting VIF {"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.201 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:bc:74,bridge_name='br-int',has_traffic_filtering=True,id=f3d16df7-d485-4027-b3dc-8671f00a8485,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3d16df7-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.202 2 DEBUG os_vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:bc:74,bridge_name='br-int',has_traffic_filtering=True,id=f3d16df7-d485-4027-b3dc-8671f00a8485,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3d16df7-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.232 2 DEBUG ovsdbapp.backend.ovs_idl [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.232 2 DEBUG ovsdbapp.backend.ovs_idl [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.233 2 DEBUG ovsdbapp.backend.ovs_idl [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLOUT] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.259 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpw20gg62i/privsep.sock']
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.637 2 DEBUG nova.network.neutron [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.811 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.812 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Instance network_info: |[{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.812 2 DEBUG oslo_concurrency.lockutils [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.812 2 DEBUG nova.network.neutron [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Refreshing network info cache for port 70b5da71-314a-4c92-9db2-fb08b57a6736 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.815 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Start _get_guest_xml network_info=[{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.818 2 WARNING nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.822 2 DEBUG nova.virt.libvirt.host [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.823 2 DEBUG nova.virt.libvirt.host [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.825 2 DEBUG nova.virt.libvirt.host [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.826 2 DEBUG nova.virt.libvirt.host [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.826 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.827 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.827 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.827 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.828 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.828 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.828 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.828 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.828 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.829 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.829 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.829 2 DEBUG nova.virt.hardware [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.832 2 DEBUG nova.virt.libvirt.vif [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:16:13Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.833 2 DEBUG nova.network.os_vif_util [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.833 2 DEBUG nova.network.os_vif_util [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.836 2 DEBUG nova.objects.instance [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lazy-loading 'pci_devices' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.938 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Spawned new privsep daemon via rootwrap
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.828 90 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.832 90 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.834 90 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.835 90 INFO oslo.privsep.daemon [-] privsep daemon running as pid 90
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.995 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <uuid>252d5457-8837-4aa6-b309-c3139e8db7ed</uuid>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <name>instance-00000005</name>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:name>tempest-LiveMigrationTest-server-1363935032</nova:name>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:16:17</nova:creationTime>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:user uuid="4b263d7c3e3141f999e8eabf49e8190c">tempest-LiveMigrationTest-2029274765-project-member</nova:user>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:project uuid="96460712956e4f038121397afa979163">tempest-LiveMigrationTest-2029274765</nova:project>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         <nova:port uuid="70b5da71-314a-4c92-9db2-fb08b57a6736">
Sep 30 21:16:17 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <system>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="serial">252d5457-8837-4aa6-b309-c3139e8db7ed</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="uuid">252d5457-8837-4aa6-b309-c3139e8db7ed</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </system>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <os>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </os>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <features>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </features>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:a9:31:8d"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <target dev="tap70b5da71-31"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/console.log" append="off"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <video>
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </video>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:16:17 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:16:17 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:16:17 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:16:17 compute-1 nova_compute[192795]: </domain>
Sep 30 21:16:17 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.996 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Preparing to wait for external event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.997 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.997 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.997 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.998 2 DEBUG nova.virt.libvirt.vif [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:16:13Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.998 2 DEBUG nova.network.os_vif_util [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.999 2 DEBUG nova.network.os_vif_util [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:17 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.999 2 DEBUG os_vif [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:17.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.000 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.000 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.254 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70b5da71-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.255 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70b5da71-31, col_values=(('external_ids', {'iface-id': '70b5da71-314a-4c92-9db2-fb08b57a6736', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:31:8d', 'vm-uuid': '252d5457-8837-4aa6-b309-c3139e8db7ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:18 compute-1 NetworkManager[51724]: <info>  [1759266978.2583] manager: (tap70b5da71-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.265 2 INFO os_vif [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31')
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.266 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3d16df7-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.266 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf3d16df7-d4, col_values=(('external_ids', {'iface-id': 'f3d16df7-d485-4027-b3dc-8671f00a8485', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:bc:74', 'vm-uuid': '059d7f34-3cb7-4d78-aeee-184c964e6787'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:18 compute-1 podman[220522]: 2025-09-30 21:16:18.267544858 +0000 UTC m=+0.115034472 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:16:18 compute-1 NetworkManager[51724]: <info>  [1759266978.2685] manager: (tapf3d16df7-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.275 2 INFO os_vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:bc:74,bridge_name='br-int',has_traffic_filtering=True,id=f3d16df7-d485-4027-b3dc-8671f00a8485,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3d16df7-d4')
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.320 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.321 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.321 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] No VIF found with MAC fa:16:3e:a9:31:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.322 2 INFO nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Using config drive
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.324 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.324 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.324 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] No VIF found with MAC fa:16:3e:7e:bc:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:16:18 compute-1 nova_compute[192795]: 2025-09-30 21:16:18.325 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Using config drive
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.257 2 INFO nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating config drive at /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.262 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp88yf_3xu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.385 2 DEBUG oslo_concurrency.processutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp88yf_3xu" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:19 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Sep 30 21:16:19 compute-1 kernel: tap70b5da71-31: entered promiscuous mode
Sep 30 21:16:19 compute-1 NetworkManager[51724]: <info>  [1759266979.4568] manager: (tap70b5da71-31): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Sep 30 21:16:19 compute-1 ovn_controller[94902]: 2025-09-30T21:16:19Z|00027|binding|INFO|Claiming lport 70b5da71-314a-4c92-9db2-fb08b57a6736 for this chassis.
Sep 30 21:16:19 compute-1 ovn_controller[94902]: 2025-09-30T21:16:19Z|00028|binding|INFO|70b5da71-314a-4c92-9db2-fb08b57a6736: Claiming fa:16:3e:a9:31:8d 10.100.0.7
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.476 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:31:8d 10.100.0.7'], port_security=['fa:16:3e:a9:31:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '2', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=70b5da71-314a-4c92-9db2-fb08b57a6736) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.477 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 in datapath 16d40025-1087-460f-a42f-c007f6eff406 bound to our chassis
Sep 30 21:16:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.480 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.482 103861 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpqctk91l2/privsep.sock']
Sep 30 21:16:19 compute-1 systemd-udevd[220574]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:19 compute-1 NetworkManager[51724]: <info>  [1759266979.5064] device (tap70b5da71-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:16:19 compute-1 NetworkManager[51724]: <info>  [1759266979.5072] device (tap70b5da71-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:16:19 compute-1 systemd-machined[152783]: New machine qemu-1-instance-00000005.
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:19 compute-1 ovn_controller[94902]: 2025-09-30T21:16:19Z|00029|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 ovn-installed in OVS
Sep 30 21:16:19 compute-1 ovn_controller[94902]: 2025-09-30T21:16:19Z|00030|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 up in Southbound
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:19 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000005.
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.555 2 DEBUG nova.network.neutron [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updated VIF entry in instance network info cache for port 70b5da71-314a-4c92-9db2-fb08b57a6736. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.556 2 DEBUG nova.network.neutron [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:19 compute-1 nova_compute[192795]: 2025-09-30 21:16:19.571 2 DEBUG oslo_concurrency.lockutils [req-f102fb5d-9f0a-4340-9de8-34e59ed0bb35 req-08953702-7ee0-4f68-8f4f-377f5b5c73b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.039 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Creating config drive at /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk.config
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.045 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp24kfewa7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:20.124 103861 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:20.125 103861 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpqctk91l2/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.990 220603 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.994 220603 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.996 220603 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:19.997 220603 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220603
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:20.128 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ce382ca2-4538-4385-bd79-a56cc2a9456f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.168 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp24kfewa7" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.217 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759266980.2169213, 252d5457-8837-4aa6-b309-c3139e8db7ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.218 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Started (Lifecycle Event)
Sep 30 21:16:20 compute-1 kernel: tapf3d16df7-d4: entered promiscuous mode
Sep 30 21:16:20 compute-1 NetworkManager[51724]: <info>  [1759266980.2252] manager: (tapf3d16df7-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:20 compute-1 ovn_controller[94902]: 2025-09-30T21:16:20Z|00031|binding|INFO|Claiming lport f3d16df7-d485-4027-b3dc-8671f00a8485 for this chassis.
Sep 30 21:16:20 compute-1 ovn_controller[94902]: 2025-09-30T21:16:20Z|00032|binding|INFO|f3d16df7-d485-4027-b3dc-8671f00a8485: Claiming fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:20 compute-1 NetworkManager[51724]: <info>  [1759266980.2364] device (tapf3d16df7-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:16:20 compute-1 NetworkManager[51724]: <info>  [1759266980.2374] device (tapf3d16df7-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.239 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.242 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759266980.220052, 252d5457-8837-4aa6-b309-c3139e8db7ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.242 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Paused (Lifecycle Event)
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:20.244 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], port_security=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.4/26 fdfe:381f:8400::15f/64', 'neutron:device_id': '059d7f34-3cb7-4d78-aeee-184c964e6787', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-776029b5-6115-4a10-b114-56157e8d42e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee15beb428a4c5e9726b65c80920da9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b51fed5f-83ee-4757-b171-916e34c6fdb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac973fb-5d64-4a91-8b3a-7dff1eb497e1, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f3d16df7-d485-4027-b3dc-8671f00a8485) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.275 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.284 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:20 compute-1 ovn_controller[94902]: 2025-09-30T21:16:20Z|00033|binding|INFO|Setting lport f3d16df7-d485-4027-b3dc-8671f00a8485 ovn-installed in OVS
Sep 30 21:16:20 compute-1 ovn_controller[94902]: 2025-09-30T21:16:20Z|00034|binding|INFO|Setting lport f3d16df7-d485-4027-b3dc-8671f00a8485 up in Southbound
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:20 compute-1 systemd-machined[152783]: New machine qemu-2-instance-00000002.
Sep 30 21:16:20 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.316 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:20 compute-1 podman[220623]: 2025-09-30 21:16:20.341501522 +0000 UTC m=+0.060759413 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.361 2 DEBUG nova.compute.manager [req-afda5ebd-998a-4871-9f57-c80c40bf776b req-edf9378e-fe85-4c06-9863-c3e7b4e229cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.361 2 DEBUG oslo_concurrency.lockutils [req-afda5ebd-998a-4871-9f57-c80c40bf776b req-edf9378e-fe85-4c06-9863-c3e7b4e229cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.362 2 DEBUG oslo_concurrency.lockutils [req-afda5ebd-998a-4871-9f57-c80c40bf776b req-edf9378e-fe85-4c06-9863-c3e7b4e229cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.362 2 DEBUG oslo_concurrency.lockutils [req-afda5ebd-998a-4871-9f57-c80c40bf776b req-edf9378e-fe85-4c06-9863-c3e7b4e229cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.362 2 DEBUG nova.compute.manager [req-afda5ebd-998a-4871-9f57-c80c40bf776b req-edf9378e-fe85-4c06-9863-c3e7b4e229cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Processing event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.363 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.366 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759266980.36609, 252d5457-8837-4aa6-b309-c3139e8db7ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.366 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Resumed (Lifecycle Event)
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.379 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.383 2 INFO nova.virt.libvirt.driver [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Instance spawned successfully.
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.383 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.388 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.391 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.410 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.414 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.414 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.415 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.415 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.415 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.416 2 DEBUG nova.virt.libvirt.driver [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.495 2 INFO nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Took 7.29 seconds to spawn the instance on the hypervisor.
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.495 2 DEBUG nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.578 2 INFO nova.compute.manager [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Took 8.15 seconds to build instance.
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.600 2 DEBUG oslo_concurrency.lockutils [None req-1db5eaf2-0074-4ba6-a405-5da945f5c185 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:20.658 220603 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:20.659 220603 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:20.659 220603 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.928 2 DEBUG nova.network.neutron [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Updated VIF entry in instance network info cache for port f3d16df7-d485-4027-b3dc-8671f00a8485. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.929 2 DEBUG nova.network.neutron [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Updating instance_info_cache with network_info: [{"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:20 compute-1 nova_compute[192795]: 2025-09-30 21:16:20.952 2 DEBUG oslo_concurrency.lockutils [req-85153d90-5427-4de9-af46-12d1ec3e5b75 req-b0d22b05-9132-4827-94ff-dfbc304929c4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-059d7f34-3cb7-4d78-aeee-184c964e6787" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.162 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759266981.1611736, 059d7f34-3cb7-4d78-aeee-184c964e6787 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.162 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] VM Started (Lifecycle Event)
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.244 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.249 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759266981.1613503, 059d7f34-3cb7-4d78-aeee-184c964e6787 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.249 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] VM Paused (Lifecycle Event)
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.269 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.273 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:21 compute-1 nova_compute[192795]: 2025-09-30 21:16:21.291 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.352 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecfe0a9-2e53-4dec-beb0-2fb71bfdbb9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.353 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16d40025-11 in ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.356 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16d40025-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.356 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d4644fab-868b-4d18-923a-ecf2d897cee3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.359 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[155c64a2-76f0-40d6-a0be-15e17db1d763]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.392 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[38d627d9-3f07-4822-a289-358c00286686]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.422 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e601cfd9-0cd7-47fe-ad05-480e3a0ba81b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:21.426 103861 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpfbf9kuk3/privsep.sock']
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.168 103861 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.169 103861 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfbf9kuk3/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.041 220668 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.045 220668 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.048 220668 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.048 220668 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220668
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.172 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0d9dac-8bf4-4d6b-9fc3-0800fe230725]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.657 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.657 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.658 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.658 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.658 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.658 2 WARNING nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state None.
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.658 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.659 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.659 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.659 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.659 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Processing event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.659 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.659 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.659 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.660 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.660 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] No waiting events found dispatching network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.660 2 WARNING nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received unexpected event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 for instance with vm_state building and task_state spawning.
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.661 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.665 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.666 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759266982.6655104, 059d7f34-3cb7-4d78-aeee-184c964e6787 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.666 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] VM Resumed (Lifecycle Event)
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.670 2 INFO nova.virt.libvirt.driver [-] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Instance spawned successfully.
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.671 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.692 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.694 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.694 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.694 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.695 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.695 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.696 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.703 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.734 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.755 220668 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.755 220668 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:22.755 220668 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.783 2 INFO nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Took 34.63 seconds to spawn the instance on the hypervisor.
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.783 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.896 2 INFO nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Took 35.33 seconds to build instance.
Sep 30 21:16:22 compute-1 nova_compute[192795]: 2025-09-30 21:16:22.915 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 35.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:23 compute-1 nova_compute[192795]: 2025-09-30 21:16:23.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.331 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f4591b98-682c-4b95-a912-8961f8eaee01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.340 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a9ac20-cae1-40e6-8012-b8387e05fe76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 NetworkManager[51724]: <info>  [1759266983.3419] manager: (tap16d40025-10): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.375 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b753f00e-4287-452c-87b7-c1ed0b19c316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.378 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[73d26666-c346-4284-89a5-2d17a30d5470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 systemd-udevd[220680]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:23 compute-1 NetworkManager[51724]: <info>  [1759266983.4027] device (tap16d40025-10): carrier: link connected
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.410 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7b93c3-354e-40d8-ba49-b86e4244ec91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.436 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf68d5b-5832-49d9-aef3-4354c16c31cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367907, 'reachable_time': 32046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220682, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.459 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb939ef-99bb-47fa-b062-08b443e37493]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:c752'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 367907, 'tstamp': 367907}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220698, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.481 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a09a2fb6-0c3c-4b90-ae99-a556f35306f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367907, 'reachable_time': 32046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220699, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.526 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[32c9d941-3fd5-4c96-8ad5-806696ccb0af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.618 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[16791839-6285-4a1c-a349-a5b8eb9b7cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.621 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.621 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.622 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d40025-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:23 compute-1 NetworkManager[51724]: <info>  [1759266983.6259] manager: (tap16d40025-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Sep 30 21:16:23 compute-1 kernel: tap16d40025-10: entered promiscuous mode
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.629 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16d40025-10, col_values=(('external_ids', {'iface-id': '0c66892e-7baf-4f9a-a329-dd0545dbf700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:23 compute-1 nova_compute[192795]: 2025-09-30 21:16:23.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.634 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:16:23 compute-1 ovn_controller[94902]: 2025-09-30T21:16:23Z|00035|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.635 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5af38a-8153-4c17-b1ea-2cc517bc086d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.637 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:16:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:23.638 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'env', 'PROCESS_TAG=haproxy-16d40025-1087-460f-a42f-c007f6eff406', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16d40025-1087-460f-a42f-c007f6eff406.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:16:23 compute-1 nova_compute[192795]: 2025-09-30 21:16:23.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:24 compute-1 podman[220731]: 2025-09-30 21:16:24.075137471 +0000 UTC m=+0.083943330 container create 24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:16:24 compute-1 podman[220731]: 2025-09-30 21:16:24.035575041 +0000 UTC m=+0.044380900 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:16:24 compute-1 systemd[1]: Started libpod-conmon-24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b.scope.
Sep 30 21:16:24 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:16:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77cc9e9680b24ad3c576bee9adf7d7faa748f07809e3f8526ba7be22934a0d87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:16:24 compute-1 podman[220731]: 2025-09-30 21:16:24.224715836 +0000 UTC m=+0.233521715 container init 24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:16:24 compute-1 podman[220731]: 2025-09-30 21:16:24.230205043 +0000 UTC m=+0.239010892 container start 24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:16:24 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[220746]: [NOTICE]   (220750) : New worker (220752) forked
Sep 30 21:16:24 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[220746]: [NOTICE]   (220750) : Loading success.
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.330 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f3d16df7-d485-4027-b3dc-8671f00a8485 in datapath 776029b5-6115-4a10-b114-56157e8d42e8 unbound from our chassis
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.334 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 776029b5-6115-4a10-b114-56157e8d42e8
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.353 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8a22c7ed-0f11-4aee-be25-62bf95b26fc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.354 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap776029b5-61 in ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.357 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap776029b5-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.357 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[83eaea48-e804-4627-b3aa-0e71c43949a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.358 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[eb98fecf-e65f-4b4c-bf4a-bfc9c7f5f9ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.389 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[e948c63e-67ef-4ae1-8262-498734649f36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.414 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4b6e18be-8fe8-4e5b-9793-2e3508267ca3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.458 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[14196b83-a26b-4d1c-a71b-fabef53a8444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.464 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4d4507-affc-46b6-a57a-c8450fd7f396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 systemd-udevd[220687]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:24 compute-1 NetworkManager[51724]: <info>  [1759266984.4690] manager: (tap776029b5-60): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.509 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ef933866-9851-4c31-973f-273ac5a7653f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.518 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[af8f681e-b027-4d2b-806e-5d2be42cfc7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 NetworkManager[51724]: <info>  [1759266984.5550] device (tap776029b5-60): carrier: link connected
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.569 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ad50abfd-3ee9-44d0-b828-448daabec3dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.598 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7c2049d0-4e8f-4988-93cc-104a71a4475e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap776029b5-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:b6:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368022, 'reachable_time': 35622, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220773, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.625 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7db77d-b15f-4696-91cc-e7e1e318a999]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:b6bc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368022, 'tstamp': 368022}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220774, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.661 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[caa0a2ae-3fd4-44f2-83e0-c6e21b92ab0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap776029b5-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:b6:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368022, 'reachable_time': 35622, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220775, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.708 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6f737e1f-11ca-42c6-a2e9-dd8010103ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.799 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0bea97e1-d10b-4a79-b347-e04a0fb0d709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.802 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap776029b5-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.803 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.804 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap776029b5-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:24 compute-1 kernel: tap776029b5-60: entered promiscuous mode
Sep 30 21:16:24 compute-1 NetworkManager[51724]: <info>  [1759266984.8109] manager: (tap776029b5-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Sep 30 21:16:24 compute-1 nova_compute[192795]: 2025-09-30 21:16:24.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.811 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap776029b5-60, col_values=(('external_ids', {'iface-id': '8241e4b2-4ddf-4f99-a3a9-12824f39f1bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:24 compute-1 ovn_controller[94902]: 2025-09-30T21:16:24Z|00036|binding|INFO|Releasing lport 8241e4b2-4ddf-4f99-a3a9-12824f39f1bf from this chassis (sb_readonly=0)
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.815 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/776029b5-6115-4a10-b114-56157e8d42e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/776029b5-6115-4a10-b114-56157e8d42e8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.824 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fca24cac-8938-4167-bdd4-fa7b0b9ad9ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.825 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-776029b5-6115-4a10-b114-56157e8d42e8
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/776029b5-6115-4a10-b114-56157e8d42e8.pid.haproxy
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 776029b5-6115-4a10-b114-56157e8d42e8
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:16:24 compute-1 nova_compute[192795]: 2025-09-30 21:16:24.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:24.829 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'env', 'PROCESS_TAG=haproxy-776029b5-6115-4a10-b114-56157e8d42e8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/776029b5-6115-4a10-b114-56157e8d42e8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:16:25 compute-1 podman[220807]: 2025-09-30 21:16:25.2379066 +0000 UTC m=+0.050296991 container create 3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:16:25 compute-1 systemd[1]: Started libpod-conmon-3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48.scope.
Sep 30 21:16:25 compute-1 podman[220807]: 2025-09-30 21:16:25.213534451 +0000 UTC m=+0.025924872 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:16:25 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:16:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ed9658e08aece593c1b38f3430c5f28894d8edabc3e6e6b2e68a5c055a9240f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:16:25 compute-1 podman[220807]: 2025-09-30 21:16:25.336575098 +0000 UTC m=+0.148965529 container init 3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:16:25 compute-1 podman[220807]: 2025-09-30 21:16:25.350470594 +0000 UTC m=+0.162860995 container start 3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:16:25 compute-1 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220823]: [NOTICE]   (220827) : New worker (220829) forked
Sep 30 21:16:25 compute-1 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220823]: [NOTICE]   (220827) : Loading success.
Sep 30 21:16:25 compute-1 nova_compute[192795]: 2025-09-30 21:16:25.584 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Check if temp file /var/lib/nova/instances/tmp0tc_e70n exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Sep 30 21:16:25 compute-1 nova_compute[192795]: 2025-09-30 21:16:25.591 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:25 compute-1 nova_compute[192795]: 2025-09-30 21:16:25.658 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:25 compute-1 nova_compute[192795]: 2025-09-30 21:16:25.659 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:25 compute-1 nova_compute[192795]: 2025-09-30 21:16:25.722 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:25 compute-1 nova_compute[192795]: 2025-09-30 21:16:25.724 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0tc_e70n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Sep 30 21:16:26 compute-1 nova_compute[192795]: 2025-09-30 21:16:26.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.734 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.796 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.798 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.858 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.860 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.860 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.871 2 INFO nova.compute.rpcapi [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Sep 30 21:16:27 compute-1 nova_compute[192795]: 2025-09-30 21:16:27.872 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:28 compute-1 podman[220850]: 2025-09-30 21:16:28.216588687 +0000 UTC m=+0.057340821 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:16:28 compute-1 nova_compute[192795]: 2025-09-30 21:16:28.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:30 compute-1 nova_compute[192795]: 2025-09-30 21:16:30.004 2 DEBUG oslo_concurrency.processutils [None req-39297d32-3ba0-4549-a216-750b23615dd0 5db32a0fde7e4cda9719d918934f978b 1a99d5acf09e4d508c82e1053c830fbf - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:30 compute-1 nova_compute[192795]: 2025-09-30 21:16:30.051 2 DEBUG oslo_concurrency.processutils [None req-39297d32-3ba0-4549-a216-750b23615dd0 5db32a0fde7e4cda9719d918934f978b 1a99d5acf09e4d508c82e1053c830fbf - - default default] CMD "env LANG=C uptime" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:30 compute-1 sshd-session[220871]: Accepted publickey for nova from 192.168.122.100 port 47158 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:16:30 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:16:30 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:16:30 compute-1 systemd-logind[793]: New session 29 of user nova.
Sep 30 21:16:30 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:16:30 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:16:30 compute-1 systemd[220891]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:16:30 compute-1 podman[220873]: 2025-09-30 21:16:30.449657023 +0000 UTC m=+0.083092007 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 21:16:30 compute-1 systemd[220891]: Queued start job for default target Main User Target.
Sep 30 21:16:30 compute-1 systemd[220891]: Created slice User Application Slice.
Sep 30 21:16:30 compute-1 systemd[220891]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:16:30 compute-1 systemd[220891]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:16:30 compute-1 systemd[220891]: Reached target Paths.
Sep 30 21:16:30 compute-1 systemd[220891]: Reached target Timers.
Sep 30 21:16:30 compute-1 systemd[220891]: Starting D-Bus User Message Bus Socket...
Sep 30 21:16:30 compute-1 systemd[220891]: Starting Create User's Volatile Files and Directories...
Sep 30 21:16:30 compute-1 systemd[220891]: Finished Create User's Volatile Files and Directories.
Sep 30 21:16:30 compute-1 systemd[220891]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:16:30 compute-1 systemd[220891]: Reached target Sockets.
Sep 30 21:16:30 compute-1 systemd[220891]: Reached target Basic System.
Sep 30 21:16:30 compute-1 systemd[220891]: Reached target Main User Target.
Sep 30 21:16:30 compute-1 systemd[220891]: Startup finished in 149ms.
Sep 30 21:16:30 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:16:30 compute-1 systemd[1]: Started Session 29 of User nova.
Sep 30 21:16:30 compute-1 sshd-session[220871]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:16:30 compute-1 sshd-session[220910]: Received disconnect from 192.168.122.100 port 47158:11: disconnected by user
Sep 30 21:16:30 compute-1 sshd-session[220910]: Disconnected from user nova 192.168.122.100 port 47158
Sep 30 21:16:30 compute-1 sshd-session[220871]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:16:30 compute-1 systemd-logind[793]: Session 29 logged out. Waiting for processes to exit.
Sep 30 21:16:30 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Sep 30 21:16:30 compute-1 systemd-logind[793]: Removed session 29.
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.624 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.624 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.625 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.625 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.625 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.648 2 INFO nova.compute.manager [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Terminating instance
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.667 2 DEBUG nova.compute.manager [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.688 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:31 compute-1 kernel: tapf3d16df7-d4 (unregistering): left promiscuous mode
Sep 30 21:16:31 compute-1 NetworkManager[51724]: <info>  [1759266991.6969] device (tapf3d16df7-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00037|binding|INFO|Releasing lport f3d16df7-d485-4027-b3dc-8671f00a8485 from this chassis (sb_readonly=0)
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00038|binding|INFO|Setting lport f3d16df7-d485-4027-b3dc-8671f00a8485 down in Southbound
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00039|binding|INFO|Removing iface tapf3d16df7-d4 ovn-installed in OVS
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:31.726 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], port_security=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.4/26 fdfe:381f:8400::15f/64', 'neutron:device_id': '059d7f34-3cb7-4d78-aeee-184c964e6787', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-776029b5-6115-4a10-b114-56157e8d42e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee15beb428a4c5e9726b65c80920da9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b51fed5f-83ee-4757-b171-916e34c6fdb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac973fb-5d64-4a91-8b3a-7dff1eb497e1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f3d16df7-d485-4027-b3dc-8671f00a8485) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:31.728 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f3d16df7-d485-4027-b3dc-8671f00a8485 in datapath 776029b5-6115-4a10-b114-56157e8d42e8 unbound from our chassis
Sep 30 21:16:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:31.731 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 776029b5-6115-4a10-b114-56157e8d42e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:16:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:31.732 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f669781e-2647-4f80-ab7e-58e351d0a8cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:31.734 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 namespace which is not needed anymore
Sep 30 21:16:31 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Sep 30 21:16:31 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 9.860s CPU time.
Sep 30 21:16:31 compute-1 systemd-machined[152783]: Machine qemu-2-instance-00000002 terminated.
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.833 2 DEBUG nova.compute.manager [req-de50b523-5125-4bce-adf0-960503d68064 req-4be32e27-10c7-4b8b-be95-a20bf231bfb1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.833 2 DEBUG oslo_concurrency.lockutils [req-de50b523-5125-4bce-adf0-960503d68064 req-4be32e27-10c7-4b8b-be95-a20bf231bfb1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.833 2 DEBUG oslo_concurrency.lockutils [req-de50b523-5125-4bce-adf0-960503d68064 req-4be32e27-10c7-4b8b-be95-a20bf231bfb1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.833 2 DEBUG oslo_concurrency.lockutils [req-de50b523-5125-4bce-adf0-960503d68064 req-4be32e27-10c7-4b8b-be95-a20bf231bfb1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.834 2 DEBUG nova.compute.manager [req-de50b523-5125-4bce-adf0-960503d68064 req-4be32e27-10c7-4b8b-be95-a20bf231bfb1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.834 2 DEBUG nova.compute.manager [req-de50b523-5125-4bce-adf0-960503d68064 req-4be32e27-10c7-4b8b-be95-a20bf231bfb1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:16:31 compute-1 NetworkManager[51724]: <info>  [1759266991.8942] manager: (tapf3d16df7-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Sep 30 21:16:31 compute-1 kernel: tapf3d16df7-d4: entered promiscuous mode
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00040|binding|INFO|Claiming lport f3d16df7-d485-4027-b3dc-8671f00a8485 for this chassis.
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00041|binding|INFO|f3d16df7-d485-4027-b3dc-8671f00a8485: Claiming fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f
Sep 30 21:16:31 compute-1 kernel: tapf3d16df7-d4 (unregistering): left promiscuous mode
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220823]: [NOTICE]   (220827) : haproxy version is 2.8.14-c23fe91
Sep 30 21:16:31 compute-1 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220823]: [NOTICE]   (220827) : path to executable is /usr/sbin/haproxy
Sep 30 21:16:31 compute-1 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220823]: [WARNING]  (220827) : Exiting Master process...
Sep 30 21:16:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:31.906 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], port_security=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.4/26 fdfe:381f:8400::15f/64', 'neutron:device_id': '059d7f34-3cb7-4d78-aeee-184c964e6787', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-776029b5-6115-4a10-b114-56157e8d42e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee15beb428a4c5e9726b65c80920da9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b51fed5f-83ee-4757-b171-916e34c6fdb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac973fb-5d64-4a91-8b3a-7dff1eb497e1, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f3d16df7-d485-4027-b3dc-8671f00a8485) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:31 compute-1 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220823]: [ALERT]    (220827) : Current worker (220829) exited with code 143 (Terminated)
Sep 30 21:16:31 compute-1 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220823]: [WARNING]  (220827) : All workers exited. Exiting... (0)
Sep 30 21:16:31 compute-1 systemd[1]: libpod-3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48.scope: Deactivated successfully.
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00042|binding|INFO|Setting lport f3d16df7-d485-4027-b3dc-8671f00a8485 ovn-installed in OVS
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00043|binding|INFO|Setting lport f3d16df7-d485-4027-b3dc-8671f00a8485 up in Southbound
Sep 30 21:16:31 compute-1 podman[220935]: 2025-09-30 21:16:31.926811461 +0000 UTC m=+0.069656223 container died 3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00044|binding|INFO|Releasing lport f3d16df7-d485-4027-b3dc-8671f00a8485 from this chassis (sb_readonly=1)
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00045|if_status|INFO|Not setting lport f3d16df7-d485-4027-b3dc-8671f00a8485 down as sb is readonly
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00046|binding|INFO|Removing iface tapf3d16df7-d4 ovn-installed in OVS
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00047|binding|INFO|Releasing lport f3d16df7-d485-4027-b3dc-8671f00a8485 from this chassis (sb_readonly=0)
Sep 30 21:16:31 compute-1 ovn_controller[94902]: 2025-09-30T21:16:31Z|00048|binding|INFO|Setting lport f3d16df7-d485-4027-b3dc-8671f00a8485 down in Southbound
Sep 30 21:16:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:31.938 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], port_security=['fa:16:3e:7e:bc:74 10.1.0.4 fdfe:381f:8400::15f'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.4/26 fdfe:381f:8400::15f/64', 'neutron:device_id': '059d7f34-3cb7-4d78-aeee-184c964e6787', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-776029b5-6115-4a10-b114-56157e8d42e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee15beb428a4c5e9726b65c80920da9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b51fed5f-83ee-4757-b171-916e34c6fdb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac973fb-5d64-4a91-8b3a-7dff1eb497e1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f3d16df7-d485-4027-b3dc-8671f00a8485) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.964 2 INFO nova.virt.libvirt.driver [-] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Instance destroyed successfully.
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.965 2 DEBUG nova.objects.instance [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lazy-loading 'resources' on Instance uuid 059d7f34-3cb7-4d78-aeee-184c964e6787 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.982 2 DEBUG nova.virt.libvirt.vif [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-423040831-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-423040831-1',id=2,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9ee15beb428a4c5e9726b65c80920da9',ramdisk_id='',reservation_id='r-vbxtb26e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_n
ame='tempest-AutoAllocateNetworkTest-649897189',owner_user_name='tempest-AutoAllocateNetworkTest-649897189-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:16:22Z,user_data=None,user_id='b0c6f6dda88549aabf6a4b039406af8e',uuid=059d7f34-3cb7-4d78-aeee-184c964e6787,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.982 2 DEBUG nova.network.os_vif_util [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converting VIF {"id": "f3d16df7-d485-4027-b3dc-8671f00a8485", "address": "fa:16:3e:7e:bc:74", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::15f", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3d16df7-d4", "ovs_interfaceid": "f3d16df7-d485-4027-b3dc-8671f00a8485", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.983 2 DEBUG nova.network.os_vif_util [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:bc:74,bridge_name='br-int',has_traffic_filtering=True,id=f3d16df7-d485-4027-b3dc-8671f00a8485,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3d16df7-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.984 2 DEBUG os_vif [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:bc:74,bridge_name='br-int',has_traffic_filtering=True,id=f3d16df7-d485-4027-b3dc-8671f00a8485,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3d16df7-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3d16df7-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.993 2 INFO os_vif [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:bc:74,bridge_name='br-int',has_traffic_filtering=True,id=f3d16df7-d485-4027-b3dc-8671f00a8485,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3d16df7-d4')
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.994 2 INFO nova.virt.libvirt.driver [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Deleting instance files /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787_del
Sep 30 21:16:31 compute-1 nova_compute[192795]: 2025-09-30 21:16:31.995 2 INFO nova.virt.libvirt.driver [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Deletion of /var/lib/nova/instances/059d7f34-3cb7-4d78-aeee-184c964e6787_del complete
Sep 30 21:16:31 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48-userdata-shm.mount: Deactivated successfully.
Sep 30 21:16:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-4ed9658e08aece593c1b38f3430c5f28894d8edabc3e6e6b2e68a5c055a9240f-merged.mount: Deactivated successfully.
Sep 30 21:16:32 compute-1 podman[220935]: 2025-09-30 21:16:32.028866171 +0000 UTC m=+0.171710883 container cleanup 3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:16:32 compute-1 systemd[1]: libpod-conmon-3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48.scope: Deactivated successfully.
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.126 2 DEBUG nova.virt.libvirt.host [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.127 2 INFO nova.virt.libvirt.host [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] UEFI support detected
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.130 2 INFO nova.compute.manager [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Took 0.46 seconds to destroy the instance on the hypervisor.
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.130 2 DEBUG oslo.service.loopingcall [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.130 2 DEBUG nova.compute.manager [-] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.130 2 DEBUG nova.network.neutron [-] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:16:32 compute-1 podman[220985]: 2025-09-30 21:16:32.135610357 +0000 UTC m=+0.075172233 container remove 3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923)
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.142 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[80ce8751-85c0-4add-b455-2cf7e9eb7501]: (4, ('Tue Sep 30 09:16:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 (3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48)\n3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48\nTue Sep 30 09:16:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 (3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48)\n3868838830d66795efece13b4d0e898a3f0c42db531504f42e56e6b54c688b48\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.145 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[937b305d-c98d-453f-9576-f08aa22a7df1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.146 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap776029b5-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:32 compute-1 kernel: tap776029b5-60: left promiscuous mode
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.172 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6291bc5f-3408-4538-b2f1-ec945cf65e66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.201 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1f923156-f2b9-4c6b-b555-3972c7d40b02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.202 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ba1578-4c3a-484e-8add-920254806e40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.226 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cb61c75b-f8ea-4e13-a599-cadb2b0d9fa6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368012, 'reachable_time': 23657, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221006, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 systemd[1]: run-netns-ovnmeta\x2d776029b5\x2d6115\x2d4a10\x2db114\x2d56157e8d42e8.mount: Deactivated successfully.
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.244 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.246 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[2e085254-31c2-41b0-b98a-f8c2eec78bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.248 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f3d16df7-d485-4027-b3dc-8671f00a8485 in datapath 776029b5-6115-4a10-b114-56157e8d42e8 unbound from our chassis
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.250 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 776029b5-6115-4a10-b114-56157e8d42e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.251 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a9e09a-c000-4f4d-8871-76b179b73ae3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.251 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f3d16df7-d485-4027-b3dc-8671f00a8485 in datapath 776029b5-6115-4a10-b114-56157e8d42e8 unbound from our chassis
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.253 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 776029b5-6115-4a10-b114-56157e8d42e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:16:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:32.253 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d689c25d-a919-47e1-8ecd-023e46259a8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:32 compute-1 podman[221008]: 2025-09-30 21:16:32.390109338 +0000 UTC m=+0.091070174 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:16:32 compute-1 ovn_controller[94902]: 2025-09-30T21:16:32Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:31:8d 10.100.0.7
Sep 30 21:16:32 compute-1 ovn_controller[94902]: 2025-09-30T21:16:32Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:31:8d 10.100.0.7
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.772 2 INFO nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Took 4.91 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.772 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.799 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0tc_e70n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(f4a68d88-31d7-4bd6-8c59-8f3d78cae759),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.825 2 DEBUG nova.objects.instance [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'migration_context' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.827 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.830 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.831 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.850 2 DEBUG nova.virt.libvirt.vif [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:16:20Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.850 2 DEBUG nova.network.os_vif_util [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.851 2 DEBUG nova.network.os_vif_util [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.851 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 21:16:32 compute-1 nova_compute[192795]:   <mac address="fa:16:3e:a9:31:8d"/>
Sep 30 21:16:32 compute-1 nova_compute[192795]:   <model type="virtio"/>
Sep 30 21:16:32 compute-1 nova_compute[192795]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:16:32 compute-1 nova_compute[192795]:   <mtu size="1442"/>
Sep 30 21:16:32 compute-1 nova_compute[192795]:   <target dev="tap70b5da71-31"/>
Sep 30 21:16:32 compute-1 nova_compute[192795]: </interface>
Sep 30 21:16:32 compute-1 nova_compute[192795]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Sep 30 21:16:32 compute-1 nova_compute[192795]: 2025-09-30 21:16:32.852 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.234 2 DEBUG nova.network.neutron [-] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.255 2 INFO nova.compute.manager [-] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Took 1.12 seconds to deallocate network for instance.
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.334 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.335 2 INFO nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.339 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.339 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.414 2 DEBUG nova.compute.manager [req-2936a158-0d4e-4135-9d81-0a6ab08c3691 req-cca120d8-f091-474c-9ace-e647f4adfb92 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-deleted-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.457 2 DEBUG nova.compute.provider_tree [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.460 2 INFO nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.478 2 DEBUG nova.scheduler.client.report [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.498 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.532 2 INFO nova.scheduler.client.report [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Deleted allocations for instance 059d7f34-3cb7-4d78-aeee-184c964e6787
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.650 2 DEBUG oslo_concurrency.lockutils [None req-80d24532-59d7-4e8c-8467-d5a2a46e9e6c b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.963 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:33 compute-1 nova_compute[192795]: 2025-09-30 21:16:33.965 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.075 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-unplugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.075 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.076 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.076 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.077 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] No waiting events found dispatching network-vif-unplugged-f3d16df7-d485-4027-b3dc-8671f00a8485 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.077 2 WARNING nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received unexpected event network-vif-unplugged-f3d16df7-d485-4027-b3dc-8671f00a8485 for instance with vm_state deleted and task_state None.
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.077 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.078 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.078 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.078 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.079 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.079 2 WARNING nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.080 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.080 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.080 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.081 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.081 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] No waiting events found dispatching network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.081 2 WARNING nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received unexpected event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 for instance with vm_state deleted and task_state None.
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.082 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-changed-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.082 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Refreshing instance network info cache due to event network-changed-70b5da71-314a-4c92-9db2-fb08b57a6736. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.082 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.083 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.083 2 DEBUG nova.network.neutron [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Refreshing network info cache for port 70b5da71-314a-4c92-9db2-fb08b57a6736 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.469 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.470 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.732 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.733 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.733 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.734 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.832 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.901 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.902 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.975 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.976 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:34 compute-1 nova_compute[192795]: 2025-09-30 21:16:34.978 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.173 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.175 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5578MB free_disk=73.43733215332031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.175 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.175 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.216 2 INFO nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating resource usage from migration f4a68d88-31d7-4bd6-8c59-8f3d78cae759
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.253 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Migration f4a68d88-31d7-4bd6-8c59-8f3d78cae759 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.253 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.254 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.298 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.313 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.340 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.340 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.479 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.480 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:35.708 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:35.710 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.985 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:35 compute-1 nova_compute[192795]: 2025-09-30 21:16:35.986 2 DEBUG nova.virt.libvirt.migration [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.164 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759266996.164075, 252d5457-8837-4aa6-b309-c3139e8db7ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.166 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Paused (Lifecycle Event)
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.340 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.341 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.341 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.363 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:36 compute-1 kernel: tap70b5da71-31 (unregistering): left promiscuous mode
Sep 30 21:16:36 compute-1 NetworkManager[51724]: <info>  [1759266996.3728] device (tap70b5da71-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:16:36 compute-1 ovn_controller[94902]: 2025-09-30T21:16:36Z|00049|binding|INFO|Releasing lport 70b5da71-314a-4c92-9db2-fb08b57a6736 from this chassis (sb_readonly=0)
Sep 30 21:16:36 compute-1 ovn_controller[94902]: 2025-09-30T21:16:36Z|00050|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 down in Southbound
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:36 compute-1 ovn_controller[94902]: 2025-09-30T21:16:36Z|00051|binding|INFO|Removing iface tap70b5da71-31 ovn-installed in OVS
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.400 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:31:8d 10.100.0.7'], port_security=['fa:16:3e:a9:31:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3b817c7f-1137-4e8f-8263-8c5e6eddafa4'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '8', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=70b5da71-314a-4c92-9db2-fb08b57a6736) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.401 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.403 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16d40025-1087-460f-a42f-c007f6eff406, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.404 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[84eeb1c0-cab2-4844-befc-11492b1424f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.405 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace which is not needed anymore
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:36 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Deactivated successfully.
Sep 30 21:16:36 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Consumed 13.608s CPU time.
Sep 30 21:16:36 compute-1 systemd-machined[152783]: Machine qemu-1-instance-00000005 terminated.
Sep 30 21:16:36 compute-1 virtqemud[192217]: cannot parse process status data
Sep 30 21:16:36 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[220746]: [NOTICE]   (220750) : haproxy version is 2.8.14-c23fe91
Sep 30 21:16:36 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[220746]: [NOTICE]   (220750) : path to executable is /usr/sbin/haproxy
Sep 30 21:16:36 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[220746]: [WARNING]  (220750) : Exiting Master process...
Sep 30 21:16:36 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[220746]: [ALERT]    (220750) : Current worker (220752) exited with code 143 (Terminated)
Sep 30 21:16:36 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[220746]: [WARNING]  (220750) : All workers exited. Exiting... (0)
Sep 30 21:16:36 compute-1 systemd[1]: libpod-24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b.scope: Deactivated successfully.
Sep 30 21:16:36 compute-1 podman[221070]: 2025-09-30 21:16:36.568408329 +0000 UTC m=+0.050403334 container died 24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:16:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:16:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-77cc9e9680b24ad3c576bee9adf7d7faa748f07809e3f8526ba7be22934a0d87-merged.mount: Deactivated successfully.
Sep 30 21:16:36 compute-1 podman[221070]: 2025-09-30 21:16:36.610206588 +0000 UTC m=+0.092201593 container cleanup 24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.614 2 DEBUG nova.virt.libvirt.guest [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.617 2 INFO nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration operation has completed
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.617 2 INFO nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] _post_live_migration() is started..
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.619 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.619 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.619 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.620 2 DEBUG nova.network.neutron [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updated VIF entry in instance network info cache for port 70b5da71-314a-4c92-9db2-fb08b57a6736. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.620 2 DEBUG nova.network.neutron [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:36 compute-1 systemd[1]: libpod-conmon-24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b.scope: Deactivated successfully.
Sep 30 21:16:36 compute-1 podman[221117]: 2025-09-30 21:16:36.68385517 +0000 UTC m=+0.051353080 container remove 24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.690 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f987f85d-1502-4583-8f39-231c35e15f30]: (4, ('Tue Sep 30 09:16:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b)\n24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b\nTue Sep 30 09:16:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b)\n24aa6862d1b81f55937f1c11da858703dc0f45eede8035e9a534d12ebde7894b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.691 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[de609f73-d50b-4085-b3f9-19c5ef7ea6c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.692 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:36 compute-1 kernel: tap16d40025-10: left promiscuous mode
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.717 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[303c8a1c-a7b1-42c3-ae4b-520190e19d3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.747 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.748 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.748 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.751 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.751 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.752 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.752 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.752 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.752 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] No waiting events found dispatching network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.752 2 WARNING nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received unexpected event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 for instance with vm_state deleted and task_state None.
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.753 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.753 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.753 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.753 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.753 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] No waiting events found dispatching network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.754 2 WARNING nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received unexpected event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 for instance with vm_state deleted and task_state None.
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.754 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-unplugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.754 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.754 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.754 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.754 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] No waiting events found dispatching network-vif-unplugged-f3d16df7-d485-4027-b3dc-8671f00a8485 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.755 2 WARNING nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received unexpected event network-vif-unplugged-f3d16df7-d485-4027-b3dc-8671f00a8485 for instance with vm_state deleted and task_state None.
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.755 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[163b3423-11dd-4485-ada6-dbf4b7936417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.755 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.756 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.756 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.756 2 DEBUG oslo_concurrency.lockutils [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "059d7f34-3cb7-4d78-aeee-184c964e6787-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.756 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d092986c-a014-4aaa-81da-e4573be9aa8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.756 2 DEBUG nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] No waiting events found dispatching network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.756 2 WARNING nova.compute.manager [req-edb8d74e-f13a-4190-be40-81517acc96f5 req-df9417af-55ae-45de-a473-116be74bc799 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Received unexpected event network-vif-plugged-f3d16df7-d485-4027-b3dc-8671f00a8485 for instance with vm_state deleted and task_state None.
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.762 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.762 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.762 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.762 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.781 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf7dd78-0df7-4a21-bee6-31631ed0b24e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367899, 'reachable_time': 33953, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221136, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 systemd[1]: run-netns-ovnmeta\x2d16d40025\x2d1087\x2d460f\x2da42f\x2dc007f6eff406.mount: Deactivated successfully.
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.787 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:16:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:36.787 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[b29bbecf-77d5-4483-9d0e-cd46e310c772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:36 compute-1 nova_compute[192795]: 2025-09-30 21:16:36.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.612 2 DEBUG nova.network.neutron [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Activated binding for port 70b5da71-314a-4c92-9db2-fb08b57a6736 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.613 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.614 2 DEBUG nova.virt.libvirt.vif [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:16:24Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.615 2 DEBUG nova.network.os_vif_util [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.616 2 DEBUG nova.network.os_vif_util [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.617 2 DEBUG os_vif [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.620 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70b5da71-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.670 2 INFO os_vif [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31')
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.670 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.671 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.671 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.671 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.672 2 INFO nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Deleting instance files /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed_del
Sep 30 21:16:37 compute-1 nova_compute[192795]: 2025-09-30 21:16:37.673 2 INFO nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Deletion of /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed_del complete
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.446 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.462 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.462 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:16:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:38.677 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:38.678 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:38.678 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.701 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.701 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.701 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.702 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.702 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.702 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.702 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.702 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.703 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.703 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.703 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.703 2 WARNING nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.703 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.704 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.704 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.704 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.704 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.704 2 WARNING nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.705 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.705 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.705 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.705 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.705 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.705 2 WARNING nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.706 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.706 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.706 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.706 2 DEBUG oslo_concurrency.lockutils [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.706 2 DEBUG nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:38 compute-1 nova_compute[192795]: 2025-09-30 21:16:38.706 2 WARNING nova.compute.manager [req-424da200-f3c4-4480-a18f-02ee9b648765 req-bc7bf135-dd3f-4433-94a7-0047404edd96 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:16:39 compute-1 nova_compute[192795]: 2025-09-30 21:16:39.393 2 DEBUG nova.compute.manager [req-abdfcab1-96dd-42fc-955f-cecbbd7a9fe1 req-6b58161e-a4d8-43b8-bf28-0873894e6c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:39 compute-1 nova_compute[192795]: 2025-09-30 21:16:39.394 2 DEBUG oslo_concurrency.lockutils [req-abdfcab1-96dd-42fc-955f-cecbbd7a9fe1 req-6b58161e-a4d8-43b8-bf28-0873894e6c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:39 compute-1 nova_compute[192795]: 2025-09-30 21:16:39.394 2 DEBUG oslo_concurrency.lockutils [req-abdfcab1-96dd-42fc-955f-cecbbd7a9fe1 req-6b58161e-a4d8-43b8-bf28-0873894e6c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:39 compute-1 nova_compute[192795]: 2025-09-30 21:16:39.394 2 DEBUG oslo_concurrency.lockutils [req-abdfcab1-96dd-42fc-955f-cecbbd7a9fe1 req-6b58161e-a4d8-43b8-bf28-0873894e6c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:39 compute-1 nova_compute[192795]: 2025-09-30 21:16:39.394 2 DEBUG nova.compute.manager [req-abdfcab1-96dd-42fc-955f-cecbbd7a9fe1 req-6b58161e-a4d8-43b8-bf28-0873894e6c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:39 compute-1 nova_compute[192795]: 2025-09-30 21:16:39.395 2 DEBUG nova.compute.manager [req-abdfcab1-96dd-42fc-955f-cecbbd7a9fe1 req-6b58161e-a4d8-43b8-bf28-0873894e6c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:16:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:40.713 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:40 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:16:40 compute-1 systemd[220891]: Activating special unit Exit the Session...
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped target Main User Target.
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped target Basic System.
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped target Paths.
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped target Sockets.
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped target Timers.
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:16:40 compute-1 systemd[220891]: Closed D-Bus User Message Bus Socket.
Sep 30 21:16:40 compute-1 systemd[220891]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:16:40 compute-1 systemd[220891]: Removed slice User Application Slice.
Sep 30 21:16:40 compute-1 systemd[220891]: Reached target Shutdown.
Sep 30 21:16:40 compute-1 systemd[220891]: Finished Exit the Session.
Sep 30 21:16:40 compute-1 systemd[220891]: Reached target Exit the Session.
Sep 30 21:16:40 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:16:40 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:16:40 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:16:40 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:16:40 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:16:40 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:16:40 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:16:40 compute-1 podman[221137]: 2025-09-30 21:16:40.994706145 +0000 UTC m=+0.079113341 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:16:41 compute-1 nova_compute[192795]: 2025-09-30 21:16:41.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:42 compute-1 nova_compute[192795]: 2025-09-30 21:16:42.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.636 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.637 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.638 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.678 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.680 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.681 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.681 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.885 2 WARNING nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.887 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5761MB free_disk=73.46647262573242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.887 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.888 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.934 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Migration for instance 252d5457-8837-4aa6-b309-c3139e8db7ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.957 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.986 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Migration f4a68d88-31d7-4bd6-8c59-8f3d78cae759 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.986 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:16:43 compute-1 nova_compute[192795]: 2025-09-30 21:16:43.987 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:16:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:16:44 compute-1 nova_compute[192795]: 2025-09-30 21:16:44.021 2 DEBUG nova.compute.provider_tree [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:16:44 compute-1 nova_compute[192795]: 2025-09-30 21:16:44.041 2 DEBUG nova.scheduler.client.report [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:16:44 compute-1 nova_compute[192795]: 2025-09-30 21:16:44.076 2 DEBUG nova.compute.resource_tracker [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:16:44 compute-1 nova_compute[192795]: 2025-09-30 21:16:44.077 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:44 compute-1 nova_compute[192795]: 2025-09-30 21:16:44.097 2 INFO nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Sep 30 21:16:44 compute-1 nova_compute[192795]: 2025-09-30 21:16:44.233 2 INFO nova.scheduler.client.report [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Deleted allocation for migration f4a68d88-31d7-4bd6-8c59-8f3d78cae759
Sep 30 21:16:44 compute-1 nova_compute[192795]: 2025-09-30 21:16:44.234 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.178 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating tmpfile /var/lib/nova/instances/tmpfa4_vdvr to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Sep 30 21:16:46 compute-1 podman[221162]: 2025-09-30 21:16:46.231518955 +0000 UTC m=+0.069340445 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:16:46 compute-1 podman[221161]: 2025-09-30 21:16:46.249311307 +0000 UTC m=+0.092929695 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.403 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfa4_vdvr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.792 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.793 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.815 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.938 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.939 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.949 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.949 2 INFO nova.compute.claims [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.962 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759266991.9618092, 059d7f34-3cb7-4d78-aeee-184c964e6787 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:46 compute-1 nova_compute[192795]: 2025-09-30 21:16:46.963 2 INFO nova.compute.manager [-] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] VM Stopped (Lifecycle Event)
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.001 2 DEBUG nova.compute.manager [None req-9bbc9daf-bebe-44be-9927-29a6cb9e9631 - - - - - -] [instance: 059d7f34-3cb7-4d78-aeee-184c964e6787] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.187 2 DEBUG nova.compute.provider_tree [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.205 2 DEBUG nova.scheduler.client.report [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.231 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.232 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.293 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.294 2 DEBUG nova.network.neutron [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.315 2 INFO nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.335 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.491 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.493 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.493 2 INFO nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Creating image(s)
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.494 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "/var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.494 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "/var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.495 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "/var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.508 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.606 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.608 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.610 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.624 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.707 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.709 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.759 2 DEBUG nova.network.neutron [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.760 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.769 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.769 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.770 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.859 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.861 2 DEBUG nova.virt.disk.api [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Checking if we can resize image /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.862 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.930 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.932 2 DEBUG nova.virt.disk.api [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Cannot resize image /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.932 2 DEBUG nova.objects.instance [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'migration_context' on Instance uuid e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.963 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.964 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Ensure instance console log exists: /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.965 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.965 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.966 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.969 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.976 2 WARNING nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.982 2 DEBUG nova.virt.libvirt.host [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.982 2 DEBUG nova.virt.libvirt.host [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.986 2 DEBUG nova.virt.libvirt.host [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.987 2 DEBUG nova.virt.libvirt.host [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.988 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.988 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.989 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.989 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.989 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.989 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.989 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.989 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.990 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.990 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.990 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.990 2 DEBUG nova.virt.hardware [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:16:47 compute-1 nova_compute[192795]: 2025-09-30 21:16:47.993 2 DEBUG nova.objects.instance [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.008 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <uuid>e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63</uuid>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <name>instance-00000006</name>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersOnMultiNodesTest-server-364072886</nova:name>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:16:47</nova:creationTime>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:16:48 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:16:48 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:16:48 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:16:48 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:16:48 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:16:48 compute-1 nova_compute[192795]:         <nova:user uuid="a283310a99174bb794a56e8355b40a03">tempest-ServersOnMultiNodesTest-102021708-project-member</nova:user>
Sep 30 21:16:48 compute-1 nova_compute[192795]:         <nova:project uuid="0d752ea56b394666bd18bda096b07530">tempest-ServersOnMultiNodesTest-102021708</nova:project>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <system>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <entry name="serial">e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63</entry>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <entry name="uuid">e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63</entry>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </system>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <os>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   </os>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <features>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   </features>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk.config"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/console.log" append="off"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <video>
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </video>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:16:48 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:16:48 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:16:48 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:16:48 compute-1 nova_compute[192795]: </domain>
Sep 30 21:16:48 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.070 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.071 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.071 2 INFO nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Using config drive
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.196 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfa4_vdvr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.230 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.230 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.231 2 DEBUG nova.network.neutron [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.281 2 INFO nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Creating config drive at /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk.config
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.287 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5qro148p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:48 compute-1 nova_compute[192795]: 2025-09-30 21:16:48.417 2 DEBUG oslo_concurrency.processutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5qro148p" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:48 compute-1 systemd-machined[152783]: New machine qemu-3-instance-00000006.
Sep 30 21:16:48 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Sep 30 21:16:48 compute-1 podman[221224]: 2025-09-30 21:16:48.600908948 +0000 UTC m=+0.106055908 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.530 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.532 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.533 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267009.529758, e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.533 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] VM Resumed (Lifecycle Event)
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.541 2 INFO nova.virt.libvirt.driver [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Instance spawned successfully.
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.543 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.576 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.586 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.593 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.594 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.595 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.596 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.597 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.598 2 DEBUG nova.virt.libvirt.driver [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.611 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.612 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267009.531541, e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.613 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] VM Started (Lifecycle Event)
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.639 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.645 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.668 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.678 2 INFO nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Took 2.19 seconds to spawn the instance on the hypervisor.
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.679 2 DEBUG nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.763 2 INFO nova.compute.manager [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Took 2.86 seconds to build instance.
Sep 30 21:16:49 compute-1 nova_compute[192795]: 2025-09-30 21:16:49.784 2 DEBUG oslo_concurrency.lockutils [None req-2ace83bc-55b0-441a-9c10-6fe348b067a6 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.805 2 DEBUG nova.network.neutron [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.838 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.862 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfa4_vdvr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.863 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating instance directory: /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.866 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating disk.info with the contents: {'/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk': 'qcow2', '/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.867 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.868 2 DEBUG nova.objects.instance [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:50 compute-1 nova_compute[192795]: 2025-09-30 21:16:50.910 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.007 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
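Annotation: the two records above show nova running `qemu-img info` wrapped in `oslo_concurrency.prlimit`, which sets RLIMIT_AS and RLIMIT_CPU before exec'ing the wrapped command so a malformed image cannot exhaust the host. A minimal sketch of how that argv is assembled (the helper name is hypothetical; the flag values 1073741824 bytes and 30 seconds mirror the log):

```python
def prlimit_qemu_img_info_cmd(path, address_space=1 << 30, cpu_seconds=30):
    """Build the resource-limited 'qemu-img info' argv seen in the log.

    oslo_concurrency.prlimit applies the address-space and CPU-time
    rlimits, then exec's everything after '--'. The LC_ALL=C/LANG=C env
    keeps qemu-img output locale-independent and parseable.
    """
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={address_space}", f"--cpu={cpu_seconds}", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path,
        "--force-share",       # allow querying an image that is in use
        "--output=json",
    ]
```

The same shape recurs for every `qemu-img info` call in this migration sequence; only the image path changes.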
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.008 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.009 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.032 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.093 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.095 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.142 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
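Annotation: the `qemu-img create` above gives the destination host a copy-on-write qcow2 overlay backed by the shared `_base` image, so only blocks the guest has modified need to travel during the block migration. A sketch of the same argv construction (helper name hypothetical; the option string mirrors the log record):

```python
def qemu_img_create_overlay_cmd(backing_file, dest_path, size_bytes):
    """Build the qcow2 overlay-creation argv seen in the log.

    The new image at dest_path starts empty and transparently reads any
    unmodified block from backing_file (a raw base image), which is why
    nova verifies the backing file exists on the destination first.
    """
    opts = f"backing_file={backing_file},backing_fmt=raw"
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2", "-o", opts,
        dest_path, str(size_bytes),
    ]
```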
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.144 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.145 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.220 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.222 2 DEBUG nova.virt.disk.api [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Checking if we can resize image /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.223 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:51 compute-1 podman[221276]: 2025-09-30 21:16:51.264206307 +0000 UTC m=+0.092343947 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.292 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.294 2 DEBUG nova.virt.disk.api [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Cannot resize image /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.295 2 DEBUG nova.objects.instance [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'migration_context' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.336 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.367 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.374 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config to /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.375 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.611 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759266996.610089, 252d5457-8837-4aa6-b309-c3139e8db7ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.612 2 INFO nova.compute.manager [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Stopped (Lifecycle Event)
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.634 2 DEBUG nova.compute.manager [None req-41d7c228-0e43-4d6c-a950-173c10c03ec9 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.961 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed" returned: 0 in 0.586s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.962 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.963 2 DEBUG nova.virt.libvirt.vif [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:16:42Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.964 2 DEBUG nova.network.os_vif_util [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.964 2 DEBUG nova.network.os_vif_util [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.965 2 DEBUG os_vif [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.966 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.966 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70b5da71-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.972 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70b5da71-31, col_values=(('external_ids', {'iface-id': '70b5da71-314a-4c92-9db2-fb08b57a6736', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:31:8d', 'vm-uuid': '252d5457-8837-4aa6-b309-c3139e8db7ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:51 compute-1 NetworkManager[51724]: <info>  [1759267011.9758] manager: (tap70b5da71-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.982 2 INFO os_vif [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31')
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.982 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Sep 30 21:16:51 compute-1 nova_compute[192795]: 2025-09-30 21:16:51.983 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfa4_vdvr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Sep 30 21:16:53 compute-1 nova_compute[192795]: 2025-09-30 21:16:53.998 2 DEBUG nova.network.neutron [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Sep 30 21:16:54 compute-1 nova_compute[192795]: 2025-09-30 21:16:54.008 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfa4_vdvr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Sep 30 21:16:54 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 21:16:54 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 21:16:54 compute-1 kernel: tap70b5da71-31: entered promiscuous mode
Sep 30 21:16:54 compute-1 NetworkManager[51724]: <info>  [1759267014.4516] manager: (tap70b5da71-31): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Sep 30 21:16:54 compute-1 ovn_controller[94902]: 2025-09-30T21:16:54Z|00052|binding|INFO|Claiming lport 70b5da71-314a-4c92-9db2-fb08b57a6736 for this additional chassis.
Sep 30 21:16:54 compute-1 ovn_controller[94902]: 2025-09-30T21:16:54Z|00053|binding|INFO|70b5da71-314a-4c92-9db2-fb08b57a6736: Claiming fa:16:3e:a9:31:8d 10.100.0.7
Sep 30 21:16:54 compute-1 nova_compute[192795]: 2025-09-30 21:16:54.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:54 compute-1 nova_compute[192795]: 2025-09-30 21:16:54.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:54 compute-1 systemd-udevd[221341]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:54 compute-1 ovn_controller[94902]: 2025-09-30T21:16:54Z|00054|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 ovn-installed in OVS
Sep 30 21:16:54 compute-1 systemd-machined[152783]: New machine qemu-4-instance-00000005.
Sep 30 21:16:54 compute-1 nova_compute[192795]: 2025-09-30 21:16:54.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:54 compute-1 NetworkManager[51724]: <info>  [1759267014.5371] device (tap70b5da71-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:16:54 compute-1 NetworkManager[51724]: <info>  [1759267014.5385] device (tap70b5da71-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:16:54 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000005.
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.023 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267016.022605, 252d5457-8837-4aa6-b309-c3139e8db7ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.025 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Started (Lifecycle Event)
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.046 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.817 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267016.8164856, 252d5457-8837-4aa6-b309-c3139e8db7ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.818 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Resumed (Lifecycle Event)
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.839 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.843 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.862 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Sep 30 21:16:56 compute-1 nova_compute[192795]: 2025-09-30 21:16:56.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:58 compute-1 ovn_controller[94902]: 2025-09-30T21:16:58Z|00055|binding|INFO|Claiming lport 70b5da71-314a-4c92-9db2-fb08b57a6736 for this chassis.
Sep 30 21:16:58 compute-1 ovn_controller[94902]: 2025-09-30T21:16:58Z|00056|binding|INFO|70b5da71-314a-4c92-9db2-fb08b57a6736: Claiming fa:16:3e:a9:31:8d 10.100.0.7
Sep 30 21:16:58 compute-1 ovn_controller[94902]: 2025-09-30T21:16:58Z|00057|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 up in Southbound
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.058 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:31:8d 10.100.0.7'], port_security=['fa:16:3e:a9:31:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '21', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=70b5da71-314a-4c92-9db2-fb08b57a6736) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.061 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 in datapath 16d40025-1087-460f-a42f-c007f6eff406 bound to our chassis
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.063 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.082 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[913a41b6-8010-4697-9eed-c75b43f1bb48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.083 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16d40025-11 in ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.085 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16d40025-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.085 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d114c052-5039-4da8-bc0d-c120b3ad173d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.086 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2f3d5e8d-fe73-42f8-8ee0-a7050f89f63a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.116 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa9c7a0-95ac-4b38-aa03-24c21e652b93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.154 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e6e8f3-a0a5-4ac8-93d3-ff6046347378]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.191 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b79c6455-a173-47a0-9f11-c8e5c9f2060a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.197 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[04aa052e-5ca5-4f1c-97eb-575c1aeb7762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 NetworkManager[51724]: <info>  [1759267018.1989] manager: (tap16d40025-10): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Sep 30 21:16:58 compute-1 nova_compute[192795]: 2025-09-30 21:16:58.226 2 INFO nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Post operation of migration started
Sep 30 21:16:58 compute-1 systemd-udevd[221376]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.251 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc179a0-db07-446d-8c22-66c01860ac4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.254 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[caa44c2a-f949-4abb-b638-495d1159b599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 NetworkManager[51724]: <info>  [1759267018.2839] device (tap16d40025-10): carrier: link connected
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.291 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d4d8f6-14b6-452a-9e78-b2102cf643d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.307 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[780c2951-1377-4a79-8e48-122ef3f4d200]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371395, 'reachable_time': 22266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221405, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.322 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb78b90-cdce-4e2e-850a-6e2ec1169379]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:c752'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371395, 'tstamp': 371395}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221406, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.347 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5acdd2cd-66e5-4f2f-985a-37a6b903e786]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371395, 'reachable_time': 22266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221410, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 podman[221383]: 2025-09-30 21:16:58.389070348 +0000 UTC m=+0.111635120 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.392 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9f68973d-4c63-443b-8a75-a54a1df48fc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.486 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8afc59a5-d844-49ec-bb57-55ecfab8da65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.488 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.488 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.489 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d40025-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:58 compute-1 NetworkManager[51724]: <info>  [1759267018.4923] manager: (tap16d40025-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Sep 30 21:16:58 compute-1 kernel: tap16d40025-10: entered promiscuous mode
Sep 30 21:16:58 compute-1 nova_compute[192795]: 2025-09-30 21:16:58.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.496 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16d40025-10, col_values=(('external_ids', {'iface-id': '0c66892e-7baf-4f9a-a329-dd0545dbf700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:58 compute-1 ovn_controller[94902]: 2025-09-30T21:16:58Z|00058|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:16:58 compute-1 nova_compute[192795]: 2025-09-30 21:16:58.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.499 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.500 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e8220ece-95e2-4fb1-95d8-1373a559909e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.501 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:16:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:16:58.502 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'env', 'PROCESS_TAG=haproxy-16d40025-1087-460f-a42f-c007f6eff406', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16d40025-1087-460f-a42f-c007f6eff406.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:16:58 compute-1 nova_compute[192795]: 2025-09-30 21:16:58.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:58 compute-1 podman[221449]: 2025-09-30 21:16:58.900132986 +0000 UTC m=+0.051879244 container create 5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:16:58 compute-1 systemd[1]: Started libpod-conmon-5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5.scope.
Sep 30 21:16:58 compute-1 podman[221449]: 2025-09-30 21:16:58.873713181 +0000 UTC m=+0.025459459 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:16:58 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:16:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77042a2d836d8793894449ee2cc50aecbdfb2eacae855148d8250178088f11c3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:16:59 compute-1 podman[221449]: 2025-09-30 21:16:59.006609685 +0000 UTC m=+0.158355993 container init 5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:16:59 compute-1 podman[221449]: 2025-09-30 21:16:59.013493281 +0000 UTC m=+0.165239549 container start 5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:16:59 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221465]: [NOTICE]   (221469) : New worker (221471) forked
Sep 30 21:16:59 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221465]: [NOTICE]   (221469) : Loading success.
Sep 30 21:16:59 compute-1 nova_compute[192795]: 2025-09-30 21:16:59.076 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:59 compute-1 nova_compute[192795]: 2025-09-30 21:16:59.077 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:59 compute-1 nova_compute[192795]: 2025-09-30 21:16:59.077 2 DEBUG nova.network.neutron [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:00 compute-1 nova_compute[192795]: 2025-09-30 21:17:00.794 2 DEBUG nova.network.neutron [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:00 compute-1 nova_compute[192795]: 2025-09-30 21:17:00.816 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:00 compute-1 nova_compute[192795]: 2025-09-30 21:17:00.845 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:00 compute-1 nova_compute[192795]: 2025-09-30 21:17:00.846 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:00 compute-1 nova_compute[192795]: 2025-09-30 21:17:00.846 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:00 compute-1 nova_compute[192795]: 2025-09-30 21:17:00.853 2 INFO nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 21:17:00 compute-1 virtqemud[192217]: Domain id=4 name='instance-00000005' uuid=252d5457-8837-4aa6-b309-c3139e8db7ed is tainted: custom-monitor
Sep 30 21:17:01 compute-1 podman[221481]: 2025-09-30 21:17:01.258571373 +0000 UTC m=+0.089653905 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:17:01 compute-1 nova_compute[192795]: 2025-09-30 21:17:01.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:01 compute-1 nova_compute[192795]: 2025-09-30 21:17:01.863 2 INFO nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 21:17:01 compute-1 nova_compute[192795]: 2025-09-30 21:17:01.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:02 compute-1 nova_compute[192795]: 2025-09-30 21:17:02.871 2 INFO nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 21:17:02 compute-1 nova_compute[192795]: 2025-09-30 21:17:02.878 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:02 compute-1 nova_compute[192795]: 2025-09-30 21:17:02.899 2 DEBUG nova.objects.instance [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.084 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.085 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.124 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.228 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.229 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-1 podman[221517]: 2025-09-30 21:17:03.232868393 +0000 UTC m=+0.069315286 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.235 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.240 2 INFO nova.compute.claims [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.410 2 DEBUG nova.compute.provider_tree [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.448 2 DEBUG nova.scheduler.client.report [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.475 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.500 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "eb3c2680-7766-4b45-bf3b-d565d1ab9164" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.501 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "eb3c2680-7766-4b45-bf3b-d565d1ab9164" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.525 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] No node specified, defaulting to compute-1.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.679 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "eb3c2680-7766-4b45-bf3b-d565d1ab9164" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.681 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.758 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.759 2 DEBUG nova.network.neutron [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.774 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.795 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.903 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.905 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.905 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Creating image(s)
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.906 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "/var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.906 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "/var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.907 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "/var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.926 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.990 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.991 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-1 nova_compute[192795]: 2025-09-30 21:17:03.992 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.004 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.096 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.097 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.243 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk 1073741824" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.244 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.244 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.291 2 DEBUG nova.network.neutron [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.291 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.305 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.305 2 DEBUG nova.virt.disk.api [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Checking if we can resize image /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.306 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.395 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.396 2 DEBUG nova.virt.disk.api [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Cannot resize image /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.397 2 DEBUG nova.objects.instance [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'migration_context' on Instance uuid 964c6b99-4f8a-4aa9-a519-7bde11e8a447 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.408 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.408 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Ensure instance console log exists: /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.408 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.409 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.409 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.410 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.415 2 WARNING nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.419 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.420 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.423 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.423 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.424 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.425 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.425 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.425 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.425 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.426 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.426 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.426 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.426 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.426 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.427 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.427 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.430 2 DEBUG nova.objects.instance [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'pci_devices' on Instance uuid 964c6b99-4f8a-4aa9-a519-7bde11e8a447 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.457 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <uuid>964c6b99-4f8a-4aa9-a519-7bde11e8a447</uuid>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <name>instance-0000000a</name>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersOnMultiNodesTest-server-1032185460-2</nova:name>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:17:04</nova:creationTime>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:17:04 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:17:04 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:17:04 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:17:04 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:17:04 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:17:04 compute-1 nova_compute[192795]:         <nova:user uuid="a283310a99174bb794a56e8355b40a03">tempest-ServersOnMultiNodesTest-102021708-project-member</nova:user>
Sep 30 21:17:04 compute-1 nova_compute[192795]:         <nova:project uuid="0d752ea56b394666bd18bda096b07530">tempest-ServersOnMultiNodesTest-102021708</nova:project>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <system>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <entry name="serial">964c6b99-4f8a-4aa9-a519-7bde11e8a447</entry>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <entry name="uuid">964c6b99-4f8a-4aa9-a519-7bde11e8a447</entry>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </system>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <os>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   </os>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <features>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   </features>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk.config"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/console.log" append="off"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <video>
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </video>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:17:04 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:17:04 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:17:04 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:17:04 compute-1 nova_compute[192795]: </domain>
Sep 30 21:17:04 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.553 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.554 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.554 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Using config drive
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.707 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Creating config drive at /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk.config
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.711 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiylzbhnf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-1 nova_compute[192795]: 2025-09-30 21:17:04.835 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiylzbhnf" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-1 systemd-machined[152783]: New machine qemu-5-instance-0000000a.
Sep 30 21:17:04 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-0000000a.
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.061 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267026.0608408, 964c6b99-4f8a-4aa9-a519-7bde11e8a447 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.063 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] VM Resumed (Lifecycle Event)
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.074 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.078 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.086 2 INFO nova.virt.libvirt.driver [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Instance spawned successfully.
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.087 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.094 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.099 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.130 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.131 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267026.0611756, 964c6b99-4f8a-4aa9-a519-7bde11e8a447 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.131 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] VM Started (Lifecycle Event)
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.137 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.138 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.139 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.139 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.140 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.141 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.180 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.185 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.225 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.259 2 INFO nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Took 2.36 seconds to spawn the instance on the hypervisor.
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.260 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.347 2 INFO nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Took 3.16 seconds to build instance.
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.416 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:06 compute-1 nova_compute[192795]: 2025-09-30 21:17:06.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.396 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.397 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.397 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.398 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.398 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.414 2 INFO nova.compute.manager [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Terminating instance
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.430 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "refresh_cache-964c6b99-4f8a-4aa9-a519-7bde11e8a447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.431 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquired lock "refresh_cache-964c6b99-4f8a-4aa9-a519-7bde11e8a447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.431 2 DEBUG nova.network.neutron [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:08 compute-1 nova_compute[192795]: 2025-09-30 21:17:08.633 2 DEBUG nova.network.neutron [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.029 2 DEBUG nova.network.neutron [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.045 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Releasing lock "refresh_cache-964c6b99-4f8a-4aa9-a519-7bde11e8a447" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.045 2 DEBUG nova.compute.manager [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:17:09 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Sep 30 21:17:09 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Consumed 4.006s CPU time.
Sep 30 21:17:09 compute-1 systemd-machined[152783]: Machine qemu-5-instance-0000000a terminated.
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.312 2 INFO nova.virt.libvirt.driver [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Instance destroyed successfully.
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.313 2 DEBUG nova.objects.instance [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'resources' on Instance uuid 964c6b99-4f8a-4aa9-a519-7bde11e8a447 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.356 2 INFO nova.virt.libvirt.driver [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Deleting instance files /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447_del
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.358 2 INFO nova.virt.libvirt.driver [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Deletion of /var/lib/nova/instances/964c6b99-4f8a-4aa9-a519-7bde11e8a447_del complete
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.455 2 INFO nova.compute.manager [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.456 2 DEBUG oslo.service.loopingcall [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.457 2 DEBUG nova.compute.manager [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.457 2 DEBUG nova.network.neutron [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.631 2 DEBUG nova.network.neutron [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.647 2 DEBUG nova.network.neutron [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.658 2 INFO nova.compute.manager [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Took 0.20 seconds to deallocate network for instance.
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.730 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.730 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.849 2 DEBUG nova.compute.provider_tree [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.878 2 DEBUG nova.scheduler.client.report [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.906 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.937 2 INFO nova.scheduler.client.report [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Deleted allocations for instance 964c6b99-4f8a-4aa9-a519-7bde11e8a447
Sep 30 21:17:09 compute-1 nova_compute[192795]: 2025-09-30 21:17:09.997 2 DEBUG oslo_concurrency.lockutils [None req-1e0b0598-a457-41b4-b459-0dddbd994cf7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "964c6b99-4f8a-4aa9-a519-7bde11e8a447" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:11 compute-1 podman[221593]: 2025-09-30 21:17:11.263914184 +0000 UTC m=+0.094080276 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:17:11 compute-1 nova_compute[192795]: 2025-09-30 21:17:11.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:11 compute-1 nova_compute[192795]: 2025-09-30 21:17:11.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.971 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.972 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.972 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.972 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.972 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.982 2 INFO nova.compute.manager [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Terminating instance
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.993 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "refresh_cache-e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.994 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquired lock "refresh_cache-e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:14 compute-1 nova_compute[192795]: 2025-09-30 21:17:14.994 2 DEBUG nova.network.neutron [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.200 2 DEBUG nova.network.neutron [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.496 2 DEBUG nova.network.neutron [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.511 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Releasing lock "refresh_cache-e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.512 2 DEBUG nova.compute.manager [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:17:15 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Sep 30 21:17:15 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 14.205s CPU time.
Sep 30 21:17:15 compute-1 systemd-machined[152783]: Machine qemu-3-instance-00000006 terminated.
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.771 2 INFO nova.virt.libvirt.driver [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Instance destroyed successfully.
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.773 2 DEBUG nova.objects.instance [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'resources' on Instance uuid e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.787 2 INFO nova.virt.libvirt.driver [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Deleting instance files /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63_del
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.788 2 INFO nova.virt.libvirt.driver [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Deletion of /var/lib/nova/instances/e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63_del complete
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.867 2 INFO nova.compute.manager [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Took 0.35 seconds to destroy the instance on the hypervisor.
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.868 2 DEBUG oslo.service.loopingcall [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.868 2 DEBUG nova.compute.manager [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:17:15 compute-1 nova_compute[192795]: 2025-09-30 21:17:15.868 2 DEBUG nova.network.neutron [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.001 2 DEBUG nova.network.neutron [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.017 2 DEBUG nova.network.neutron [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.035 2 INFO nova.compute.manager [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Took 0.17 seconds to deallocate network for instance.
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.123 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.124 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.227 2 DEBUG nova.compute.provider_tree [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.252 2 DEBUG nova.scheduler.client.report [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.275 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.299 2 INFO nova.scheduler.client.report [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Deleted allocations for instance e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.386 2 DEBUG oslo_concurrency.lockutils [None req-48d9c971-5816-47d7-bbf8-01a4b6c623b7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:16 compute-1 nova_compute[192795]: 2025-09-30 21:17:16.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:17 compute-1 podman[221624]: 2025-09-30 21:17:17.232991704 +0000 UTC m=+0.070302551 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:17:17 compute-1 podman[221623]: 2025-09-30 21:17:17.241161375 +0000 UTC m=+0.087092136 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:17:19 compute-1 podman[221669]: 2025-09-30 21:17:19.31065452 +0000 UTC m=+0.137926361 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:17:21 compute-1 nova_compute[192795]: 2025-09-30 21:17:21.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:22 compute-1 podman[221695]: 2025-09-30 21:17:22.24100741 +0000 UTC m=+0.072001768 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.715 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.715 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.731 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.823 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.824 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.831 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.832 2 INFO nova.compute.claims [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.966 2 DEBUG nova.compute.provider_tree [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:22 compute-1 nova_compute[192795]: 2025-09-30 21:17:22.981 2 DEBUG nova.scheduler.client.report [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.003 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.004 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.068 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.069 2 DEBUG nova.network.neutron [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.092 2 INFO nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.117 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.241 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.243 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.243 2 INFO nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Creating image(s)
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.244 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "/var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.244 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "/var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.245 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "/var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.257 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.355 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.356 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.357 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.368 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.425 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.426 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.782 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk 1073741824" returned: 0 in 0.355s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.783 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.783 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.838 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.839 2 DEBUG nova.virt.disk.api [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Checking if we can resize image /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.840 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.919 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.921 2 DEBUG nova.virt.disk.api [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Cannot resize image /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.922 2 DEBUG nova.objects.instance [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lazy-loading 'migration_context' on Instance uuid 709e44da-6758-4be0-9022-83a511ea88eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.949 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.949 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Ensure instance console log exists: /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.950 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.950 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:23 compute-1 nova_compute[192795]: 2025-09-30 21:17:23.950 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:24 compute-1 nova_compute[192795]: 2025-09-30 21:17:24.123 2 DEBUG nova.policy [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0345cb3cbb454545b3c51ba5db7a66fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8682f64a38a84e09b7a6bf7d33b1a5aa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:17:24 compute-1 nova_compute[192795]: 2025-09-30 21:17:24.310 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267029.3096473, 964c6b99-4f8a-4aa9-a519-7bde11e8a447 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:24 compute-1 nova_compute[192795]: 2025-09-30 21:17:24.310 2 INFO nova.compute.manager [-] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] VM Stopped (Lifecycle Event)
Sep 30 21:17:24 compute-1 nova_compute[192795]: 2025-09-30 21:17:24.333 2 DEBUG nova.compute.manager [None req-3c04089a-6f20-4d5b-8b4e-48245fb1610e - - - - - -] [instance: 964c6b99-4f8a-4aa9-a519-7bde11e8a447] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:26 compute-1 nova_compute[192795]: 2025-09-30 21:17:26.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:26 compute-1 nova_compute[192795]: 2025-09-30 21:17:26.495 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating tmpfile /var/lib/nova/instances/tmpghl6923x to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Sep 30 21:17:26 compute-1 nova_compute[192795]: 2025-09-30 21:17:26.496 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpghl6923x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Sep 30 21:17:26 compute-1 nova_compute[192795]: 2025-09-30 21:17:26.595 2 DEBUG nova.network.neutron [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Successfully created port: 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:17:26 compute-1 nova_compute[192795]: 2025-09-30 21:17:26.677 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Creating tmpfile /var/lib/nova/instances/tmps3z4e6mm to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Sep 30 21:17:26 compute-1 nova_compute[192795]: 2025-09-30 21:17:26.678 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps3z4e6mm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Sep 30 21:17:27 compute-1 nova_compute[192795]: 2025-09-30 21:17:27.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.018 2 DEBUG nova.network.neutron [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Successfully updated port: 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.035 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.036 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquired lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.036 2 DEBUG nova.network.neutron [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.116 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpghl6923x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.155 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.156 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.156 2 DEBUG nova.network.neutron [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.192 2 DEBUG nova.compute.manager [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-changed-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.193 2 DEBUG nova.compute.manager [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Refreshing instance network info cache due to event network-changed-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.193 2 DEBUG oslo_concurrency.lockutils [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:28 compute-1 nova_compute[192795]: 2025-09-30 21:17:28.946 2 DEBUG nova.network.neutron [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:29 compute-1 podman[221730]: 2025-09-30 21:17:29.21575352 +0000 UTC m=+0.057291840 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 21:17:29 compute-1 nova_compute[192795]: 2025-09-30 21:17:29.976 2 DEBUG nova.network.neutron [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updating instance_info_cache with network_info: [{"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.013 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Releasing lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.013 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Instance network_info: |[{"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.014 2 DEBUG oslo_concurrency.lockutils [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.014 2 DEBUG nova.network.neutron [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Refreshing network info cache for port 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.016 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Start _get_guest_xml network_info=[{"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.021 2 WARNING nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.025 2 DEBUG nova.virt.libvirt.host [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.026 2 DEBUG nova.virt.libvirt.host [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.029 2 DEBUG nova.virt.libvirt.host [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.029 2 DEBUG nova.virt.libvirt.host [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.030 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.030 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='342903211',id=15,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1442213639',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.031 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.031 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.031 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.031 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.031 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.032 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.032 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.032 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.032 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.032 2 DEBUG nova.virt.hardware [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.036 2 DEBUG nova.virt.libvirt.vif [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:17:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-83880644',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-83880644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(15),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-83880644',id=13,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=15,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAnWpckxr0DtyXLfS0GzNvnNTp1bsVurnd6EmGZMCR98B6S3IXvoBvWtgyqHzjIndUqkBC7I7pwh8DnLzZoREkpILcp0NScAgAtJWk4up+BRbmxNlgueA/Mrcv3rAMqgAA==',key_name='tempest-keypair-1892732760',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8682f64a38a84e09b7a6bf7d33b1a5aa',ramdisk_id='',reservation_id='r-gvtbmxlm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-676823569',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-676823569-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0345cb3cbb454545b3c51ba5db7a66fc',uuid=709e44da-6758-4be0-9022-83a511ea88eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.036 2 DEBUG nova.network.os_vif_util [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Converting VIF {"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.037 2 DEBUG nova.network.os_vif_util [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:98:46,bridge_name='br-int',has_traffic_filtering=True,id=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f,network=Network(53e65989-8c98-4f88-9650-e62767650e5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35b0bc25-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.038 2 DEBUG nova.objects.instance [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lazy-loading 'pci_devices' on Instance uuid 709e44da-6758-4be0-9022-83a511ea88eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.054 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <uuid>709e44da-6758-4be0-9022-83a511ea88eb</uuid>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <name>instance-0000000d</name>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-83880644</nova:name>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:17:30</nova:creationTime>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <nova:flavor name="tempest-flavor_with_ephemeral_0-1442213639">
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:user uuid="0345cb3cbb454545b3c51ba5db7a66fc">tempest-ServersWithSpecificFlavorTestJSON-676823569-project-member</nova:user>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:project uuid="8682f64a38a84e09b7a6bf7d33b1a5aa">tempest-ServersWithSpecificFlavorTestJSON-676823569</nova:project>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         <nova:port uuid="35b0bc25-77bf-479e-9d9b-4e20e3a9da8f">
Sep 30 21:17:30 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <system>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <entry name="serial">709e44da-6758-4be0-9022-83a511ea88eb</entry>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <entry name="uuid">709e44da-6758-4be0-9022-83a511ea88eb</entry>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </system>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <os>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   </os>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <features>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   </features>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk.config"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:af:98:46"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <target dev="tap35b0bc25-77"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/console.log" append="off"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <video>
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </video>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:17:30 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:17:30 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:17:30 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:17:30 compute-1 nova_compute[192795]: </domain>
Sep 30 21:17:30 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.055 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Preparing to wait for external event network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.056 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.056 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.056 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.057 2 DEBUG nova.virt.libvirt.vif [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:17:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-83880644',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-83880644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(15),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-83880644',id=13,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=15,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAnWpckxr0DtyXLfS0GzNvnNTp1bsVurnd6EmGZMCR98B6S3IXvoBvWtgyqHzjIndUqkBC7I7pwh8DnLzZoREkpILcp0NScAgAtJWk4up+BRbmxNlgueA/Mrcv3rAMqgAA==',key_name='tempest-keypair-1892732760',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8682f64a38a84e09b7a6bf7d33b1a5aa',ramdisk_id='',reservation_id='r-gvtbmxlm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-676823569',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-676823569-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0345cb3cbb454545b3c51ba5db7a66fc',uuid=709e44da-6758-4be0-9022-83a511ea88eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.058 2 DEBUG nova.network.os_vif_util [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Converting VIF {"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.058 2 DEBUG nova.network.os_vif_util [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:98:46,bridge_name='br-int',has_traffic_filtering=True,id=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f,network=Network(53e65989-8c98-4f88-9650-e62767650e5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35b0bc25-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.059 2 DEBUG os_vif [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:98:46,bridge_name='br-int',has_traffic_filtering=True,id=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f,network=Network(53e65989-8c98-4f88-9650-e62767650e5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35b0bc25-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.060 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.061 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap35b0bc25-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.068 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap35b0bc25-77, col_values=(('external_ids', {'iface-id': '35b0bc25-77bf-479e-9d9b-4e20e3a9da8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:98:46', 'vm-uuid': '709e44da-6758-4be0-9022-83a511ea88eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:30 compute-1 NetworkManager[51724]: <info>  [1759267050.0713] manager: (tap35b0bc25-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.079 2 INFO os_vif [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:98:46,bridge_name='br-int',has_traffic_filtering=True,id=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f,network=Network(53e65989-8c98-4f88-9650-e62767650e5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35b0bc25-77')
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.136 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.136 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.137 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] No VIF found with MAC fa:16:3e:af:98:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.137 2 INFO nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Using config drive
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.293 2 DEBUG nova.network.neutron [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.328 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.340 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpghl6923x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.341 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating instance directory: /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.341 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating disk.info with the contents: {'/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk': 'qcow2', '/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.342 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.342 2 DEBUG nova.objects.instance [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.370 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.449 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.450 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.450 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.461 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.517 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.519 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.667 2 INFO nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Creating config drive at /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk.config
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.680 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1tzz5sh6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.768 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267035.7670574, e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.769 2 INFO nova.compute.manager [-] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] VM Stopped (Lifecycle Event)
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.795 2 DEBUG nova.compute.manager [None req-51817860-2615-4739-b8d7-8dcefa550401 - - - - - -] [instance: e5ff8a3f-b0c3-42d6-a5e6-150e8c7a0a63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.808 2 DEBUG oslo_concurrency.processutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1tzz5sh6" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.811 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk 1073741824" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.812 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.813 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:30 compute-1 kernel: tap35b0bc25-77: entered promiscuous mode
Sep 30 21:17:30 compute-1 ovn_controller[94902]: 2025-09-30T21:17:30Z|00059|binding|INFO|Claiming lport 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f for this chassis.
Sep 30 21:17:30 compute-1 NetworkManager[51724]: <info>  [1759267050.8573] manager: (tap35b0bc25-77): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Sep 30 21:17:30 compute-1 ovn_controller[94902]: 2025-09-30T21:17:30Z|00060|binding|INFO|35b0bc25-77bf-479e-9d9b-4e20e3a9da8f: Claiming fa:16:3e:af:98:46 10.100.0.11
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.872 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:98:46 10.100.0.11'], port_security=['fa:16:3e:af:98:46 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '709e44da-6758-4be0-9022-83a511ea88eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53e65989-8c98-4f88-9650-e62767650e5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8682f64a38a84e09b7a6bf7d33b1a5aa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1f96f39e-fd28-443a-9a6a-f8657fefd81a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=438afc58-4fac-4a65-8830-7d1a0f05079c, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.873 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f in datapath 53e65989-8c98-4f88-9650-e62767650e5a bound to our chassis
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.875 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 53e65989-8c98-4f88-9650-e62767650e5a
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.879 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.880 2 DEBUG nova.virt.disk.api [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Checking if we can resize image /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.881 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:30 compute-1 systemd-udevd[221781]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:30 compute-1 systemd-machined[152783]: New machine qemu-6-instance-0000000d.
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.888 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8da500a5-e8b1-4b8c-b8ea-f374c3af4435]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.889 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap53e65989-81 in ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.891 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap53e65989-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.891 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3764a606-110d-442f-9e63-6fdd78897edf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.892 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f66a87-166c-414a-b627-02b837789554]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 NetworkManager[51724]: <info>  [1759267050.8959] device (tap35b0bc25-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:17:30 compute-1 NetworkManager[51724]: <info>  [1759267050.8969] device (tap35b0bc25-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.904 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[2f57c6ce-77ca-49f3-b532-364eca73cbc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-0000000d.
Sep 30 21:17:30 compute-1 ovn_controller[94902]: 2025-09-30T21:17:30Z|00061|binding|INFO|Setting lport 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f ovn-installed in OVS
Sep 30 21:17:30 compute-1 ovn_controller[94902]: 2025-09-30T21:17:30Z|00062|binding|INFO|Setting lport 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f up in Southbound
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.932 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[105f5c39-5232-4824-b7d2-cb081f4989cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.946 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.947 2 DEBUG nova.virt.disk.api [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Cannot resize image /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.947 2 DEBUG nova.objects.instance [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.959 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[99ffa4d3-c0d2-48a6-8674-4eecf88a3cde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.964 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[27585c91-72c2-477c-a274-0766fe001293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 systemd-udevd[221786]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:30 compute-1 NetworkManager[51724]: <info>  [1759267050.9662] manager: (tap53e65989-80): new Veth device (/org/freedesktop/NetworkManager/Devices/38)
Sep 30 21:17:30 compute-1 nova_compute[192795]: 2025-09-30 21:17:30.978 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.995 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[13677c0d-0cfa-41d1-887f-b7275ec0fc46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:30.998 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5a397a49-3c3e-440a-8dbe-907080beec57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.018 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config 485376" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.019 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config to /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.019 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:31 compute-1 NetworkManager[51724]: <info>  [1759267051.0228] device (tap53e65989-80): carrier: link connected
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.027 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0d35e4-bb77-4223-a87a-20816635ac53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.045 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1f075873-d752-4be2-8c81-eae722937200]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap53e65989-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:37:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374669, 'reachable_time': 36486, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221821, 'error': None, 'target': 'ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.064 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[063f37ca-399c-44b2-b9ba-b5a40fcfdc9d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:375b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374669, 'tstamp': 374669}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221823, 'error': None, 'target': 'ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.082 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[940ccf3f-5c0c-49ce-b760-7ced8d6f0d2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap53e65989-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:37:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374669, 'reachable_time': 36486, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221825, 'error': None, 'target': 'ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.113 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bc35de3e-23f1-484c-97cf-deb91dd1e891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.193 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3d89b321-5feb-4d43-9611-1cc71bb5734f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.196 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53e65989-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.196 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.197 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53e65989-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:31 compute-1 NetworkManager[51724]: <info>  [1759267051.2010] manager: (tap53e65989-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Sep 30 21:17:31 compute-1 kernel: tap53e65989-80: entered promiscuous mode
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.204 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap53e65989-80, col_values=(('external_ids', {'iface-id': '37466c73-c271-482d-b1bc-89db9b4d4f7e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:31 compute-1 ovn_controller[94902]: 2025-09-30T21:17:31Z|00063|binding|INFO|Releasing lport 37466c73-c271-482d-b1bc-89db9b4d4f7e from this chassis (sb_readonly=0)
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.205 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/53e65989-8c98-4f88-9650-e62767650e5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/53e65989-8c98-4f88-9650-e62767650e5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.206 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[01175161-4279-4eb0-8ea9-8599b230ef83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.207 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-53e65989-8c98-4f88-9650-e62767650e5a
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/53e65989-8c98-4f88-9650-e62767650e5a.pid.haproxy
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 53e65989-8c98-4f88-9650-e62767650e5a
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:17:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:31.207 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a', 'env', 'PROCESS_TAG=haproxy-53e65989-8c98-4f88-9650-e62767650e5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/53e65989-8c98-4f88-9650-e62767650e5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.244 2 DEBUG nova.compute.manager [req-bae7da2b-79ac-495b-a8ed-b400ace78ce7 req-8bb53949-0a8e-42be-9282-480bc03f43a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.244 2 DEBUG oslo_concurrency.lockutils [req-bae7da2b-79ac-495b-a8ed-b400ace78ce7 req-8bb53949-0a8e-42be-9282-480bc03f43a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.244 2 DEBUG oslo_concurrency.lockutils [req-bae7da2b-79ac-495b-a8ed-b400ace78ce7 req-8bb53949-0a8e-42be-9282-480bc03f43a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.245 2 DEBUG oslo_concurrency.lockutils [req-bae7da2b-79ac-495b-a8ed-b400ace78ce7 req-8bb53949-0a8e-42be-9282-480bc03f43a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.245 2 DEBUG nova.compute.manager [req-bae7da2b-79ac-495b-a8ed-b400ace78ce7 req-8bb53949-0a8e-42be-9282-480bc03f43a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Processing event network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.362 2 DEBUG nova.network.neutron [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updated VIF entry in instance network info cache for port 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.363 2 DEBUG nova.network.neutron [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updating instance_info_cache with network_info: [{"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.385 2 DEBUG oslo_concurrency.lockutils [req-8737b714-f2bf-4026-a000-7db86342eacb req-7b7ad2a6-3ca4-4823-a025-00e0355cfe12 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.605 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c" returned: 0 in 0.586s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.606 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.607 2 DEBUG nova.virt.libvirt.vif [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:21Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.607 2 DEBUG nova.network.os_vif_util [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.608 2 DEBUG nova.network.os_vif_util [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.608 2 DEBUG os_vif [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.609 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.609 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.611 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb8ecc8e-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.612 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb8ecc8e-9c, col_values=(('external_ids', {'iface-id': 'bb8ecc8e-9cf4-4901-9788-83c49356f983', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:9d:62', 'vm-uuid': '7f0e9e16-1467-41e5-b5b0-965591aa014c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-1 NetworkManager[51724]: <info>  [1759267051.6149] manager: (tapbb8ecc8e-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.620 2 INFO os_vif [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c')
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.621 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.621 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpghl6923x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.665 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.667 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267051.6665664, 709e44da-6758-4be0-9022-83a511ea88eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.667 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] VM Started (Lifecycle Event)
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.669 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.673 2 INFO nova.virt.libvirt.driver [-] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Instance spawned successfully.
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.673 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:17:31 compute-1 podman[221863]: 2025-09-30 21:17:31.582062249 +0000 UTC m=+0.026350413 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.714 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.717 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:31 compute-1 podman[221863]: 2025-09-30 21:17:31.804707759 +0000 UTC m=+0.248995903 container create c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.862 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.864 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267051.6666932, 709e44da-6758-4be0-9022-83a511ea88eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.865 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] VM Paused (Lifecycle Event)
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.872 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.874 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.875 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.875 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.876 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.877 2 DEBUG nova.virt.libvirt.driver [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.885 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.891 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267051.6695156, 709e44da-6758-4be0-9022-83a511ea88eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.892 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] VM Resumed (Lifecycle Event)
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.923 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.928 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:31 compute-1 systemd[1]: Started libpod-conmon-c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a.scope.
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.948 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:31 compute-1 podman[221878]: 2025-09-30 21:17:31.959173625 +0000 UTC m=+0.098452362 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:17:31 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:17:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c6a4e755055d156c589c5cedc0b58ada78438221c659c5705c089b02dd2132/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.973 2 INFO nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Took 8.73 seconds to spawn the instance on the hypervisor.
Sep 30 21:17:31 compute-1 nova_compute[192795]: 2025-09-30 21:17:31.973 2 DEBUG nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:32 compute-1 podman[221863]: 2025-09-30 21:17:32.030985368 +0000 UTC m=+0.475273512 container init c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:17:32 compute-1 podman[221863]: 2025-09-30 21:17:32.041396109 +0000 UTC m=+0.485684253 container start c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:17:32 compute-1 neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a[221896]: [NOTICE]   (221903) : New worker (221905) forked
Sep 30 21:17:32 compute-1 neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a[221896]: [NOTICE]   (221903) : Loading success.
Sep 30 21:17:32 compute-1 nova_compute[192795]: 2025-09-30 21:17:32.071 2 INFO nova.compute.manager [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Took 9.28 seconds to build instance.
Sep 30 21:17:32 compute-1 nova_compute[192795]: 2025-09-30 21:17:32.087 2 DEBUG oslo_concurrency.lockutils [None req-e2f8caa5-e0a0-49c5-b7d8-a5154fe5ef1d 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.098 2 DEBUG nova.network.neutron [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.113 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpghl6923x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Sep 30 21:17:33 compute-1 NetworkManager[51724]: <info>  [1759267053.3951] manager: (tapbb8ecc8e-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Sep 30 21:17:33 compute-1 kernel: tapbb8ecc8e-9c: entered promiscuous mode
Sep 30 21:17:33 compute-1 systemd-udevd[221811]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:33 compute-1 ovn_controller[94902]: 2025-09-30T21:17:33Z|00064|binding|INFO|Claiming lport bb8ecc8e-9cf4-4901-9788-83c49356f983 for this additional chassis.
Sep 30 21:17:33 compute-1 ovn_controller[94902]: 2025-09-30T21:17:33Z|00065|binding|INFO|bb8ecc8e-9cf4-4901-9788-83c49356f983: Claiming fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.402 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:33 compute-1 NetworkManager[51724]: <info>  [1759267053.4098] device (tapbb8ecc8e-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:17:33 compute-1 NetworkManager[51724]: <info>  [1759267053.4114] device (tapbb8ecc8e-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:17:33 compute-1 podman[221915]: 2025-09-30 21:17:33.441565656 +0000 UTC m=+0.086934311 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.453 2 DEBUG nova.compute.manager [req-e1c55dc5-fa1a-414a-8f5c-8c122a8f0754 req-6998bf3c-b966-43e2-8ade-b8b280f67753 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.454 2 DEBUG oslo_concurrency.lockutils [req-e1c55dc5-fa1a-414a-8f5c-8c122a8f0754 req-6998bf3c-b966-43e2-8ade-b8b280f67753 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.454 2 DEBUG oslo_concurrency.lockutils [req-e1c55dc5-fa1a-414a-8f5c-8c122a8f0754 req-6998bf3c-b966-43e2-8ade-b8b280f67753 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.454 2 DEBUG oslo_concurrency.lockutils [req-e1c55dc5-fa1a-414a-8f5c-8c122a8f0754 req-6998bf3c-b966-43e2-8ade-b8b280f67753 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.454 2 DEBUG nova.compute.manager [req-e1c55dc5-fa1a-414a-8f5c-8c122a8f0754 req-6998bf3c-b966-43e2-8ade-b8b280f67753 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] No waiting events found dispatching network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.455 2 WARNING nova.compute.manager [req-e1c55dc5-fa1a-414a-8f5c-8c122a8f0754 req-6998bf3c-b966-43e2-8ade-b8b280f67753 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received unexpected event network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f for instance with vm_state active and task_state None.
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:33 compute-1 systemd-machined[152783]: New machine qemu-7-instance-0000000b.
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:33 compute-1 ovn_controller[94902]: 2025-09-30T21:17:33Z|00066|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 ovn-installed in OVS
Sep 30 21:17:33 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:33 compute-1 nova_compute[192795]: 2025-09-30 21:17:33.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1822] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/42)
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1829] device (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1839] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/43)
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1843] device (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1850] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1856] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1859] device (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 21:17:34 compute-1 NetworkManager[51724]: <info>  [1759267054.1862] device (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:34 compute-1 ovn_controller[94902]: 2025-09-30T21:17:34Z|00067|binding|INFO|Releasing lport 37466c73-c271-482d-b1bc-89db9b4d4f7e from this chassis (sb_readonly=0)
Sep 30 21:17:34 compute-1 ovn_controller[94902]: 2025-09-30T21:17:34Z|00068|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.481 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267054.4812868, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.482 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Started (Lifecycle Event)
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.538 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.644 2 DEBUG nova.compute.manager [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-changed-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.644 2 DEBUG nova.compute.manager [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Refreshing instance network info cache due to event network-changed-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.645 2 DEBUG oslo_concurrency.lockutils [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.645 2 DEBUG oslo_concurrency.lockutils [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:34 compute-1 nova_compute[192795]: 2025-09-30 21:17:34.645 2 DEBUG nova.network.neutron [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Refreshing network info cache for port 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:17:35 compute-1 nova_compute[192795]: 2025-09-30 21:17:35.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.457 2 DEBUG nova.network.neutron [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updated VIF entry in instance network info cache for port 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.457 2 DEBUG nova.network.neutron [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updating instance_info_cache with network_info: [{"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.508 2 DEBUG oslo_concurrency.lockutils [req-7cce9a54-721e-4f90-8eec-206866d4b241 req-8c0cd427-4a71-438d-9004-304370cecdde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.574 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267056.573654, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.575 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Resumed (Lifecycle Event)
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.820 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.823 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.825 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.826 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.826 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.826 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.855 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.913 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.972 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:36 compute-1 nova_compute[192795]: 2025-09-30 21:17:36.973 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.034 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.043 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.106 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.107 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.167 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.172 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.230 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.231 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.294 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.474 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.476 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5314MB free_disk=73.4039535522461GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.476 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.477 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.548 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Migration for instance 7f0e9e16-1467-41e5-b5b0-965591aa014c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.549 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Migration for instance b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.596 2 INFO nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating resource usage from migration 0a2a09d7-020a-40a6-b7d6-da99acd63254
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.597 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Starting to track incoming migration 0a2a09d7-020a-40a6-b7d6-da99acd63254 with flavor afe5c12d-500a-499b-9438-9e9c37698acc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.617 2 INFO nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating resource usage from migration 75c88f2a-0d52-4e9d-8432-8454d77751e5
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.618 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Starting to track incoming migration 75c88f2a-0d52-4e9d-8432-8454d77751e5 with flavor afe5c12d-500a-499b-9438-9e9c37698acc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:17:37 compute-1 ovn_controller[94902]: 2025-09-30T21:17:37Z|00069|binding|INFO|Claiming lport bb8ecc8e-9cf4-4901-9788-83c49356f983 for this chassis.
Sep 30 21:17:37 compute-1 ovn_controller[94902]: 2025-09-30T21:17:37Z|00070|binding|INFO|bb8ecc8e-9cf4-4901-9788-83c49356f983: Claiming fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:17:37 compute-1 ovn_controller[94902]: 2025-09-30T21:17:37Z|00071|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 up in Southbound
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.685 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.685 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 252d5457-8837-4aa6-b309-c3139e8db7ed actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.687 103861 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 bound to our chassis
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.687 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 709e44da-6758-4be0-9022-83a511ea88eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.690 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.705 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e971bfea-0758-4f2e-b79e-2c79c2377491]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.707 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap934fff90-51 in ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.707 2 WARNING nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 7f0e9e16-1467-41e5-b5b0-965591aa014c has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.709 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap934fff90-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.709 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8185f001-3533-49e7-8f15-718069bbaa62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.709 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[43d09a7b-698d-48a0-aa7e-f3b85ca6d261]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.721 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb30d99-fd60-434e-a217-efcf995bf30c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.725 2 WARNING nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.726 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.726 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.749 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6530c3-528a-4bc2-9efc-fbf02b04f2b0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.776 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[baafc1da-0294-45dc-823e-32b5b8cff722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.782 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[aecb217c-5964-45a9-ab57-a1109a8a066b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 NetworkManager[51724]: <info>  [1759267057.7859] manager: (tap934fff90-50): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Sep 30 21:17:37 compute-1 systemd-udevd[222013]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.830 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc60720-9d54-424a-851e-08a1ac60f254]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.836 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[60301bf7-6fb2-4af5-ac4f-ad67529cd189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.851 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:37 compute-1 NetworkManager[51724]: <info>  [1759267057.8670] device (tap934fff90-50): carrier: link connected
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.869 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.874 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c11a8748-8491-4162-911f-5c9c4971f005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.897 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8eef182d-48ca-49c0-992f-9a53a1e8a1dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375354, 'reachable_time': 17406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222032, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.910 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.911 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.919 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[248dd50c-f97e-499e-a2a3-2e21d55a997f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:3765'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375354, 'tstamp': 375354}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222033, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 nova_compute[192795]: 2025-09-30 21:17:37.925 2 INFO nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Post operation of migration started
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.945 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a351bb15-6cff-4377-bdfe-1e80ebfeade0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375354, 'reachable_time': 17406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222034, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:37.987 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[87482e16-a85f-438f-ae75-03f2940dd89f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.058 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[19239d7f-1b21-4126-91f3-66aabf377740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.060 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.060 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.061 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap934fff90-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:38 compute-1 NetworkManager[51724]: <info>  [1759267058.0647] manager: (tap934fff90-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Sep 30 21:17:38 compute-1 kernel: tap934fff90-50: entered promiscuous mode
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.076 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap934fff90-50, col_values=(('external_ids', {'iface-id': 'b21a7164-770c-4265-ad15-a3e058ec1a56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:38 compute-1 ovn_controller[94902]: 2025-09-30T21:17:38Z|00072|binding|INFO|Releasing lport b21a7164-770c-4265-ad15-a3e058ec1a56 from this chassis (sb_readonly=0)
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.081 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.082 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c0dacf06-4c48-4143-ad86-c5ae244bb2c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.083 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.085 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'env', 'PROCESS_TAG=haproxy-934fff90-5446-41f1-a5ad-d2568cb337b1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/934fff90-5446-41f1-a5ad-d2568cb337b1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.442 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.443 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.443 2 DEBUG nova.network.neutron [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:38 compute-1 podman[222065]: 2025-09-30 21:17:38.522280367 +0000 UTC m=+0.102237345 container create d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:17:38 compute-1 podman[222065]: 2025-09-30 21:17:38.455131132 +0000 UTC m=+0.035088140 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:17:38 compute-1 systemd[1]: Started libpod-conmon-d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552.scope.
Sep 30 21:17:38 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:17:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298d531c0c7b0413c31f3f7c19e227eaf1df8d8a51fcb60fc4e1d0546b614038/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:17:38 compute-1 podman[222065]: 2025-09-30 21:17:38.617106871 +0000 UTC m=+0.197063899 container init d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:17:38 compute-1 podman[222065]: 2025-09-30 21:17:38.623597756 +0000 UTC m=+0.203554744 container start d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 21:17:38 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [NOTICE]   (222084) : New worker (222086) forked
Sep 30 21:17:38 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [NOTICE]   (222084) : Loading success.
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.679 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.680 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:38.682 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.911 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.936 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.937 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:17:38 compute-1 nova_compute[192795]: 2025-09-30 21:17:38.937 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:17:39 compute-1 nova_compute[192795]: 2025-09-30 21:17:39.359 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:39 compute-1 nova_compute[192795]: 2025-09-30 21:17:39.359 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:39 compute-1 nova_compute[192795]: 2025-09-30 21:17:39.360 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:17:39 compute-1 nova_compute[192795]: 2025-09-30 21:17:39.360 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 709e44da-6758-4be0-9022-83a511ea88eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.315 2 DEBUG nova.network.neutron [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.338 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.369 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.370 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.371 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.377 2 INFO nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 21:17:40 compute-1 virtqemud[192217]: Domain id=7 name='instance-0000000b' uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c is tainted: custom-monitor
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.838 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updating instance_info_cache with network_info: [{"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.856 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-709e44da-6758-4be0-9022-83a511ea88eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.856 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.857 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:40 compute-1 nova_compute[192795]: 2025-09-30 21:17:40.858 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:17:41 compute-1 nova_compute[192795]: 2025-09-30 21:17:41.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:41 compute-1 nova_compute[192795]: 2025-09-30 21:17:41.387 2 INFO nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 21:17:41 compute-1 nova_compute[192795]: 2025-09-30 21:17:41.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:42 compute-1 podman[222095]: 2025-09-30 21:17:42.248125356 +0000 UTC m=+0.080243961 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:17:42 compute-1 nova_compute[192795]: 2025-09-30 21:17:42.396 2 INFO nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 21:17:42 compute-1 nova_compute[192795]: 2025-09-30 21:17:42.405 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:42 compute-1 nova_compute[192795]: 2025-09-30 21:17:42.452 2 DEBUG nova.objects.instance [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:17:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:43.929 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:43 compute-1 nova_compute[192795]: 2025-09-30 21:17:43.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:43.929 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:17:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:43.931 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:44 compute-1 nova_compute[192795]: 2025-09-30 21:17:44.632 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps3z4e6mm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Sep 30 21:17:44 compute-1 nova_compute[192795]: 2025-09-30 21:17:44.667 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:44 compute-1 nova_compute[192795]: 2025-09-30 21:17:44.668 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:44 compute-1 nova_compute[192795]: 2025-09-30 21:17:44.669 2 DEBUG nova.network.neutron [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:44 compute-1 ovn_controller[94902]: 2025-09-30T21:17:44Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:98:46 10.100.0.11
Sep 30 21:17:44 compute-1 ovn_controller[94902]: 2025-09-30T21:17:44Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:98:46 10.100.0.11
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.497 2 DEBUG nova.network.neutron [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating instance_info_cache with network_info: [{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.547 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.566 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps3z4e6mm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.567 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Creating instance directory: /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.567 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Creating disk.info with the contents: {'/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk': 'qcow2', '/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.568 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.568 2 DEBUG nova.objects.instance [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'trusted_certs' on Instance uuid b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.609 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.668 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.669 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.670 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.682 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.776 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.777 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.817 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.818 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.819 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.911 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.912 2 DEBUG nova.virt.disk.api [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Checking if we can resize image /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.912 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.965 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.966 2 DEBUG nova.virt.disk.api [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Cannot resize image /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.967 2 DEBUG nova.objects.instance [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'migration_context' on Instance uuid b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:46 compute-1 nova_compute[192795]: 2025-09-30 21:17:46.985 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.007 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.009 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config to /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.010 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.228 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Check if temp file /var/lib/nova/instances/tmpxbhoey6v exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.229 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxbhoey6v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.593 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" returned: 0 in 0.583s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.594 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.595 2 DEBUG nova.virt.libvirt.vif [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2078302607',display_name='tempest-LiveMigrationTest-server-2078302607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-2078302607',id=12,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-1871um1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:20Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.595 2 DEBUG nova.network.os_vif_util [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.596 2 DEBUG nova.network.os_vif_util [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.596 2 DEBUG os_vif [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.597 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.597 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.602 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap730a9e74-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.602 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap730a9e74-90, col_values=(('external_ids', {'iface-id': '730a9e74-900e-49b2-a5c3-043d6da1a52b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:47:71', 'vm-uuid': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:47 compute-1 NetworkManager[51724]: <info>  [1759267067.6054] manager: (tap730a9e74-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.613 2 INFO os_vif [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90')
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.614 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Sep 30 21:17:47 compute-1 nova_compute[192795]: 2025-09-30 21:17:47.614 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps3z4e6mm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Sep 30 21:17:48 compute-1 nova_compute[192795]: 2025-09-30 21:17:48.150 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:48 compute-1 nova_compute[192795]: 2025-09-30 21:17:48.216 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:48 compute-1 nova_compute[192795]: 2025-09-30 21:17:48.217 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:48 compute-1 podman[222156]: 2025-09-30 21:17:48.2514178 +0000 UTC m=+0.083332494 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:17:48 compute-1 podman[222158]: 2025-09-30 21:17:48.264753711 +0000 UTC m=+0.076967092 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:17:48 compute-1 nova_compute[192795]: 2025-09-30 21:17:48.272 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:50 compute-1 podman[222206]: 2025-09-30 21:17:50.263312087 +0000 UTC m=+0.096403287 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.442 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.442 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.443 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.443 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.443 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.456 2 INFO nova.compute.manager [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Terminating instance
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.478 2 DEBUG nova.compute.manager [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:17:51 compute-1 kernel: tap35b0bc25-77 (unregistering): left promiscuous mode
Sep 30 21:17:51 compute-1 NetworkManager[51724]: <info>  [1759267071.5080] device (tap35b0bc25-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:17:51 compute-1 ovn_controller[94902]: 2025-09-30T21:17:51Z|00073|binding|INFO|Releasing lport 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f from this chassis (sb_readonly=0)
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 ovn_controller[94902]: 2025-09-30T21:17:51Z|00074|binding|INFO|Setting lport 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f down in Southbound
Sep 30 21:17:51 compute-1 ovn_controller[94902]: 2025-09-30T21:17:51Z|00075|binding|INFO|Removing iface tap35b0bc25-77 ovn-installed in OVS
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.534 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:98:46 10.100.0.11'], port_security=['fa:16:3e:af:98:46 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '709e44da-6758-4be0-9022-83a511ea88eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53e65989-8c98-4f88-9650-e62767650e5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8682f64a38a84e09b7a6bf7d33b1a5aa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1f96f39e-fd28-443a-9a6a-f8657fefd81a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=438afc58-4fac-4a65-8830-7d1a0f05079c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.535 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 35b0bc25-77bf-479e-9d9b-4e20e3a9da8f in datapath 53e65989-8c98-4f88-9650-e62767650e5a unbound from our chassis
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.537 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 53e65989-8c98-4f88-9650-e62767650e5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.538 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b84affb3-b3ce-438b-bbab-632aa3c364b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.539 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a namespace which is not needed anymore
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Sep 30 21:17:51 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Consumed 14.372s CPU time.
Sep 30 21:17:51 compute-1 systemd-machined[152783]: Machine qemu-6-instance-0000000d terminated.
Sep 30 21:17:51 compute-1 neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a[221896]: [NOTICE]   (221903) : haproxy version is 2.8.14-c23fe91
Sep 30 21:17:51 compute-1 neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a[221896]: [NOTICE]   (221903) : path to executable is /usr/sbin/haproxy
Sep 30 21:17:51 compute-1 neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a[221896]: [WARNING]  (221903) : Exiting Master process...
Sep 30 21:17:51 compute-1 neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a[221896]: [ALERT]    (221903) : Current worker (221905) exited with code 143 (Terminated)
Sep 30 21:17:51 compute-1 neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a[221896]: [WARNING]  (221903) : All workers exited. Exiting... (0)
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 systemd[1]: libpod-c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a.scope: Deactivated successfully.
Sep 30 21:17:51 compute-1 podman[222259]: 2025-09-30 21:17:51.712649274 +0000 UTC m=+0.055373469 container died c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:17:51 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a-userdata-shm.mount: Deactivated successfully.
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.745 2 INFO nova.virt.libvirt.driver [-] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Instance destroyed successfully.
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.745 2 DEBUG nova.objects.instance [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lazy-loading 'resources' on Instance uuid 709e44da-6758-4be0-9022-83a511ea88eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-47c6a4e755055d156c589c5cedc0b58ada78438221c659c5705c089b02dd2132-merged.mount: Deactivated successfully.
Sep 30 21:17:51 compute-1 podman[222259]: 2025-09-30 21:17:51.756677073 +0000 UTC m=+0.099401268 container cleanup c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.769 2 DEBUG nova.virt.libvirt.vif [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:17:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-83880644',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-83880644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(15),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-83880644',id=13,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=15,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAnWpckxr0DtyXLfS0GzNvnNTp1bsVurnd6EmGZMCR98B6S3IXvoBvWtgyqHzjIndUqkBC7I7pwh8DnLzZoREkpILcp0NScAgAtJWk4up+BRbmxNlgueA/Mrcv3rAMqgAA==',key_name='tempest-keypair-1892732760',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8682f64a38a84e09b7a6bf7d33b1a5aa',ramdisk_id='',reservation_id='r-gvtbmxlm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-676823569',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-676823569-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0345cb3cbb454545b3c51ba5db7a66fc',uuid=709e44da-6758-4be0-9022-83a511ea88eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.769 2 DEBUG nova.network.os_vif_util [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Converting VIF {"id": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "address": "fa:16:3e:af:98:46", "network": {"id": "53e65989-8c98-4f88-9650-e62767650e5a", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1896803251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8682f64a38a84e09b7a6bf7d33b1a5aa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35b0bc25-77", "ovs_interfaceid": "35b0bc25-77bf-479e-9d9b-4e20e3a9da8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:51 compute-1 systemd[1]: libpod-conmon-c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a.scope: Deactivated successfully.
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.770 2 DEBUG nova.network.os_vif_util [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:98:46,bridge_name='br-int',has_traffic_filtering=True,id=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f,network=Network(53e65989-8c98-4f88-9650-e62767650e5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35b0bc25-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.771 2 DEBUG os_vif [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:98:46,bridge_name='br-int',has_traffic_filtering=True,id=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f,network=Network(53e65989-8c98-4f88-9650-e62767650e5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35b0bc25-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.773 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35b0bc25-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.798 2 INFO os_vif [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:98:46,bridge_name='br-int',has_traffic_filtering=True,id=35b0bc25-77bf-479e-9d9b-4e20e3a9da8f,network=Network(53e65989-8c98-4f88-9650-e62767650e5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35b0bc25-77')
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.799 2 INFO nova.virt.libvirt.driver [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Deleting instance files /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb_del
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.800 2 INFO nova.virt.libvirt.driver [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Deletion of /var/lib/nova/instances/709e44da-6758-4be0-9022-83a511ea88eb_del complete
Sep 30 21:17:51 compute-1 podman[222303]: 2025-09-30 21:17:51.830278984 +0000 UTC m=+0.047128786 container remove c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.835 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2189f7d9-37f6-48c8-88e1-c9629b137dab]: (4, ('Tue Sep 30 09:17:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a (c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a)\nc46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a\nTue Sep 30 09:17:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a (c46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a)\nc46679dae5bbbf2d3b9a4de70059f106388a6a0a6a22843b7fffd951f4c50f3a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.837 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[098fc824-2313-4c1a-be00-43e732f56505]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.838 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53e65989-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 kernel: tap53e65989-80: left promiscuous mode
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.860 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[da273326-4e6a-450a-bf12-0837506c13fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.882 2 INFO nova.compute.manager [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.882 2 DEBUG oslo.service.loopingcall [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.883 2 DEBUG nova.compute.manager [-] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:17:51 compute-1 nova_compute[192795]: 2025-09-30 21:17:51.883 2 DEBUG nova.network.neutron [-] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.883 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cf741257-bfc2-4fa5-80bf-69d271a6c6c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.885 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c171c8e3-d9bd-4170-b50e-1efe03292c2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.901 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f66591bf-48ed-468a-a32e-c335f9555971]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374662, 'reachable_time': 23433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222324, 'error': None, 'target': 'ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.905 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-53e65989-8c98-4f88-9650-e62767650e5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:17:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:51.905 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[6c91cf7a-204f-4e9d-86f7-3195cff9a51c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:51 compute-1 systemd[1]: run-netns-ovnmeta\x2d53e65989\x2d8c98\x2d4f88\x2d9650\x2de62767650e5a.mount: Deactivated successfully.
Sep 30 21:17:51 compute-1 sshd-session[222317]: Accepted publickey for nova from 192.168.122.100 port 47646 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:17:51 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:17:51 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:17:51 compute-1 systemd-logind[793]: New session 31 of user nova.
Sep 30 21:17:51 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:17:51 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:17:52 compute-1 systemd[222327]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:17:52 compute-1 systemd[222327]: Queued start job for default target Main User Target.
Sep 30 21:17:52 compute-1 systemd[222327]: Created slice User Application Slice.
Sep 30 21:17:52 compute-1 systemd[222327]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:17:52 compute-1 systemd[222327]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:17:52 compute-1 systemd[222327]: Reached target Paths.
Sep 30 21:17:52 compute-1 systemd[222327]: Reached target Timers.
Sep 30 21:17:52 compute-1 systemd[222327]: Starting D-Bus User Message Bus Socket...
Sep 30 21:17:52 compute-1 systemd[222327]: Starting Create User's Volatile Files and Directories...
Sep 30 21:17:52 compute-1 systemd[222327]: Finished Create User's Volatile Files and Directories.
Sep 30 21:17:52 compute-1 systemd[222327]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:17:52 compute-1 systemd[222327]: Reached target Sockets.
Sep 30 21:17:52 compute-1 systemd[222327]: Reached target Basic System.
Sep 30 21:17:52 compute-1 systemd[222327]: Reached target Main User Target.
Sep 30 21:17:52 compute-1 systemd[222327]: Startup finished in 147ms.
Sep 30 21:17:52 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:17:52 compute-1 systemd[1]: Started Session 31 of User nova.
Sep 30 21:17:52 compute-1 sshd-session[222317]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:17:52 compute-1 sshd-session[222343]: Received disconnect from 192.168.122.100 port 47646:11: disconnected by user
Sep 30 21:17:52 compute-1 sshd-session[222343]: Disconnected from user nova 192.168.122.100 port 47646
Sep 30 21:17:52 compute-1 sshd-session[222317]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:17:52 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Sep 30 21:17:52 compute-1 systemd-logind[793]: Session 31 logged out. Waiting for processes to exit.
Sep 30 21:17:52 compute-1 systemd-logind[793]: Removed session 31.
Sep 30 21:17:52 compute-1 podman[222345]: 2025-09-30 21:17:52.368560677 +0000 UTC m=+0.074189457 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:17:52 compute-1 nova_compute[192795]: 2025-09-30 21:17:52.438 2 DEBUG nova.network.neutron [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Port 730a9e74-900e-49b2-a5c3-043d6da1a52b updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Sep 30 21:17:52 compute-1 nova_compute[192795]: 2025-09-30 21:17:52.452 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps3z4e6mm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Sep 30 21:17:52 compute-1 NetworkManager[51724]: <info>  [1759267072.7781] manager: (tap730a9e74-90): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Sep 30 21:17:52 compute-1 systemd-udevd[222240]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:52 compute-1 NetworkManager[51724]: <info>  [1759267072.8175] device (tap730a9e74-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:17:52 compute-1 kernel: tap730a9e74-90: entered promiscuous mode
Sep 30 21:17:52 compute-1 NetworkManager[51724]: <info>  [1759267072.8190] device (tap730a9e74-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:17:52 compute-1 systemd-machined[152783]: New machine qemu-8-instance-0000000c.
Sep 30 21:17:52 compute-1 ovn_controller[94902]: 2025-09-30T21:17:52Z|00076|binding|INFO|Claiming lport 730a9e74-900e-49b2-a5c3-043d6da1a52b for this additional chassis.
Sep 30 21:17:52 compute-1 ovn_controller[94902]: 2025-09-30T21:17:52Z|00077|binding|INFO|730a9e74-900e-49b2-a5c3-043d6da1a52b: Claiming fa:16:3e:72:47:71 10.100.0.11
Sep 30 21:17:52 compute-1 ovn_controller[94902]: 2025-09-30T21:17:52Z|00078|binding|INFO|Claiming lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 for this additional chassis.
Sep 30 21:17:52 compute-1 ovn_controller[94902]: 2025-09-30T21:17:52Z|00079|binding|INFO|9ece0208-0151-4f0a-bda7-acd45fe4f2a0: Claiming fa:16:3e:28:e2:ac 19.80.0.112
Sep 30 21:17:52 compute-1 nova_compute[192795]: 2025-09-30 21:17:52.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:52 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-0000000c.
Sep 30 21:17:52 compute-1 ovn_controller[94902]: 2025-09-30T21:17:52Z|00080|binding|INFO|Setting lport 730a9e74-900e-49b2-a5c3-043d6da1a52b ovn-installed in OVS
Sep 30 21:17:52 compute-1 nova_compute[192795]: 2025-09-30 21:17:52.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:52 compute-1 nova_compute[192795]: 2025-09-30 21:17:52.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.296 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-vif-unplugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.297 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.297 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.298 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.298 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] No waiting events found dispatching network-vif-unplugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.298 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-vif-unplugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.298 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.298 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "709e44da-6758-4be0-9022-83a511ea88eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.298 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.298 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.299 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] No waiting events found dispatching network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.299 2 WARNING nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received unexpected event network-vif-plugged-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f for instance with vm_state active and task_state deleting.
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.310 2 DEBUG nova.compute.manager [req-d4c618e1-a199-435b-9e63-e0082dfa27e9 req-3b07bae7-7fb2-45b8-b178-01cd34ea61f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.310 2 DEBUG oslo_concurrency.lockutils [req-d4c618e1-a199-435b-9e63-e0082dfa27e9 req-3b07bae7-7fb2-45b8-b178-01cd34ea61f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.310 2 DEBUG oslo_concurrency.lockutils [req-d4c618e1-a199-435b-9e63-e0082dfa27e9 req-3b07bae7-7fb2-45b8-b178-01cd34ea61f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.310 2 DEBUG oslo_concurrency.lockutils [req-d4c618e1-a199-435b-9e63-e0082dfa27e9 req-3b07bae7-7fb2-45b8-b178-01cd34ea61f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.311 2 DEBUG nova.compute.manager [req-d4c618e1-a199-435b-9e63-e0082dfa27e9 req-3b07bae7-7fb2-45b8-b178-01cd34ea61f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.311 2 DEBUG nova.compute.manager [req-d4c618e1-a199-435b-9e63-e0082dfa27e9 req-3b07bae7-7fb2-45b8-b178-01cd34ea61f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:53 compute-1 ovn_controller[94902]: 2025-09-30T21:17:53Z|00081|memory|INFO|peak resident set size grew 51% in last 1076.2 seconds, from 16128 kB to 24320 kB
Sep 30 21:17:53 compute-1 ovn_controller[94902]: 2025-09-30T21:17:53Z|00082|memory|INFO|idl-cells-OVN_Southbound:10441 idl-cells-Open_vSwitch:984 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:367 lflow-cache-entries-cache-matches:293 lflow-cache-size-KB:1526 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:715 ofctrl_installed_flow_usage-KB:521 ofctrl_sb_flow_ref_usage-KB:266
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.388 2 DEBUG nova.network.neutron [-] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.426 2 INFO nova.compute.manager [-] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Took 1.54 seconds to deallocate network for instance.
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.498 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.499 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.813 2 DEBUG nova.compute.provider_tree [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.829 2 DEBUG nova.scheduler.client.report [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.858 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.890 2 INFO nova.scheduler.client.report [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Deleted allocations for instance 709e44da-6758-4be0-9022-83a511ea88eb
Sep 30 21:17:53 compute-1 nova_compute[192795]: 2025-09-30 21:17:53.971 2 DEBUG oslo_concurrency.lockutils [None req-98e5a611-ad92-4694-998e-5c53a37fdb40 0345cb3cbb454545b3c51ba5db7a66fc 8682f64a38a84e09b7a6bf7d33b1a5aa - - default default] Lock "709e44da-6758-4be0-9022-83a511ea88eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.024 2 INFO nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Took 5.75 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.024 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.044 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxbhoey6v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(f0ac7ef9-959c-499f-b60b-f101fa97d660),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.065 2 DEBUG nova.objects.instance [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.066 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.069 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.069 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.081 2 DEBUG nova.virt.libvirt.vif [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:42Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.081 2 DEBUG nova.network.os_vif_util [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.082 2 DEBUG nova.network.os_vif_util [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.082 2 DEBUG nova.virt.libvirt.migration [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 21:17:54 compute-1 nova_compute[192795]:   <mac address="fa:16:3e:45:9d:62"/>
Sep 30 21:17:54 compute-1 nova_compute[192795]:   <model type="virtio"/>
Sep 30 21:17:54 compute-1 nova_compute[192795]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:17:54 compute-1 nova_compute[192795]:   <mtu size="1442"/>
Sep 30 21:17:54 compute-1 nova_compute[192795]:   <target dev="tapbb8ecc8e-9c"/>
Sep 30 21:17:54 compute-1 nova_compute[192795]: </interface>
Sep 30 21:17:54 compute-1 nova_compute[192795]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.083 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.322 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267074.322102, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.323 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Started (Lifecycle Event)
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.572 2 DEBUG nova.virt.libvirt.migration [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:54 compute-1 nova_compute[192795]: 2025-09-30 21:17:54.572 2 INFO nova.virt.libvirt.migration [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.357 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.472 2 INFO nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.728 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267075.7280455, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.728 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Resumed (Lifecycle Event)
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.752 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.757 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.774 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.977 2 DEBUG nova.virt.libvirt.migration [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:55 compute-1 nova_compute[192795]: 2025-09-30 21:17:55.978 2 DEBUG nova.virt.libvirt.migration [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.465 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267076.4648147, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.466 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Paused (Lifecycle Event)
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.494 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.495 2 DEBUG nova.virt.libvirt.migration [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.495 2 DEBUG nova.virt.libvirt.migration [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.499 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.533 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 21:17:56 compute-1 kernel: tapbb8ecc8e-9c (unregistering): left promiscuous mode
Sep 30 21:17:56 compute-1 NetworkManager[51724]: <info>  [1759267076.6145] device (tapbb8ecc8e-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:17:56 compute-1 ovn_controller[94902]: 2025-09-30T21:17:56Z|00083|binding|INFO|Releasing lport bb8ecc8e-9cf4-4901-9788-83c49356f983 from this chassis (sb_readonly=0)
Sep 30 21:17:56 compute-1 ovn_controller[94902]: 2025-09-30T21:17:56Z|00084|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 down in Southbound
Sep 30 21:17:56 compute-1 ovn_controller[94902]: 2025-09-30T21:17:56Z|00085|binding|INFO|Removing iface tapbb8ecc8e-9c ovn-installed in OVS
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.629 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3b817c7f-1137-4e8f-8263-8c5e6eddafa4'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '18', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.631 103861 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.635 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 934fff90-5446-41f1-a5ad-d2568cb337b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.637 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[54908386-be9c-42a8-9e0c-3762af362ec9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.638 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace which is not needed anymore
Sep 30 21:17:56 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Sep 30 21:17:56 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 2.657s CPU time.
Sep 30 21:17:56 compute-1 systemd-machined[152783]: Machine qemu-7-instance-0000000b terminated.
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [NOTICE]   (222084) : haproxy version is 2.8.14-c23fe91
Sep 30 21:17:56 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [NOTICE]   (222084) : path to executable is /usr/sbin/haproxy
Sep 30 21:17:56 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [WARNING]  (222084) : Exiting Master process...
Sep 30 21:17:56 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [WARNING]  (222084) : Exiting Master process...
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [ALERT]    (222084) : Current worker (222086) exited with code 143 (Terminated)
Sep 30 21:17:56 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222080]: [WARNING]  (222084) : All workers exited. Exiting... (0)
Sep 30 21:17:56 compute-1 systemd[1]: libpod-d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552.scope: Deactivated successfully.
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 podman[222442]: 2025-09-30 21:17:56.829810689 +0000 UTC m=+0.058106731 container died d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.870 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.871 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.872 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Sep 30 21:17:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552-userdata-shm.mount: Deactivated successfully.
Sep 30 21:17:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-298d531c0c7b0413c31f3f7c19e227eaf1df8d8a51fcb60fc4e1d0546b614038-merged.mount: Deactivated successfully.
Sep 30 21:17:56 compute-1 podman[222442]: 2025-09-30 21:17:56.889049671 +0000 UTC m=+0.117345723 container cleanup d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:17:56 compute-1 systemd[1]: libpod-conmon-d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552.scope: Deactivated successfully.
Sep 30 21:17:56 compute-1 podman[222484]: 2025-09-30 21:17:56.954558252 +0000 UTC m=+0.040347731 container remove d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.963 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[15a3d978-8179-4efc-bd1a-6e2b7cfa6ad5]: (4, ('Tue Sep 30 09:17:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552)\nd70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552\nTue Sep 30 09:17:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (d70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552)\nd70010a79cbcb5623f6beedf7642df53867f0eb77809aa8a7589fa19e7d8e552\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.965 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc2b84b-f809-45c3-a20c-b9e682f314f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.967 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 kernel: tap934fff90-50: left promiscuous mode
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:56.989 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a70938e3-eb39-460e-8d52-ae58c998be0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.997 2 DEBUG nova.virt.libvirt.guest [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '7f0e9e16-1467-41e5-b5b0-965591aa014c' (instance-0000000b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.998 2 INFO nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration operation has completed
Sep 30 21:17:56 compute-1 nova_compute[192795]: 2025-09-30 21:17:56.998 2 INFO nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] _post_live_migration() is started..
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.026 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4eba69b7-4233-4355-a8ca-cf3efa2245d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.028 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[747ab921-740a-4b4b-9ea0-91356198d75c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.050 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5fa7ca-f1fb-4304-aab2-2af4c020310e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375344, 'reachable_time': 44109, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222503, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.053 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.053 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5d48b8-bdcc-473e-9b64-aee813e3e101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 systemd[1]: run-netns-ovnmeta\x2d934fff90\x2d5446\x2d41f1\x2da5ad\x2dd2568cb337b1.mount: Deactivated successfully.
Sep 30 21:17:57 compute-1 ovn_controller[94902]: 2025-09-30T21:17:57Z|00086|binding|INFO|Claiming lport 730a9e74-900e-49b2-a5c3-043d6da1a52b for this chassis.
Sep 30 21:17:57 compute-1 ovn_controller[94902]: 2025-09-30T21:17:57Z|00087|binding|INFO|730a9e74-900e-49b2-a5c3-043d6da1a52b: Claiming fa:16:3e:72:47:71 10.100.0.11
Sep 30 21:17:57 compute-1 ovn_controller[94902]: 2025-09-30T21:17:57Z|00088|binding|INFO|Claiming lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 for this chassis.
Sep 30 21:17:57 compute-1 ovn_controller[94902]: 2025-09-30T21:17:57Z|00089|binding|INFO|9ece0208-0151-4f0a-bda7-acd45fe4f2a0: Claiming fa:16:3e:28:e2:ac 19.80.0.112
Sep 30 21:17:57 compute-1 ovn_controller[94902]: 2025-09-30T21:17:57Z|00090|binding|INFO|Setting lport 730a9e74-900e-49b2-a5c3-043d6da1a52b up in Southbound
Sep 30 21:17:57 compute-1 ovn_controller[94902]: 2025-09-30T21:17:57Z|00091|binding|INFO|Setting lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 up in Southbound
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.199 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e2:ac 19.80.0.112'], port_security=['fa:16:3e:28:e2:ac 19.80.0.112'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['730a9e74-900e-49b2-a5c3-043d6da1a52b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1118581425', 'neutron:cidrs': '19.80.0.112/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1118581425', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '4', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=10e37694-4797-470f-adb8-72e2aa69e8d9, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ece0208-0151-4f0a-bda7-acd45fe4f2a0) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.202 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:47:71 10.100.0.11'], port_security=['fa:16:3e:72:47:71 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-415058274', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-415058274', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '11', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=730a9e74-900e-49b2-a5c3-043d6da1a52b) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.203 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 in datapath 4a7886e9-2920-46a8-89e7-811c01f2e7c6 bound to our chassis
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.206 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a7886e9-2920-46a8-89e7-811c01f2e7c6
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.222 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fd99ceed-b3f2-4a2c-bce3-558282d3e244]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.223 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4a7886e9-21 in ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.225 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4a7886e9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.225 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[76ad8f51-091e-4edb-b6da-ff3dca847835]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.226 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef64a44-94fb-4111-81d2-4df8dff99c65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.241 2 DEBUG nova.compute.manager [req-61628e32-01d4-48e8-bd95-3619bbcee49f req-5b91e394-280a-4491-be12-bb0514b45ab3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Received event network-vif-deleted-35b0bc25-77bf-479e-9d9b-4e20e3a9da8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.242 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[b8633050-4d9a-4d47-afce-dca43bf0ad8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.260 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a1296013-0fb8-48a7-ab34-feb2e61bff39]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.292 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b46df89e-7374-4e68-bfc7-1a88d79e3143]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 systemd-udevd[222421]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.298 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea3ace3-68a6-4f22-8595-cf32151d7723]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 NetworkManager[51724]: <info>  [1759267077.3012] manager: (tap4a7886e9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.336 2 DEBUG nova.compute.manager [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.336 2 DEBUG oslo_concurrency.lockutils [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.336 2 DEBUG oslo_concurrency.lockutils [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.337 2 DEBUG oslo_concurrency.lockutils [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.337 2 DEBUG nova.compute.manager [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.337 2 WARNING nova.compute.manager [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.337 2 DEBUG nova.compute.manager [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-changed-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.337 2 DEBUG nova.compute.manager [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Refreshing instance network info cache due to event network-changed-bb8ecc8e-9cf4-4901-9788-83c49356f983. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.337 2 DEBUG oslo_concurrency.lockutils [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.337 2 DEBUG oslo_concurrency.lockutils [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.338 2 DEBUG nova.network.neutron [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Refreshing network info cache for port bb8ecc8e-9cf4-4901-9788-83c49356f983 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.345 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f3ff0a-081e-4cc0-8407-26f7b94b976d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.349 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[309d4320-38c9-4d61-ac43-d7b3b30bd246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 NetworkManager[51724]: <info>  [1759267077.3794] device (tap4a7886e9-20): carrier: link connected
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.388 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9ebb9c2e-1edb-44fd-8c45-010e9c60414e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.411 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2d8637-83ae-4c4d-874f-042f76b2683d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a7886e9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:12:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377305, 'reachable_time': 23734, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222532, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.422 2 INFO nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Post operation of migration started
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.435 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[19504f36-0cf6-4043-8013-eb36ce2f0cda]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:12cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377305, 'tstamp': 377305}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222533, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.456 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[09009f13-a59e-4fe3-bfaf-2a3c7440af32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a7886e9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:12:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377305, 'reachable_time': 23734, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222534, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.507 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[89bdbe88-b0e0-46c1-901d-125f23fc0b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.590 2 DEBUG nova.compute.manager [req-b7954fd7-54db-4315-be8b-f81ff50c433f req-ea7a01ad-78be-4fb8-bac8-cacad82a57f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.590 2 DEBUG oslo_concurrency.lockutils [req-b7954fd7-54db-4315-be8b-f81ff50c433f req-ea7a01ad-78be-4fb8-bac8-cacad82a57f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.591 2 DEBUG oslo_concurrency.lockutils [req-b7954fd7-54db-4315-be8b-f81ff50c433f req-ea7a01ad-78be-4fb8-bac8-cacad82a57f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.591 2 DEBUG oslo_concurrency.lockutils [req-b7954fd7-54db-4315-be8b-f81ff50c433f req-ea7a01ad-78be-4fb8-bac8-cacad82a57f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.591 2 DEBUG nova.compute.manager [req-b7954fd7-54db-4315-be8b-f81ff50c433f req-ea7a01ad-78be-4fb8-bac8-cacad82a57f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.591 2 DEBUG nova.compute.manager [req-b7954fd7-54db-4315-be8b-f81ff50c433f req-ea7a01ad-78be-4fb8-bac8-cacad82a57f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.604 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b83d69ee-e61f-430d-a9b7-a7016b05c1d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.606 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a7886e9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.606 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.607 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a7886e9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:57 compute-1 kernel: tap4a7886e9-20: entered promiscuous mode
Sep 30 21:17:57 compute-1 NetworkManager[51724]: <info>  [1759267077.6449] manager: (tap4a7886e9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.652 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a7886e9-20, col_values=(('external_ids', {'iface-id': '976cb173-259a-473b-830a-8c627acdbeaf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-1 ovn_controller[94902]: 2025-09-30T21:17:57Z|00092|binding|INFO|Releasing lport 976cb173-259a-473b-830a-8c627acdbeaf from this chassis (sb_readonly=0)
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.674 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4a7886e9-2920-46a8-89e7-811c01f2e7c6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4a7886e9-2920-46a8-89e7-811c01f2e7c6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.675 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a9886e88-39f0-4647-b750-55ba76a18def]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.675 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-4a7886e9-2920-46a8-89e7-811c01f2e7c6
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/4a7886e9-2920-46a8-89e7-811c01f2e7c6.pid.haproxy
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 4a7886e9-2920-46a8-89e7-811c01f2e7c6
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:17:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:57.676 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'env', 'PROCESS_TAG=haproxy-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4a7886e9-2920-46a8-89e7-811c01f2e7c6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.910 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.911 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:57 compute-1 nova_compute[192795]: 2025-09-30 21:17:57.912 2 DEBUG nova.network.neutron [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:58 compute-1 podman[222569]: 2025-09-30 21:17:58.080549157 +0000 UTC m=+0.055660956 container create 927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.107 2 DEBUG nova.network.neutron [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Activated binding for port bb8ecc8e-9cf4-4901-9788-83c49356f983 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.108 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.109 2 DEBUG nova.virt.libvirt.vif [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:46Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.109 2 DEBUG nova.network.os_vif_util [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.110 2 DEBUG nova.network.os_vif_util [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.110 2 DEBUG os_vif [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.112 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb8ecc8e-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.117 2 INFO os_vif [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c')
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.118 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.118 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.118 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.118 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.119 2 INFO nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Deleting instance files /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c_del
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.119 2 INFO nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Deletion of /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c_del complete
Sep 30 21:17:58 compute-1 systemd[1]: Started libpod-conmon-927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66.scope.
Sep 30 21:17:58 compute-1 podman[222569]: 2025-09-30 21:17:58.050559435 +0000 UTC m=+0.025671254 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:17:58 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:17:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8555f7d13cdd8a15881a903b41a32c4aecbda579e4c16d157b51c605f61e8f80/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:17:58 compute-1 podman[222569]: 2025-09-30 21:17:58.187474307 +0000 UTC m=+0.162586126 container init 927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:58 compute-1 podman[222569]: 2025-09-30 21:17:58.193385577 +0000 UTC m=+0.168497376 container start 927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:17:58 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [NOTICE]   (222589) : New worker (222591) forked
Sep 30 21:17:58 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [NOTICE]   (222589) : Loading success.
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.265 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 730a9e74-900e-49b2-a5c3-043d6da1a52b in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.268 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.283 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e96b7ee-a696-4b07-8463-c639398efa64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.314 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[63e8601e-1b3f-4b88-bb60-11c578fe424a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.319 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e2bb20ea-b3c2-467e-b5fd-0538c84988f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.356 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f9c55072-b70b-4a09-9023-0a4ffef16215]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.375 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8c83a0c2-eb07-44c9-a5bb-f1dcc65a7582]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 43, 'tx_packets': 5, 'rx_bytes': 2852, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 43, 'tx_packets': 5, 'rx_bytes': 2852, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371395, 'reachable_time': 22266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 18, 'inoctets': 1460, 'indelivers': 6, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 18, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1460, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 18, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 6, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222605, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.404 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e994783f-1340-4532-9c10-e24f17e43348]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap16d40025-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371410, 'tstamp': 371410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222606, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap16d40025-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371414, 'tstamp': 371414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222606, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.406 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-1 nova_compute[192795]: 2025-09-30 21:17:58.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.411 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d40025-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.411 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.411 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16d40025-10, col_values=(('external_ids', {'iface-id': '0c66892e-7baf-4f9a-a329-dd0545dbf700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:17:58.411 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.750 2 DEBUG nova.compute.manager [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.751 2 DEBUG oslo_concurrency.lockutils [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.751 2 DEBUG oslo_concurrency.lockutils [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.751 2 DEBUG oslo_concurrency.lockutils [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.751 2 DEBUG nova.compute.manager [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.752 2 WARNING nova.compute.manager [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.752 2 DEBUG nova.compute.manager [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.752 2 DEBUG oslo_concurrency.lockutils [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.752 2 DEBUG oslo_concurrency.lockutils [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.752 2 DEBUG oslo_concurrency.lockutils [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.753 2 DEBUG nova.compute.manager [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.753 2 WARNING nova.compute.manager [req-506e3234-1ce8-483c-9f55-7e91729a5995 req-fea71de5-db72-40d6-b089-728063064909 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.764 2 DEBUG nova.network.neutron [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updated VIF entry in instance network info cache for port bb8ecc8e-9cf4-4901-9788-83c49356f983. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.765 2 DEBUG nova.network.neutron [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:59 compute-1 nova_compute[192795]: 2025-09-30 21:17:59.796 2 DEBUG oslo_concurrency.lockutils [req-bdde3e13-cdbe-4ca6-a3dd-fb57b87910dc req-bb3a9f5c-e8d4-4c37-b8af-4c0c642a9b8c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:00 compute-1 nova_compute[192795]: 2025-09-30 21:18:00.106 2 DEBUG nova.network.neutron [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating instance_info_cache with network_info: [{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:00 compute-1 nova_compute[192795]: 2025-09-30 21:18:00.154 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:00 compute-1 nova_compute[192795]: 2025-09-30 21:18:00.190 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:00 compute-1 nova_compute[192795]: 2025-09-30 21:18:00.190 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:00 compute-1 nova_compute[192795]: 2025-09-30 21:18:00.190 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:00 compute-1 nova_compute[192795]: 2025-09-30 21:18:00.194 2 INFO nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 21:18:00 compute-1 virtqemud[192217]: Domain id=8 name='instance-0000000c' uuid=b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 is tainted: custom-monitor
Sep 30 21:18:00 compute-1 podman[222607]: 2025-09-30 21:18:00.253354114 +0000 UTC m=+0.074757362 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:18:01 compute-1 nova_compute[192795]: 2025-09-30 21:18:01.205 2 INFO nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 21:18:01 compute-1 nova_compute[192795]: 2025-09-30 21:18:01.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.072 2 DEBUG nova.compute.manager [req-c15de830-7343-423d-80ce-ab37462db27b req-2b9e0a53-2a2d-4a22-829a-25328938ff62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.072 2 DEBUG oslo_concurrency.lockutils [req-c15de830-7343-423d-80ce-ab37462db27b req-2b9e0a53-2a2d-4a22-829a-25328938ff62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.073 2 DEBUG oslo_concurrency.lockutils [req-c15de830-7343-423d-80ce-ab37462db27b req-2b9e0a53-2a2d-4a22-829a-25328938ff62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.073 2 DEBUG oslo_concurrency.lockutils [req-c15de830-7343-423d-80ce-ab37462db27b req-2b9e0a53-2a2d-4a22-829a-25328938ff62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.073 2 DEBUG nova.compute.manager [req-c15de830-7343-423d-80ce-ab37462db27b req-2b9e0a53-2a2d-4a22-829a-25328938ff62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.074 2 WARNING nova.compute.manager [req-c15de830-7343-423d-80ce-ab37462db27b req-2b9e0a53-2a2d-4a22-829a-25328938ff62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.211 2 INFO nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.218 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:02 compute-1 nova_compute[192795]: 2025-09-30 21:18:02.251 2 DEBUG nova.objects.instance [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:18:02 compute-1 podman[222627]: 2025-09-30 21:18:02.257457011 +0000 UTC m=+0.098464294 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Sep 30 21:18:02 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:18:02 compute-1 systemd[222327]: Activating special unit Exit the Session...
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped target Main User Target.
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped target Basic System.
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped target Paths.
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped target Sockets.
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped target Timers.
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:18:02 compute-1 systemd[222327]: Closed D-Bus User Message Bus Socket.
Sep 30 21:18:02 compute-1 systemd[222327]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:18:02 compute-1 systemd[222327]: Removed slice User Application Slice.
Sep 30 21:18:02 compute-1 systemd[222327]: Reached target Shutdown.
Sep 30 21:18:02 compute-1 systemd[222327]: Finished Exit the Session.
Sep 30 21:18:02 compute-1 systemd[222327]: Reached target Exit the Session.
Sep 30 21:18:02 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:18:02 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:18:02 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:18:02 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:18:02 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:18:02 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:18:02 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.148 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.149 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.150 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.174 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.175 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.175 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.175 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.250 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.315 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.317 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.384 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.390 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.473 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.475 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.535 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.705 2 WARNING nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.707 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5389MB free_disk=73.40472412109375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.707 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.708 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.754 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Applying migration context for instance b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 as it has an incoming, in-progress migration 75c88f2a-0d52-4e9d-8432-8454d77751e5. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.754 2 DEBUG nova.objects.instance [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.755 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Migration for instance 7f0e9e16-1467-41e5-b5b0-965591aa014c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.755 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.779 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.816 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Instance 252d5457-8837-4aa6-b309-c3139e8db7ed actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.816 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Instance b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.817 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Migration f0ac7ef9-959c-499f-b60b-f101fa97d660 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.817 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.817 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.938 2 DEBUG nova.compute.provider_tree [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:03 compute-1 nova_compute[192795]: 2025-09-30 21:18:03.961 2 DEBUG nova.scheduler.client.report [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:04 compute-1 nova_compute[192795]: 2025-09-30 21:18:04.004 2 DEBUG nova.compute.resource_tracker [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:18:04 compute-1 nova_compute[192795]: 2025-09-30 21:18:04.005 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:04 compute-1 nova_compute[192795]: 2025-09-30 21:18:04.021 2 INFO nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Sep 30 21:18:04 compute-1 nova_compute[192795]: 2025-09-30 21:18:04.166 2 INFO nova.scheduler.client.report [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Deleted allocation for migration f0ac7ef9-959c-499f-b60b-f101fa97d660
Sep 30 21:18:04 compute-1 nova_compute[192795]: 2025-09-30 21:18:04.166 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Sep 30 21:18:04 compute-1 podman[222664]: 2025-09-30 21:18:04.229624553 +0000 UTC m=+0.065477301 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.223 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.223 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.224 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.224 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.224 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.235 2 INFO nova.compute.manager [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Terminating instance
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.244 2 DEBUG nova.compute.manager [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:18:05 compute-1 kernel: tap730a9e74-90 (unregistering): left promiscuous mode
Sep 30 21:18:05 compute-1 NetworkManager[51724]: <info>  [1759267085.2660] device (tap730a9e74-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00093|binding|INFO|Releasing lport 730a9e74-900e-49b2-a5c3-043d6da1a52b from this chassis (sb_readonly=0)
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00094|binding|INFO|Setting lport 730a9e74-900e-49b2-a5c3-043d6da1a52b down in Southbound
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00095|binding|INFO|Releasing lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 from this chassis (sb_readonly=0)
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00096|binding|INFO|Setting lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 down in Southbound
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00097|binding|INFO|Removing iface tap730a9e74-90 ovn-installed in OVS
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.284 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e2:ac 19.80.0.112'], port_security=['fa:16:3e:28:e2:ac 19.80.0.112'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['730a9e74-900e-49b2-a5c3-043d6da1a52b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1118581425', 'neutron:cidrs': '19.80.0.112/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1118581425', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '5', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=10e37694-4797-470f-adb8-72e2aa69e8d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ece0208-0151-4f0a-bda7-acd45fe4f2a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.286 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:47:71 10.100.0.11'], port_security=['fa:16:3e:72:47:71 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-415058274', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-415058274', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '13', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=730a9e74-900e-49b2-a5c3-043d6da1a52b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.287 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 in datapath 4a7886e9-2920-46a8-89e7-811c01f2e7c6 unbound from our chassis
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.289 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a7886e9-2920-46a8-89e7-811c01f2e7c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00098|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00099|binding|INFO|Releasing lport 976cb173-259a-473b-830a-8c627acdbeaf from this chassis (sb_readonly=0)
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.292 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c0644f0b-bd1a-4ce9-a8f8-350e0883cca5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.293 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 namespace which is not needed anymore
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [NOTICE]   (222589) : haproxy version is 2.8.14-c23fe91
Sep 30 21:18:05 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [NOTICE]   (222589) : path to executable is /usr/sbin/haproxy
Sep 30 21:18:05 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [WARNING]  (222589) : Exiting Master process...
Sep 30 21:18:05 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [WARNING]  (222589) : Exiting Master process...
Sep 30 21:18:05 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [ALERT]    (222589) : Current worker (222591) exited with code 143 (Terminated)
Sep 30 21:18:05 compute-1 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[222585]: [WARNING]  (222589) : All workers exited. Exiting... (0)
Sep 30 21:18:05 compute-1 systemd[1]: libpod-927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66.scope: Deactivated successfully.
Sep 30 21:18:05 compute-1 podman[222711]: 2025-09-30 21:18:05.424569982 +0000 UTC m=+0.047946508 container died 927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:18:05 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Sep 30 21:18:05 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Consumed 2.329s CPU time.
Sep 30 21:18:05 compute-1 systemd-machined[152783]: Machine qemu-8-instance-0000000c terminated.
Sep 30 21:18:05 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66-userdata-shm.mount: Deactivated successfully.
Sep 30 21:18:05 compute-1 systemd[1]: var-lib-containers-storage-overlay-8555f7d13cdd8a15881a903b41a32c4aecbda579e4c16d157b51c605f61e8f80-merged.mount: Deactivated successfully.
Sep 30 21:18:05 compute-1 podman[222711]: 2025-09-30 21:18:05.594477496 +0000 UTC m=+0.217854022 container cleanup 927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00100|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:18:05 compute-1 ovn_controller[94902]: 2025-09-30T21:18:05Z|00101|binding|INFO|Releasing lport 976cb173-259a-473b-830a-8c627acdbeaf from this chassis (sb_readonly=0)
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 systemd[1]: libpod-conmon-927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66.scope: Deactivated successfully.
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.611 2 INFO nova.virt.libvirt.driver [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Instance destroyed successfully.
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.612 2 DEBUG nova.objects.instance [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lazy-loading 'resources' on Instance uuid b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.641 2 DEBUG nova.virt.libvirt.vif [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2078302607',display_name='tempest-LiveMigrationTest-server-2078302607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-2078302607',id=12,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-1871um1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:18:02Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.641 2 DEBUG nova.network.os_vif_util [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converting VIF {"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.642 2 DEBUG nova.network.os_vif_util [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.642 2 DEBUG os_vif [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.644 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap730a9e74-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.651 2 INFO os_vif [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90')
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.652 2 INFO nova.virt.libvirt.driver [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Deleting instance files /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65_del
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.652 2 INFO nova.virt.libvirt.driver [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Deletion of /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65_del complete
Sep 30 21:18:05 compute-1 podman[222760]: 2025-09-30 21:18:05.671410776 +0000 UTC m=+0.048625466 container remove 927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.676 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[88cb0339-a058-4a5c-bcbe-e4b2d5cc47af]: (4, ('Tue Sep 30 09:18:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 (927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66)\n927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66\nTue Sep 30 09:18:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 (927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66)\n927f101adccea0fbf9cc26dd9135da33ba75b2ae32099a2c567125bdb564bb66\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.678 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f138c1-8f8a-4694-ac5a-9603ff8dcc61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.679 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a7886e9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 kernel: tap4a7886e9-20: left promiscuous mode
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.700 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[59c17bed-bdd2-44ba-84f2-0069fb45ba1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.727 2 INFO nova.compute.manager [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Took 0.48 seconds to destroy the instance on the hypervisor.
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.727 2 DEBUG oslo.service.loopingcall [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.728 2 DEBUG nova.compute.manager [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.728 2 DEBUG nova.network.neutron [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.728 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[028af891-c8df-4607-bd4f-66b7ab07c2c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.729 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b585595f-46cf-43e1-996f-b665e43d198b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.749 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[477a5fa1-1688-44fb-8929-e68a44f388e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377295, 'reachable_time': 38731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222775, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 systemd[1]: run-netns-ovnmeta\x2d4a7886e9\x2d2920\x2d46a8\x2d89e7\x2d811c01f2e7c6.mount: Deactivated successfully.
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.752 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.753 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[87a983cd-8c10-4a6c-8eae-63825736a91f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.753 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 730a9e74-900e-49b2-a5c3-043d6da1a52b in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.755 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.776 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc55db4-6efa-49d9-aa44-3567b5e4a880]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.807 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[cb5ad823-d96a-4807-a05c-585a66edbfbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.811 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[333787e9-320f-4db8-b11b-01061f6d3296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.852 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3839d1-cea1-496b-9522-ed02825d81a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.879 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[29205612-3ba6-41d3-aa32-de974159f1f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 58, 'tx_packets': 7, 'rx_bytes': 3482, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 58, 'tx_packets': 7, 'rx_bytes': 3482, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371395, 'reachable_time': 22266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 18, 'inoctets': 1460, 'indelivers': 6, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 18, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1460, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 18, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 6, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222782, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.902 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[98e2efd6-d1c7-4210-bfdd-2aeb583494c2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap16d40025-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371410, 'tstamp': 371410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222783, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap16d40025-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371414, 'tstamp': 371414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222783, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.904 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 nova_compute[192795]: 2025-09-30 21:18:05.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.909 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d40025-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.909 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.910 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16d40025-10, col_values=(('external_ids', {'iface-id': '0c66892e-7baf-4f9a-a329-dd0545dbf700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:05.911 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.354 2 DEBUG nova.compute.manager [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.355 2 DEBUG oslo_concurrency.lockutils [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.355 2 DEBUG oslo_concurrency.lockutils [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.355 2 DEBUG oslo_concurrency.lockutils [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.355 2 DEBUG nova.compute.manager [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.356 2 DEBUG nova.compute.manager [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.356 2 DEBUG nova.compute.manager [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.356 2 DEBUG oslo_concurrency.lockutils [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.356 2 DEBUG oslo_concurrency.lockutils [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.356 2 DEBUG oslo_concurrency.lockutils [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.356 2 DEBUG nova.compute.manager [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.357 2 WARNING nova.compute.manager [req-fff3bb3f-7a3a-459a-8547-b69697f2ab10 req-f30d3f7a-8ddd-49c2-b4b8-e30cbb9d27b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received unexpected event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with vm_state active and task_state deleting.
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.743 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267071.7418582, 709e44da-6758-4be0-9022-83a511ea88eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.743 2 INFO nova.compute.manager [-] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] VM Stopped (Lifecycle Event)
Sep 30 21:18:06 compute-1 nova_compute[192795]: 2025-09-30 21:18:06.764 2 DEBUG nova.compute.manager [None req-96d0400e-47dd-419b-a4f7-d9f6afc6abe4 - - - - - -] [instance: 709e44da-6758-4be0-9022-83a511ea88eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.217 2 DEBUG nova.network.neutron [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.238 2 INFO nova.compute.manager [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Took 2.51 seconds to deallocate network for instance.
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.310 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.310 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.375 2 DEBUG nova.compute.provider_tree [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.390 2 DEBUG nova.scheduler.client.report [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.411 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.436 2 INFO nova.scheduler.client.report [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Deleted allocations for instance b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65
Sep 30 21:18:08 compute-1 nova_compute[192795]: 2025-09-30 21:18:08.561 2 DEBUG oslo_concurrency.lockutils [None req-34625dc4-133c-41d8-9ccb-58ae0a643bef 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:10 compute-1 nova_compute[192795]: 2025-09-30 21:18:10.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:11 compute-1 nova_compute[192795]: 2025-09-30 21:18:11.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:11 compute-1 nova_compute[192795]: 2025-09-30 21:18:11.869 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267076.8672535, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:11 compute-1 nova_compute[192795]: 2025-09-30 21:18:11.869 2 INFO nova.compute.manager [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Stopped (Lifecycle Event)
Sep 30 21:18:11 compute-1 nova_compute[192795]: 2025-09-30 21:18:11.892 2 DEBUG nova.compute.manager [None req-39c11ac6-05ed-4db9-9396-cf5c18f3856c - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.102 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.103 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.103 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.103 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.103 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.113 2 INFO nova.compute.manager [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Terminating instance
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.122 2 DEBUG nova.compute.manager [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:18:12 compute-1 kernel: tap70b5da71-31 (unregistering): left promiscuous mode
Sep 30 21:18:12 compute-1 NetworkManager[51724]: <info>  [1759267092.1496] device (tap70b5da71-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-1 ovn_controller[94902]: 2025-09-30T21:18:12Z|00102|binding|INFO|Releasing lport 70b5da71-314a-4c92-9db2-fb08b57a6736 from this chassis (sb_readonly=0)
Sep 30 21:18:12 compute-1 ovn_controller[94902]: 2025-09-30T21:18:12Z|00103|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 down in Southbound
Sep 30 21:18:12 compute-1 ovn_controller[94902]: 2025-09-30T21:18:12Z|00104|binding|INFO|Removing iface tap70b5da71-31 ovn-installed in OVS
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.168 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:31:8d 10.100.0.7'], port_security=['fa:16:3e:a9:31:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '23', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=70b5da71-314a-4c92-9db2-fb08b57a6736) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.170 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.171 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16d40025-1087-460f-a42f-c007f6eff406, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.173 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9af46c36-e677-415a-a72e-de36787c85cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.174 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace which is not needed anymore
Sep 30 21:18:12 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000005.scope: Deactivated successfully.
Sep 30 21:18:12 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000005.scope: Consumed 5.195s CPU time.
Sep 30 21:18:12 compute-1 systemd-machined[152783]: Machine qemu-4-instance-00000005 terminated.
Sep 30 21:18:12 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221465]: [NOTICE]   (221469) : haproxy version is 2.8.14-c23fe91
Sep 30 21:18:12 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221465]: [NOTICE]   (221469) : path to executable is /usr/sbin/haproxy
Sep 30 21:18:12 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221465]: [WARNING]  (221469) : Exiting Master process...
Sep 30 21:18:12 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221465]: [ALERT]    (221469) : Current worker (221471) exited with code 143 (Terminated)
Sep 30 21:18:12 compute-1 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221465]: [WARNING]  (221469) : All workers exited. Exiting... (0)
Sep 30 21:18:12 compute-1 systemd[1]: libpod-5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5.scope: Deactivated successfully.
Sep 30 21:18:12 compute-1 podman[222809]: 2025-09-30 21:18:12.299744091 +0000 UTC m=+0.042031787 container died 5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:18:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5-userdata-shm.mount: Deactivated successfully.
Sep 30 21:18:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-77042a2d836d8793894449ee2cc50aecbdfb2eacae855148d8250178088f11c3-merged.mount: Deactivated successfully.
Sep 30 21:18:12 compute-1 podman[222809]: 2025-09-30 21:18:12.335316723 +0000 UTC m=+0.077604429 container cleanup 5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:18:12 compute-1 systemd[1]: libpod-conmon-5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5.scope: Deactivated successfully.
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.389 2 INFO nova.virt.libvirt.driver [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Instance destroyed successfully.
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.389 2 DEBUG nova.objects.instance [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lazy-loading 'resources' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.404 2 DEBUG nova.compute.manager [req-35bfa4f9-4880-4be1-bb33-7907927f0354 req-a5ef7143-ea13-42da-91a4-5d5eb67d10f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.404 2 DEBUG oslo_concurrency.lockutils [req-35bfa4f9-4880-4be1-bb33-7907927f0354 req-a5ef7143-ea13-42da-91a4-5d5eb67d10f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.405 2 DEBUG oslo_concurrency.lockutils [req-35bfa4f9-4880-4be1-bb33-7907927f0354 req-a5ef7143-ea13-42da-91a4-5d5eb67d10f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.405 2 DEBUG oslo_concurrency.lockutils [req-35bfa4f9-4880-4be1-bb33-7907927f0354 req-a5ef7143-ea13-42da-91a4-5d5eb67d10f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.406 2 DEBUG nova.compute.manager [req-35bfa4f9-4880-4be1-bb33-7907927f0354 req-a5ef7143-ea13-42da-91a4-5d5eb67d10f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.406 2 DEBUG nova.compute.manager [req-35bfa4f9-4880-4be1-bb33-7907927f0354 req-a5ef7143-ea13-42da-91a4-5d5eb67d10f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.411 2 DEBUG nova.virt.libvirt.vif [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:02Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.411 2 DEBUG nova.network.os_vif_util [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.412 2 DEBUG nova.network.os_vif_util [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.412 2 DEBUG os_vif [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:18:12 compute-1 podman[222855]: 2025-09-30 21:18:12.413228149 +0000 UTC m=+0.051059551 container remove 5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.414 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70b5da71-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.420 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c56d2d60-d314-49ff-9761-6bced47b14cd]: (4, ('Tue Sep 30 09:18:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5)\n5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5\nTue Sep 30 09:18:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5)\n5eb9032c90f60bc970b6578bb66cad5e8da03356eb080c20150cdf22670078f5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.422 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae54950-ba3b-46fa-89c6-c6f9865e8892]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.421 2 INFO os_vif [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31')
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.421 2 INFO nova.virt.libvirt.driver [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Deleting instance files /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed_del
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.422 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.422 2 INFO nova.virt.libvirt.driver [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Deletion of /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed_del complete
Sep 30 21:18:12 compute-1 kernel: tap16d40025-10: left promiscuous mode
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-1 podman[222826]: 2025-09-30 21:18:12.435893002 +0000 UTC m=+0.107121727 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.437 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7cbc60-ff24-4db2-8de9-f11929a25add]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.472 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e14809d6-0044-4e93-a549-6536b190b747]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.474 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd08683-0e54-41a9-b1ca-0c0c32b069f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.496 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2d13b35c-9d9e-45ca-ad89-db7572e71674]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371385, 'reachable_time': 19554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222891, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.497 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:18:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:12.498 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[feebc3e8-ca07-46c0-a379-c4a166b5f9a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:12 compute-1 systemd[1]: run-netns-ovnmeta\x2d16d40025\x2d1087\x2d460f\x2da42f\x2dc007f6eff406.mount: Deactivated successfully.
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.552 2 INFO nova.compute.manager [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.554 2 DEBUG oslo.service.loopingcall [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.554 2 DEBUG nova.compute.manager [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:18:12 compute-1 nova_compute[192795]: 2025-09-30 21:18:12.554 2 DEBUG nova.network.neutron [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.257 2 DEBUG nova.network.neutron [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.278 2 INFO nova.compute.manager [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Took 0.72 seconds to deallocate network for instance.
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.358 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.359 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.389 2 DEBUG nova.compute.manager [req-5faefa14-b245-4f58-aaa0-981c3f4b2288 req-4463f252-0b84-4ad6-b1ad-525e8f138b65 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-deleted-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.404 2 DEBUG nova.compute.provider_tree [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.418 2 DEBUG nova.scheduler.client.report [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.436 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.474 2 INFO nova.scheduler.client.report [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Deleted allocations for instance 252d5457-8837-4aa6-b309-c3139e8db7ed
Sep 30 21:18:13 compute-1 nova_compute[192795]: 2025-09-30 21:18:13.567 2 DEBUG oslo_concurrency.lockutils [None req-1256f65b-5fc9-4f74-9b86-0efad9e686bf 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.604 2 DEBUG nova.compute.manager [req-80c65e8f-609e-4837-a39b-3a6562d8a092 req-7dafca0d-171b-40b7-be64-45db0e6b48b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.604 2 DEBUG oslo_concurrency.lockutils [req-80c65e8f-609e-4837-a39b-3a6562d8a092 req-7dafca0d-171b-40b7-be64-45db0e6b48b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.605 2 DEBUG oslo_concurrency.lockutils [req-80c65e8f-609e-4837-a39b-3a6562d8a092 req-7dafca0d-171b-40b7-be64-45db0e6b48b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.605 2 DEBUG oslo_concurrency.lockutils [req-80c65e8f-609e-4837-a39b-3a6562d8a092 req-7dafca0d-171b-40b7-be64-45db0e6b48b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.605 2 DEBUG nova.compute.manager [req-80c65e8f-609e-4837-a39b-3a6562d8a092 req-7dafca0d-171b-40b7-be64-45db0e6b48b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.605 2 WARNING nova.compute.manager [req-80c65e8f-609e-4837-a39b-3a6562d8a092 req-7dafca0d-171b-40b7-be64-45db0e6b48b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state deleted and task_state None.
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.809 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.809 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.828 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.939 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.939 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.962 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:18:14 compute-1 nova_compute[192795]: 2025-09-30 21:18:14.963 2 INFO nova.compute.claims [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.110 2 DEBUG nova.compute.provider_tree [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.132 2 DEBUG nova.scheduler.client.report [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.161 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.162 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.226 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.226 2 DEBUG nova.network.neutron [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.251 2 INFO nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.266 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.411 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.413 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.414 2 INFO nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Creating image(s)
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.415 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.415 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.416 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.428 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.484 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.485 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.486 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.496 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.552 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.553 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.584 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.585 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.585 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.651 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.653 2 DEBUG nova.virt.disk.api [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Checking if we can resize image /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.653 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.715 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.717 2 DEBUG nova.virt.disk.api [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Cannot resize image /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.717 2 DEBUG nova.objects.instance [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lazy-loading 'migration_context' on Instance uuid 800f4413-c978-4c4e-97b6-1ea1e45f9f17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.741 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.742 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Ensure instance console log exists: /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.743 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.743 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.744 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:15 compute-1 nova_compute[192795]: 2025-09-30 21:18:15.768 2 DEBUG nova.policy [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:18:16 compute-1 nova_compute[192795]: 2025-09-30 21:18:16.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.633 2 DEBUG nova.network.neutron [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Successfully updated port: c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.686 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.686 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquired lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.686 2 DEBUG nova.network.neutron [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.945 2 DEBUG nova.compute.manager [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-changed-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.945 2 DEBUG nova.compute.manager [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Refreshing instance network info cache due to event network-changed-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:18:17 compute-1 nova_compute[192795]: 2025-09-30 21:18:17.946 2 DEBUG oslo_concurrency.lockutils [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:18:18 compute-1 nova_compute[192795]: 2025-09-30 21:18:18.104 2 DEBUG nova.network.neutron [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:18:19 compute-1 podman[222908]: 2025-09-30 21:18:19.223704464 +0000 UTC m=+0.055309851 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:18:19 compute-1 podman[222907]: 2025-09-30 21:18:19.252460179 +0000 UTC m=+0.077529490 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.610 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267085.6052053, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.610 2 INFO nova.compute.manager [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Stopped (Lifecycle Event)
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.633 2 DEBUG nova.compute.manager [None req-0aab87de-f255-40a3-bd2e-8a8916bf52f4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.853 2 DEBUG nova.network.neutron [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating instance_info_cache with network_info: [{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.882 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Releasing lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.882 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Instance network_info: |[{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.882 2 DEBUG oslo_concurrency.lockutils [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.883 2 DEBUG nova.network.neutron [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Refreshing network info cache for port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.886 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Start _get_guest_xml network_info=[{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.890 2 WARNING nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.905 2 DEBUG nova.virt.libvirt.host [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.906 2 DEBUG nova.virt.libvirt.host [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.912 2 DEBUG nova.virt.libvirt.host [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.913 2 DEBUG nova.virt.libvirt.host [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.914 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.915 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.915 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.915 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.915 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.916 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.916 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.916 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.916 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.916 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.917 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.917 2 DEBUG nova.virt.hardware [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.920 2 DEBUG nova.virt.libvirt.vif [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:18:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-943144079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-943144079',id=15,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-4e04clu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:18:15Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=800f4413-c978-4c4e-97b6-1ea1e45f9f17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.921 2 DEBUG nova.network.os_vif_util [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converting VIF {"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.922 2 DEBUG nova.network.os_vif_util [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.922 2 DEBUG nova.objects.instance [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lazy-loading 'pci_devices' on Instance uuid 800f4413-c978-4c4e-97b6-1ea1e45f9f17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.935 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <uuid>800f4413-c978-4c4e-97b6-1ea1e45f9f17</uuid>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <name>instance-0000000f</name>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-943144079</nova:name>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:18:20</nova:creationTime>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:user uuid="981e96ea2bc2419d9a1e57d6aed70304">tempest-LiveAutoBlockMigrationV225Test-860972404-project-member</nova:user>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:project uuid="544a33c53701466d8bf7e8ed34f38dcb">tempest-LiveAutoBlockMigrationV225Test-860972404</nova:project>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         <nova:port uuid="c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7">
Sep 30 21:18:20 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <system>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <entry name="serial">800f4413-c978-4c4e-97b6-1ea1e45f9f17</entry>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <entry name="uuid">800f4413-c978-4c4e-97b6-1ea1e45f9f17</entry>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </system>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <os>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   </os>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <features>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   </features>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:82:04:bd"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <target dev="tapc6fd9c21-8c"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/console.log" append="off"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <video>
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </video>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:18:20 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:18:20 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:18:20 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:18:20 compute-1 nova_compute[192795]: </domain>
Sep 30 21:18:20 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.936 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Preparing to wait for external event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.936 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.936 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.937 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.937 2 DEBUG nova.virt.libvirt.vif [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:18:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-943144079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-943144079',id=15,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-4e04clu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:18:15Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=800f4413-c978-4c4e-97b6-1ea1e45f9f17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.938 2 DEBUG nova.network.os_vif_util [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converting VIF {"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.938 2 DEBUG nova.network.os_vif_util [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.939 2 DEBUG os_vif [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.940 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.940 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.942 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6fd9c21-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.943 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6fd9c21-8c, col_values=(('external_ids', {'iface-id': 'c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:04:bd', 'vm-uuid': '800f4413-c978-4c4e-97b6-1ea1e45f9f17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:20 compute-1 NetworkManager[51724]: <info>  [1759267100.9453] manager: (tapc6fd9c21-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:20 compute-1 nova_compute[192795]: 2025-09-30 21:18:20.950 2 INFO os_vif [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c')
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.020 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.020 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.020 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] No VIF found with MAC fa:16:3e:82:04:bd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.021 2 INFO nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Using config drive
Sep 30 21:18:21 compute-1 podman[222953]: 2025-09-30 21:18:21.077839059 +0000 UTC m=+0.100095757 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.756 2 INFO nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Creating config drive at /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.761 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr_70yd1j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.887 2 DEBUG oslo_concurrency.processutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr_70yd1j" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:21 compute-1 kernel: tapc6fd9c21-8c: entered promiscuous mode
Sep 30 21:18:21 compute-1 NetworkManager[51724]: <info>  [1759267101.9502] manager: (tapc6fd9c21-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Sep 30 21:18:21 compute-1 ovn_controller[94902]: 2025-09-30T21:18:21Z|00105|binding|INFO|Claiming lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for this chassis.
Sep 30 21:18:21 compute-1 ovn_controller[94902]: 2025-09-30T21:18:21Z|00106|binding|INFO|c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7: Claiming fa:16:3e:82:04:bd 10.100.0.4
Sep 30 21:18:21 compute-1 ovn_controller[94902]: 2025-09-30T21:18:21Z|00107|binding|INFO|Claiming lport 7ce813d5-5e22-4733-a15e-f9d5223072eb for this chassis.
Sep 30 21:18:21 compute-1 ovn_controller[94902]: 2025-09-30T21:18:21Z|00108|binding|INFO|7ce813d5-5e22-4733-a15e-f9d5223072eb: Claiming fa:16:3e:dd:ce:7d 19.80.0.205
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:21 compute-1 nova_compute[192795]: 2025-09-30 21:18:21.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:21 compute-1 systemd-udevd[222996]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:18:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:21.979 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:ce:7d 19.80.0.205'], port_security=['fa:16:3e:dd:ce:7d 19.80.0.205'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-521832462', 'neutron:cidrs': '19.80.0.205/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-383de3b7-a202-49cd-b129-b065ac294878', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-521832462', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d890b95b-932f-4c74-902c-ed705814144b, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ce813d5-5e22-4733-a15e-f9d5223072eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:21.981 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:04:bd 10.100.0.4'], port_security=['fa:16:3e:82:04:bd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-280178662', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-280178662', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:21.982 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 7ce813d5-5e22-4733-a15e-f9d5223072eb in datapath 383de3b7-a202-49cd-b129-b065ac294878 bound to our chassis
Sep 30 21:18:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:21.984 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 383de3b7-a202-49cd-b129-b065ac294878
Sep 30 21:18:21 compute-1 NetworkManager[51724]: <info>  [1759267101.9959] device (tapc6fd9c21-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:18:21 compute-1 NetworkManager[51724]: <info>  [1759267101.9970] device (tapc6fd9c21-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:18:21 compute-1 systemd-machined[152783]: New machine qemu-9-instance-0000000f.
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:21.999 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3026f6-e4fe-48fc-9d19-81913683ae54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.000 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap383de3b7-a1 in ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.002 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap383de3b7-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.002 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[41d21fd0-3b0c-4985-81e0-6a10c8191f3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.003 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e3709024-cc2d-4430-a835-ac85dea42e6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.015 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[ecdaa81c-b9a8-4a1f-a948-68c1ee27cc2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-0000000f.
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.037 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5a53f005-5f82-4e07-b7b7-b87d3c4c06fb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 nova_compute[192795]: 2025-09-30 21:18:22.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-1 ovn_controller[94902]: 2025-09-30T21:18:22Z|00109|binding|INFO|Setting lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 ovn-installed in OVS
Sep 30 21:18:22 compute-1 ovn_controller[94902]: 2025-09-30T21:18:22Z|00110|binding|INFO|Setting lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 up in Southbound
Sep 30 21:18:22 compute-1 ovn_controller[94902]: 2025-09-30T21:18:22Z|00111|binding|INFO|Setting lport 7ce813d5-5e22-4733-a15e-f9d5223072eb up in Southbound
Sep 30 21:18:22 compute-1 nova_compute[192795]: 2025-09-30 21:18:22.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.069 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[aa994975-2b78-492b-a970-0db3f5b8bdfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.075 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bd186df7-1d78-41ac-a8e9-f29681a53f73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 NetworkManager[51724]: <info>  [1759267102.0778] manager: (tap383de3b7-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.116 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[03e20886-00db-4451-84d9-a8f5d3fd641b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.120 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2873c525-4640-4da6-b066-460eede7c408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 NetworkManager[51724]: <info>  [1759267102.1475] device (tap383de3b7-a0): carrier: link connected
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.154 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[358b89d8-3cb9-496e-ae38-4d80a8a9f484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.175 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d2dfdc7d-e722-4d59-8b3d-a9850648bace]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap383de3b7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:58:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379782, 'reachable_time': 43180, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223030, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.196 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f87ef4e1-6714-4494-89e1-14fb3c2dab48]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febe:58ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379782, 'tstamp': 379782}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223031, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.218 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb1cf0a-852f-44ba-9b3a-98433a46322a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap383de3b7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:58:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379782, 'reachable_time': 43180, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223032, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.256 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1730b505-4b22-4cc5-aacf-ec7c1f74df0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.340 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[193d187f-af8d-4969-a308-9c0a131222bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.342 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap383de3b7-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.342 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.342 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap383de3b7-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:22 compute-1 NetworkManager[51724]: <info>  [1759267102.4048] manager: (tap383de3b7-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Sep 30 21:18:22 compute-1 nova_compute[192795]: 2025-09-30 21:18:22.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-1 kernel: tap383de3b7-a0: entered promiscuous mode
Sep 30 21:18:22 compute-1 nova_compute[192795]: 2025-09-30 21:18:22.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.407 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap383de3b7-a0, col_values=(('external_ids', {'iface-id': 'd9318b69-dfe5-4e14-b762-51aa441e80cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:22 compute-1 nova_compute[192795]: 2025-09-30 21:18:22.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-1 ovn_controller[94902]: 2025-09-30T21:18:22Z|00112|binding|INFO|Releasing lport d9318b69-dfe5-4e14-b762-51aa441e80cd from this chassis (sb_readonly=0)
Sep 30 21:18:22 compute-1 nova_compute[192795]: 2025-09-30 21:18:22.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.422 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/383de3b7-a202-49cd-b129-b065ac294878.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/383de3b7-a202-49cd-b129-b065ac294878.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.423 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e11d90-ed03-4dcb-93ed-02509c659e95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.423 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-383de3b7-a202-49cd-b129-b065ac294878
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/383de3b7-a202-49cd-b129-b065ac294878.pid.haproxy
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 383de3b7-a202-49cd-b129-b065ac294878
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:18:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:22.424 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'env', 'PROCESS_TAG=haproxy-383de3b7-a202-49cd-b129-b065ac294878', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/383de3b7-a202-49cd-b129-b065ac294878.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:18:22 compute-1 podman[223071]: 2025-09-30 21:18:22.828494489 +0000 UTC m=+0.062864195 container create 2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:18:22 compute-1 systemd[1]: Started libpod-conmon-2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa.scope.
Sep 30 21:18:22 compute-1 podman[223071]: 2025-09-30 21:18:22.795217402 +0000 UTC m=+0.029587088 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:18:22 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:18:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fe63c5a2dfb3c7af3fb0d2f408dc1e151e66cb30e3180b633053acad8ddffb8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:18:22 compute-1 podman[223071]: 2025-09-30 21:18:22.92763817 +0000 UTC m=+0.162007866 container init 2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:18:22 compute-1 podman[223071]: 2025-09-30 21:18:22.935391709 +0000 UTC m=+0.169761405 container start 2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:18:22 compute-1 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[223087]: [NOTICE]   (223103) : New worker (223109) forked
Sep 30 21:18:22 compute-1 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[223087]: [NOTICE]   (223103) : Loading success.
Sep 30 21:18:22 compute-1 podman[223084]: 2025-09-30 21:18:22.983244498 +0000 UTC m=+0.103910121 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.026 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267103.025621, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.026 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Started (Lifecycle Event)
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.030 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.032 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.047 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8c42ab8c-005d-4593-af92-5f5f788513d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.048 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap934fff90-51 in ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.050 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap934fff90-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.050 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[99357954-547d-4f59-9a0d-fb2b9f9d78fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.051 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0c81387d-9273-4ff0-bb0d-f9a568d68586]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.052 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.057 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267103.025825, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.057 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Paused (Lifecycle Event)
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.066 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[d8ddce1f-cc05-46af-9a63-5875321e362a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.081 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.085 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.096 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7693d29b-1613-486f-84a4-2e1abfdcffdd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.124 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.128 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[815ddae0-1e2d-443d-bd41-402f0d5af26b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.133 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[04cb654a-f4cf-45e4-8024-fc3e6db6fb04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 NetworkManager[51724]: <info>  [1759267103.1359] manager: (tap934fff90-50): new Veth device (/org/freedesktop/NetworkManager/Devices/56)
Sep 30 21:18:23 compute-1 systemd-udevd[223015]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.180 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b377c567-fe69-4bfb-a5a6-dbd34b75e3f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.184 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6c6bca09-5e28-4f6b-8d17-5007cc1f8390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 NetworkManager[51724]: <info>  [1759267103.2202] device (tap934fff90-50): carrier: link connected
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.231 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9510f2dd-6fa6-4d49-a9cb-6b67f028331b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.258 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a7de6a76-12e4-4f59-9316-860c43ba49ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379889, 'reachable_time': 27055, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223132, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.275 2 DEBUG nova.compute.manager [req-b4eaaf17-fc52-456f-b34b-232362609cd4 req-687462e1-a1c3-4f31-85f6-82ce2e005b2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.275 2 DEBUG oslo_concurrency.lockutils [req-b4eaaf17-fc52-456f-b34b-232362609cd4 req-687462e1-a1c3-4f31-85f6-82ce2e005b2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.275 2 DEBUG oslo_concurrency.lockutils [req-b4eaaf17-fc52-456f-b34b-232362609cd4 req-687462e1-a1c3-4f31-85f6-82ce2e005b2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.276 2 DEBUG oslo_concurrency.lockutils [req-b4eaaf17-fc52-456f-b34b-232362609cd4 req-687462e1-a1c3-4f31-85f6-82ce2e005b2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.276 2 DEBUG nova.compute.manager [req-b4eaaf17-fc52-456f-b34b-232362609cd4 req-687462e1-a1c3-4f31-85f6-82ce2e005b2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Processing event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.275 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c509a0ea-a4d7-45c9-9eb5-bc58fec2f291]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:3765'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379889, 'tstamp': 379889}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223133, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.277 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.281 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267103.2814708, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.282 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Resumed (Lifecycle Event)
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.283 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.289 2 INFO nova.virt.libvirt.driver [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Instance spawned successfully.
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.289 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.302 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[37b40484-d7d3-43a8-ac12-351f79a2bd77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379889, 'reachable_time': 27055, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223134, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.308 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.315 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.319 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.320 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.320 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.321 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.321 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.322 2 DEBUG nova.virt.libvirt.driver [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.344 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[64768fc6-5643-4d0e-ad2a-448eb01d5805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.345 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.400 2 INFO nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Took 7.99 seconds to spawn the instance on the hypervisor.
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.401 2 DEBUG nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.425 2 DEBUG nova.network.neutron [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updated VIF entry in instance network info cache for port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.426 2 DEBUG nova.network.neutron [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating instance_info_cache with network_info: [{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.431 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[03cf6ce4-1ef9-427f-bfb0-f768f77eb850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.433 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.433 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.434 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap934fff90-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:23 compute-1 NetworkManager[51724]: <info>  [1759267103.4764] manager: (tap934fff90-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Sep 30 21:18:23 compute-1 kernel: tap934fff90-50: entered promiscuous mode
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.478 2 DEBUG oslo_concurrency.lockutils [req-effdaf7c-52af-4ad3-b12c-f85922f3253a req-5a191997-21e8-44ea-b42b-d7b2e056b2ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.481 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap934fff90-50, col_values=(('external_ids', {'iface-id': 'b21a7164-770c-4265-ad15-a3e058ec1a56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:23 compute-1 ovn_controller[94902]: 2025-09-30T21:18:23Z|00113|binding|INFO|Releasing lport b21a7164-770c-4265-ad15-a3e058ec1a56 from this chassis (sb_readonly=0)
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.503 2 INFO nova.compute.manager [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Took 8.61 seconds to build instance.
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.509 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.510 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[da1a316f-cbfc-4d8b-acf5-4a15ad5f5f98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.511 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:18:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:23.511 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'env', 'PROCESS_TAG=haproxy-934fff90-5446-41f1-a5ad-d2568cb337b1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/934fff90-5446-41f1-a5ad-d2568cb337b1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:18:23 compute-1 nova_compute[192795]: 2025-09-30 21:18:23.539 2 DEBUG oslo_concurrency.lockutils [None req-3e13a41e-c0a2-44b9-9e78-6fd7651cc781 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:23 compute-1 podman[223167]: 2025-09-30 21:18:23.941425235 +0000 UTC m=+0.068383833 container create b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:18:23 compute-1 systemd[1]: Started libpod-conmon-b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615.scope.
Sep 30 21:18:23 compute-1 podman[223167]: 2025-09-30 21:18:23.905149888 +0000 UTC m=+0.032108566 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:18:24 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:18:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f3f866fbd47f818615f665dd4595141b52cfa49b059ff72499baf961388f9cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:18:24 compute-1 podman[223167]: 2025-09-30 21:18:24.070205005 +0000 UTC m=+0.197163613 container init b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:18:24 compute-1 podman[223167]: 2025-09-30 21:18:24.082754133 +0000 UTC m=+0.209712711 container start b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:18:24 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[223183]: [NOTICE]   (223187) : New worker (223189) forked
Sep 30 21:18:24 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[223183]: [NOTICE]   (223187) : Loading success.
Sep 30 21:18:25 compute-1 nova_compute[192795]: 2025-09-30 21:18:25.391 2 DEBUG nova.compute.manager [req-34d763da-1559-46fa-bda4-6a7ef1f1814f req-bdf42373-f47e-4ec0-a06d-409b5ae7a636 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:25 compute-1 nova_compute[192795]: 2025-09-30 21:18:25.393 2 DEBUG oslo_concurrency.lockutils [req-34d763da-1559-46fa-bda4-6a7ef1f1814f req-bdf42373-f47e-4ec0-a06d-409b5ae7a636 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:25 compute-1 nova_compute[192795]: 2025-09-30 21:18:25.393 2 DEBUG oslo_concurrency.lockutils [req-34d763da-1559-46fa-bda4-6a7ef1f1814f req-bdf42373-f47e-4ec0-a06d-409b5ae7a636 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:25 compute-1 nova_compute[192795]: 2025-09-30 21:18:25.394 2 DEBUG oslo_concurrency.lockutils [req-34d763da-1559-46fa-bda4-6a7ef1f1814f req-bdf42373-f47e-4ec0-a06d-409b5ae7a636 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:25 compute-1 nova_compute[192795]: 2025-09-30 21:18:25.395 2 DEBUG nova.compute.manager [req-34d763da-1559-46fa-bda4-6a7ef1f1814f req-bdf42373-f47e-4ec0-a06d-409b5ae7a636 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:25 compute-1 nova_compute[192795]: 2025-09-30 21:18:25.395 2 WARNING nova.compute.manager [req-34d763da-1559-46fa-bda4-6a7ef1f1814f req-bdf42373-f47e-4ec0-a06d-409b5ae7a636 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received unexpected event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with vm_state active and task_state None.
Sep 30 21:18:25 compute-1 nova_compute[192795]: 2025-09-30 21:18:25.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:26 compute-1 nova_compute[192795]: 2025-09-30 21:18:26.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:27 compute-1 nova_compute[192795]: 2025-09-30 21:18:27.388 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267092.387126, 252d5457-8837-4aa6-b309-c3139e8db7ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:27 compute-1 nova_compute[192795]: 2025-09-30 21:18:27.389 2 INFO nova.compute.manager [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Stopped (Lifecycle Event)
Sep 30 21:18:27 compute-1 nova_compute[192795]: 2025-09-30 21:18:27.422 2 DEBUG nova.compute.manager [None req-6bbf4c32-abab-4163-93cd-2d372ebe540d - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:27 compute-1 nova_compute[192795]: 2025-09-30 21:18:27.769 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Check if temp file /var/lib/nova/instances/tmpgbk0jjex exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Sep 30 21:18:27 compute-1 nova_compute[192795]: 2025-09-30 21:18:27.770 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbk0jjex',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='800f4413-c978-4c4e-97b6-1ea1e45f9f17',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Sep 30 21:18:28 compute-1 nova_compute[192795]: 2025-09-30 21:18:28.947 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:29 compute-1 nova_compute[192795]: 2025-09-30 21:18:29.038 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:29 compute-1 nova_compute[192795]: 2025-09-30 21:18:29.041 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:29 compute-1 nova_compute[192795]: 2025-09-30 21:18:29.111 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:30 compute-1 nova_compute[192795]: 2025-09-30 21:18:30.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:31 compute-1 podman[223205]: 2025-09-30 21:18:31.286033896 +0000 UTC m=+0.110898479 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:18:31 compute-1 nova_compute[192795]: 2025-09-30 21:18:31.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:31 compute-1 sshd-session[223224]: Accepted publickey for nova from 192.168.122.100 port 50808 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:18:31 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:18:31 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:18:31 compute-1 systemd-logind[793]: New session 33 of user nova.
Sep 30 21:18:31 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:18:31 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:18:31 compute-1 systemd[223228]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:18:32 compute-1 systemd[223228]: Queued start job for default target Main User Target.
Sep 30 21:18:32 compute-1 systemd[223228]: Created slice User Application Slice.
Sep 30 21:18:32 compute-1 systemd[223228]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:18:32 compute-1 systemd[223228]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:18:32 compute-1 systemd[223228]: Reached target Paths.
Sep 30 21:18:32 compute-1 systemd[223228]: Reached target Timers.
Sep 30 21:18:32 compute-1 systemd[223228]: Starting D-Bus User Message Bus Socket...
Sep 30 21:18:32 compute-1 systemd[223228]: Starting Create User's Volatile Files and Directories...
Sep 30 21:18:32 compute-1 systemd[223228]: Finished Create User's Volatile Files and Directories.
Sep 30 21:18:32 compute-1 systemd[223228]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:18:32 compute-1 systemd[223228]: Reached target Sockets.
Sep 30 21:18:32 compute-1 systemd[223228]: Reached target Basic System.
Sep 30 21:18:32 compute-1 systemd[223228]: Reached target Main User Target.
Sep 30 21:18:32 compute-1 systemd[223228]: Startup finished in 182ms.
Sep 30 21:18:32 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:18:32 compute-1 systemd[1]: Started Session 33 of User nova.
Sep 30 21:18:32 compute-1 sshd-session[223224]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:18:32 compute-1 sshd-session[223243]: Received disconnect from 192.168.122.100 port 50808:11: disconnected by user
Sep 30 21:18:32 compute-1 sshd-session[223243]: Disconnected from user nova 192.168.122.100 port 50808
Sep 30 21:18:32 compute-1 sshd-session[223224]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:18:32 compute-1 systemd-logind[793]: Session 33 logged out. Waiting for processes to exit.
Sep 30 21:18:32 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Sep 30 21:18:32 compute-1 systemd-logind[793]: Removed session 33.
Sep 30 21:18:33 compute-1 podman[223245]: 2025-09-30 21:18:33.269767835 +0000 UTC m=+0.107001083 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.325 2 DEBUG nova.compute.manager [req-afa742e9-b180-4e51-bca0-c00f50ed227a req-20dfee1e-861d-401c-921a-b381acccfe3d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.326 2 DEBUG oslo_concurrency.lockutils [req-afa742e9-b180-4e51-bca0-c00f50ed227a req-20dfee1e-861d-401c-921a-b381acccfe3d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.326 2 DEBUG oslo_concurrency.lockutils [req-afa742e9-b180-4e51-bca0-c00f50ed227a req-20dfee1e-861d-401c-921a-b381acccfe3d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.326 2 DEBUG oslo_concurrency.lockutils [req-afa742e9-b180-4e51-bca0-c00f50ed227a req-20dfee1e-861d-401c-921a-b381acccfe3d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.327 2 DEBUG nova.compute.manager [req-afa742e9-b180-4e51-bca0-c00f50ed227a req-20dfee1e-861d-401c-921a-b381acccfe3d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.327 2 DEBUG nova.compute.manager [req-afa742e9-b180-4e51-bca0-c00f50ed227a req-20dfee1e-861d-401c-921a-b381acccfe3d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.635 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:33 compute-1 nova_compute[192795]: 2025-09-30 21:18:33.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.039 2 INFO nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Took 4.93 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.041 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.069 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbk0jjex',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='800f4413-c978-4c4e-97b6-1ea1e45f9f17',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2385dcbc-a0c6-42bb-b744-9f34ef959fad),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.100 2 DEBUG nova.objects.instance [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'migration_context' on Instance uuid 800f4413-c978-4c4e-97b6-1ea1e45f9f17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.103 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.105 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.106 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.123 2 DEBUG nova.virt.libvirt.vif [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:18:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-943144079',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-943144079',id=15,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:18:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-4e04clu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:18:23Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=800f4413-c978-4c4e-97b6-1ea1e45f9f17,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.124 2 DEBUG nova.network.os_vif_util [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.125 2 DEBUG nova.network.os_vif_util [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.127 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 21:18:34 compute-1 nova_compute[192795]:   <mac address="fa:16:3e:82:04:bd"/>
Sep 30 21:18:34 compute-1 nova_compute[192795]:   <model type="virtio"/>
Sep 30 21:18:34 compute-1 nova_compute[192795]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:18:34 compute-1 nova_compute[192795]:   <mtu size="1442"/>
Sep 30 21:18:34 compute-1 nova_compute[192795]:   <target dev="tapc6fd9c21-8c"/>
Sep 30 21:18:34 compute-1 nova_compute[192795]: </interface>
Sep 30 21:18:34 compute-1 nova_compute[192795]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.128 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.610 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.611 2 INFO nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 21:18:34 compute-1 nova_compute[192795]: 2025-09-30 21:18:34.709 2 INFO nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.213 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.213 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:18:35 compute-1 podman[223279]: 2025-09-30 21:18:35.233513146 +0000 UTC m=+0.067469440 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:18:35 compute-1 ovn_controller[94902]: 2025-09-30T21:18:35Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:04:bd 10.100.0.4
Sep 30 21:18:35 compute-1 ovn_controller[94902]: 2025-09-30T21:18:35Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:04:bd 10.100.0.4
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.554 2 DEBUG nova.compute.manager [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.555 2 DEBUG oslo_concurrency.lockutils [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.555 2 DEBUG oslo_concurrency.lockutils [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.556 2 DEBUG oslo_concurrency.lockutils [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.556 2 DEBUG nova.compute.manager [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.557 2 WARNING nova.compute.manager [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received unexpected event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with vm_state active and task_state migrating.
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.557 2 DEBUG nova.compute.manager [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-changed-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.557 2 DEBUG nova.compute.manager [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Refreshing instance network info cache due to event network-changed-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.558 2 DEBUG oslo_concurrency.lockutils [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.559 2 DEBUG oslo_concurrency.lockutils [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.559 2 DEBUG nova.network.neutron [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Refreshing network info cache for port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.759 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.761 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:18:35 compute-1 nova_compute[192795]: 2025-09-30 21:18:35.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.266 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.267 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.746 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267116.7461886, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.747 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Paused (Lifecycle Event)
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.772 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.778 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.800 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.800 2 DEBUG nova.virt.libvirt.migration [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.803 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 21:18:36 compute-1 kernel: tapc6fd9c21-8c (unregistering): left promiscuous mode
Sep 30 21:18:36 compute-1 NetworkManager[51724]: <info>  [1759267116.9130] device (tapc6fd9c21-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:18:36 compute-1 ovn_controller[94902]: 2025-09-30T21:18:36Z|00114|binding|INFO|Releasing lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 from this chassis (sb_readonly=0)
Sep 30 21:18:36 compute-1 ovn_controller[94902]: 2025-09-30T21:18:36Z|00115|binding|INFO|Setting lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 down in Southbound
Sep 30 21:18:36 compute-1 ovn_controller[94902]: 2025-09-30T21:18:36Z|00116|binding|INFO|Releasing lport 7ce813d5-5e22-4733-a15e-f9d5223072eb from this chassis (sb_readonly=0)
Sep 30 21:18:36 compute-1 ovn_controller[94902]: 2025-09-30T21:18:36Z|00117|binding|INFO|Setting lport 7ce813d5-5e22-4733-a15e-f9d5223072eb down in Southbound
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:36 compute-1 ovn_controller[94902]: 2025-09-30T21:18:36Z|00118|binding|INFO|Removing iface tapc6fd9c21-8c ovn-installed in OVS
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:36 compute-1 ovn_controller[94902]: 2025-09-30T21:18:36Z|00119|binding|INFO|Releasing lport d9318b69-dfe5-4e14-b762-51aa441e80cd from this chassis (sb_readonly=0)
Sep 30 21:18:36 compute-1 ovn_controller[94902]: 2025-09-30T21:18:36Z|00120|binding|INFO|Releasing lport b21a7164-770c-4265-ad15-a3e058ec1a56 from this chassis (sb_readonly=0)
Sep 30 21:18:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:36.937 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:ce:7d 19.80.0.205'], port_security=['fa:16:3e:dd:ce:7d 19.80.0.205'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-521832462', 'neutron:cidrs': '19.80.0.205/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-383de3b7-a202-49cd-b129-b065ac294878', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-521832462', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d890b95b-932f-4c74-902c-ed705814144b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ce813d5-5e22-4733-a15e-f9d5223072eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:36.942 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:04:bd 10.100.0.4'], port_security=['fa:16:3e:82:04:bd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3b817c7f-1137-4e8f-8263-8c5e6eddafa4'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-280178662', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-280178662', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:36.944 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 7ce813d5-5e22-4733-a15e-f9d5223072eb in datapath 383de3b7-a202-49cd-b129-b065ac294878 unbound from our chassis
Sep 30 21:18:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:36.948 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 383de3b7-a202-49cd-b129-b065ac294878, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:36.950 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[381680cf-f7c4-4882-9d5b-e8b0666c43fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:36.952 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 namespace which is not needed anymore
Sep 30 21:18:36 compute-1 nova_compute[192795]: 2025-09-30 21:18:36.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Sep 30 21:18:37 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000f.scope: Consumed 14.046s CPU time.
Sep 30 21:18:37 compute-1 systemd-machined[152783]: Machine qemu-9-instance-0000000f terminated.
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[223087]: [NOTICE]   (223103) : haproxy version is 2.8.14-c23fe91
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[223087]: [NOTICE]   (223103) : path to executable is /usr/sbin/haproxy
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[223087]: [WARNING]  (223103) : Exiting Master process...
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[223087]: [ALERT]    (223103) : Current worker (223109) exited with code 143 (Terminated)
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[223087]: [WARNING]  (223103) : All workers exited. Exiting... (0)
Sep 30 21:18:37 compute-1 systemd[1]: libpod-2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa.scope: Deactivated successfully.
Sep 30 21:18:37 compute-1 podman[223347]: 2025-09-30 21:18:37.166001254 +0000 UTC m=+0.078744562 container died 2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.175 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.175 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.175 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.196 2 DEBUG nova.compute.manager [req-b7ed0fba-d909-4e42-a272-7cfde2f05f89 req-846e58b5-8be5-43bb-9f66-842c99a6753c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.196 2 DEBUG oslo_concurrency.lockutils [req-b7ed0fba-d909-4e42-a272-7cfde2f05f89 req-846e58b5-8be5-43bb-9f66-842c99a6753c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.197 2 DEBUG oslo_concurrency.lockutils [req-b7ed0fba-d909-4e42-a272-7cfde2f05f89 req-846e58b5-8be5-43bb-9f66-842c99a6753c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.197 2 DEBUG oslo_concurrency.lockutils [req-b7ed0fba-d909-4e42-a272-7cfde2f05f89 req-846e58b5-8be5-43bb-9f66-842c99a6753c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.197 2 DEBUG nova.compute.manager [req-b7ed0fba-d909-4e42-a272-7cfde2f05f89 req-846e58b5-8be5-43bb-9f66-842c99a6753c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.197 2 DEBUG nova.compute.manager [req-b7ed0fba-d909-4e42-a272-7cfde2f05f89 req-846e58b5-8be5-43bb-9f66-842c99a6753c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:37 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa-userdata-shm.mount: Deactivated successfully.
Sep 30 21:18:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-2fe63c5a2dfb3c7af3fb0d2f408dc1e151e66cb30e3180b633053acad8ddffb8-merged.mount: Deactivated successfully.
Sep 30 21:18:37 compute-1 podman[223347]: 2025-09-30 21:18:37.234018256 +0000 UTC m=+0.146761555 container cleanup 2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:18:37 compute-1 systemd[1]: libpod-conmon-2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa.scope: Deactivated successfully.
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.303 2 DEBUG nova.virt.libvirt.guest [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '800f4413-c978-4c4e-97b6-1ea1e45f9f17' (instance-0000000f) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.304 2 INFO nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Migration operation has completed
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.304 2 INFO nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] _post_live_migration() is started..
Sep 30 21:18:37 compute-1 podman[223395]: 2025-09-30 21:18:37.314410973 +0000 UTC m=+0.052650290 container remove 2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.321 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[48840614-7380-4c28-a8d2-1d837ddfea16]: (4, ('Tue Sep 30 09:18:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 (2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa)\n2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa\nTue Sep 30 09:18:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 (2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa)\n2d90143c0016c811c8c55ac579b2ca54a42bcbcc9d9a83722c661bfbc9a7c7fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.323 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e4491fb0-d93c-46b5-ab4b-05c0e3ec5955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.325 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap383de3b7-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 kernel: tap383de3b7-a0: left promiscuous mode
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.351 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ebeb4c0-e7dc-494e-b29d-d6080f07c882]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.392 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c288f61e-1d35-4df2-a563-b1a35d11b645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.394 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ac1be1-ea52-4b1a-857e-f6cc8c904bb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.423 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ab20f949-c2b4-41c7-b692-3d088dcad050]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379773, 'reachable_time': 16959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223413, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.426 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.427 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[f63613b1-1cad-4776-b08c-6f10e3c703e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.428 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:18:37 compute-1 systemd[1]: run-netns-ovnmeta\x2d383de3b7\x2da202\x2d49cd\x2db129\x2db065ac294878.mount: Deactivated successfully.
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.430 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 934fff90-5446-41f1-a5ad-d2568cb337b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.431 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ffb869be-ebe4-4e20-be15-04f088eaa7c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.432 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace which is not needed anymore
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[223183]: [NOTICE]   (223187) : haproxy version is 2.8.14-c23fe91
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[223183]: [NOTICE]   (223187) : path to executable is /usr/sbin/haproxy
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[223183]: [WARNING]  (223187) : Exiting Master process...
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[223183]: [ALERT]    (223187) : Current worker (223189) exited with code 143 (Terminated)
Sep 30 21:18:37 compute-1 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[223183]: [WARNING]  (223187) : All workers exited. Exiting... (0)
Sep 30 21:18:37 compute-1 systemd[1]: libpod-b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615.scope: Deactivated successfully.
Sep 30 21:18:37 compute-1 podman[223431]: 2025-09-30 21:18:37.651187947 +0000 UTC m=+0.071855717 container died b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:18:37 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615-userdata-shm.mount: Deactivated successfully.
Sep 30 21:18:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-1f3f866fbd47f818615f665dd4595141b52cfa49b059ff72499baf961388f9cb-merged.mount: Deactivated successfully.
Sep 30 21:18:37 compute-1 podman[223431]: 2025-09-30 21:18:37.693019274 +0000 UTC m=+0.113687024 container cleanup b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:18:37 compute-1 systemd[1]: libpod-conmon-b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615.scope: Deactivated successfully.
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.768 2 DEBUG nova.network.neutron [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updated VIF entry in instance network info cache for port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.769 2 DEBUG nova.network.neutron [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating instance_info_cache with network_info: [{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:37 compute-1 podman[223460]: 2025-09-30 21:18:37.77638267 +0000 UTC m=+0.049786112 container remove b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.787 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7111c208-c710-46f1-aef8-cd960abc2fd8]: (4, ('Tue Sep 30 09:18:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615)\nb45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615\nTue Sep 30 09:18:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (b45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615)\nb45144fbae23fbd8f95e8dd81dc2491e7d74839e8f9e032a0d779358e3926615\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.789 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[86b013cf-768e-4ed6-9be4-0fd2e27345c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.789 2 DEBUG oslo_concurrency.lockutils [req-5d8b4f40-0907-4be0-8939-cd879d40bf5c req-d801d605-fa76-4fc1-b8d6-11f46b116baa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.790 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 kernel: tap934fff90-50: left promiscuous mode
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 nova_compute[192795]: 2025-09-30 21:18:37.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.825 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9635db-9ab3-4518-8cd0-cef21d1133f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.854 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5200684c-9055-4840-ae60-7ef1e9e9989c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.855 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[404c5307-d238-497e-a4a5-7fdebbaf8dfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.885 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[490fb457-5a16-4ea1-b736-8134bc25f071]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379879, 'reachable_time': 17735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223478, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.887 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:18:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:37.888 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[185e165a-208f-427a-8f0f-f67067d81a2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.181 2 DEBUG nova.compute.manager [req-59b8560b-dccb-4d3a-8a93-ef8663411f08 req-5e56d063-89ef-4df6-937d-6758a3c349c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.182 2 DEBUG oslo_concurrency.lockutils [req-59b8560b-dccb-4d3a-8a93-ef8663411f08 req-5e56d063-89ef-4df6-937d-6758a3c349c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.182 2 DEBUG oslo_concurrency.lockutils [req-59b8560b-dccb-4d3a-8a93-ef8663411f08 req-5e56d063-89ef-4df6-937d-6758a3c349c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.183 2 DEBUG oslo_concurrency.lockutils [req-59b8560b-dccb-4d3a-8a93-ef8663411f08 req-5e56d063-89ef-4df6-937d-6758a3c349c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.183 2 DEBUG nova.compute.manager [req-59b8560b-dccb-4d3a-8a93-ef8663411f08 req-5e56d063-89ef-4df6-937d-6758a3c349c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.184 2 DEBUG nova.compute.manager [req-59b8560b-dccb-4d3a-8a93-ef8663411f08 req-5e56d063-89ef-4df6-937d-6758a3c349c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:38 compute-1 systemd[1]: run-netns-ovnmeta\x2d934fff90\x2d5446\x2d41f1\x2da5ad\x2dd2568cb337b1.mount: Deactivated successfully.
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.559 2 DEBUG nova.network.neutron [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Activated binding for port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.560 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.562 2 DEBUG nova.virt.libvirt.vif [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:18:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-943144079',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-943144079',id=15,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:18:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-4e04clu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_m
in_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:18:27Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=800f4413-c978-4c4e-97b6-1ea1e45f9f17,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.563 2 DEBUG nova.network.os_vif_util [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.564 2 DEBUG nova.network.os_vif_util [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.564 2 DEBUG os_vif [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.568 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6fd9c21-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.579 2 INFO os_vif [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c')
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.579 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.580 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.580 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.581 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.581 2 INFO nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Deleting instance files /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17_del
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.582 2 INFO nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Deletion of /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17_del complete
Sep 30 21:18:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:38.679 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:38.680 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:38.680 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.715 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.716 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.717 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:38 compute-1 nova_compute[192795]: 2025-09-30 21:18:38.717 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.006 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.007 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5732MB free_disk=73.4624137878418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.007 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.008 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.047 2 INFO nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating resource usage from migration 2385dcbc-a0c6-42bb-b744-9f34ef959fad
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.079 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Migration 2385dcbc-a0c6-42bb-b744-9f34ef959fad is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.079 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.080 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.121 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.141 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.171 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.171 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.454 2 DEBUG nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.455 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.456 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.456 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.457 2 DEBUG nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.457 2 WARNING nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received unexpected event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with vm_state active and task_state migrating.
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.458 2 DEBUG nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.458 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.459 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.459 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.460 2 DEBUG nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.460 2 WARNING nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received unexpected event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with vm_state active and task_state migrating.
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.461 2 DEBUG nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.462 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.462 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.463 2 DEBUG oslo_concurrency.lockutils [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.463 2 DEBUG nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:39 compute-1 nova_compute[192795]: 2025-09-30 21:18:39.464 2 WARNING nova.compute.manager [req-703ddf13-9086-4e90-a3b4-f4be1af84db3 req-d4b5de37-0243-4ed9-816e-c1f8ae2d1d35 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received unexpected event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with vm_state active and task_state migrating.
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.171 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.171 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.189 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.190 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.191 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.334 2 DEBUG nova.compute.manager [req-fe315b6b-1713-462e-95ff-8b862672530c req-e3555787-14fd-43a1-88ff-f28ed5df2a7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.335 2 DEBUG oslo_concurrency.lockutils [req-fe315b6b-1713-462e-95ff-8b862672530c req-e3555787-14fd-43a1-88ff-f28ed5df2a7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.335 2 DEBUG oslo_concurrency.lockutils [req-fe315b6b-1713-462e-95ff-8b862672530c req-e3555787-14fd-43a1-88ff-f28ed5df2a7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.336 2 DEBUG oslo_concurrency.lockutils [req-fe315b6b-1713-462e-95ff-8b862672530c req-e3555787-14fd-43a1-88ff-f28ed5df2a7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.336 2 DEBUG nova.compute.manager [req-fe315b6b-1713-462e-95ff-8b862672530c req-e3555787-14fd-43a1-88ff-f28ed5df2a7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:40 compute-1 nova_compute[192795]: 2025-09-30 21:18:40.337 2 WARNING nova.compute.manager [req-fe315b6b-1713-462e-95ff-8b862672530c req-e3555787-14fd-43a1-88ff-f28ed5df2a7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received unexpected event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with vm_state active and task_state migrating.
Sep 30 21:18:41 compute-1 nova_compute[192795]: 2025-09-30 21:18:41.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:42 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:18:42 compute-1 systemd[223228]: Activating special unit Exit the Session...
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped target Main User Target.
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped target Basic System.
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped target Paths.
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped target Sockets.
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped target Timers.
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:18:42 compute-1 systemd[223228]: Closed D-Bus User Message Bus Socket.
Sep 30 21:18:42 compute-1 systemd[223228]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:18:42 compute-1 systemd[223228]: Removed slice User Application Slice.
Sep 30 21:18:42 compute-1 systemd[223228]: Reached target Shutdown.
Sep 30 21:18:42 compute-1 systemd[223228]: Finished Exit the Session.
Sep 30 21:18:42 compute-1 systemd[223228]: Reached target Exit the Session.
Sep 30 21:18:42 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:18:42 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:18:42 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:18:42 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:18:42 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:18:42 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:18:42 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:18:43 compute-1 podman[223481]: 2025-09-30 21:18:43.273686408 +0000 UTC m=+0.107782535 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:18:43 compute-1 nova_compute[192795]: 2025-09-30 21:18:43.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.017 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.017 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.017 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.017 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.017 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:18:44.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.160 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.161 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.161 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.198 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.198 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.199 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.199 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.443 2 WARNING nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.444 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5754MB free_disk=73.46245193481445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.445 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.445 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.485 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Migration for instance 800f4413-c978-4c4e-97b6-1ea1e45f9f17 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.521 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.561 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Migration 2385dcbc-a0c6-42bb-b744-9f34ef959fad is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.562 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.562 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.623 2 DEBUG nova.compute.provider_tree [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.646 2 DEBUG nova.scheduler.client.report [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.697 2 DEBUG nova.compute.resource_tracker [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.697 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.718 2 INFO nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.807 2 INFO nova.scheduler.client.report [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Deleted allocation for migration 2385dcbc-a0c6-42bb-b744-9f34ef959fad
Sep 30 21:18:44 compute-1 nova_compute[192795]: 2025-09-30 21:18:44.808 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Sep 30 21:18:46 compute-1 nova_compute[192795]: 2025-09-30 21:18:46.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:46.788 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:46.789 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:18:46 compute-1 nova_compute[192795]: 2025-09-30 21:18:46.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:18:47.791 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:48 compute-1 nova_compute[192795]: 2025-09-30 21:18:48.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:50 compute-1 podman[223501]: 2025-09-30 21:18:50.2327119 +0000 UTC m=+0.077636143 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Sep 30 21:18:50 compute-1 podman[223502]: 2025-09-30 21:18:50.251750283 +0000 UTC m=+0.084222500 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:18:51 compute-1 podman[223544]: 2025-09-30 21:18:51.261433898 +0000 UTC m=+0.106976162 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Sep 30 21:18:51 compute-1 nova_compute[192795]: 2025-09-30 21:18:51.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-1 nova_compute[192795]: 2025-09-30 21:18:52.173 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267117.1727467, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:52 compute-1 nova_compute[192795]: 2025-09-30 21:18:52.174 2 INFO nova.compute.manager [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Stopped (Lifecycle Event)
Sep 30 21:18:52 compute-1 nova_compute[192795]: 2025-09-30 21:18:52.200 2 DEBUG nova.compute.manager [None req-6a73c3c2-126b-4fc4-97a4-e85d8a29f498 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:53 compute-1 podman[223572]: 2025-09-30 21:18:53.217474181 +0000 UTC m=+0.063250305 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.487 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.487 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.502 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.614 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.615 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.624 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.625 2 INFO nova.compute.claims [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.738 2 DEBUG nova.compute.provider_tree [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.751 2 DEBUG nova.scheduler.client.report [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.768 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.769 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.827 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.828 2 DEBUG nova.network.neutron [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.844 2 INFO nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:18:53 compute-1 nova_compute[192795]: 2025-09-30 21:18:53.867 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.003 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.004 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.005 2 INFO nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Creating image(s)
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.005 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "/var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.006 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "/var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.007 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "/var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.023 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.114 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.116 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.116 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.130 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.200 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.201 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.221 2 DEBUG nova.network.neutron [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.222 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.238 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.239 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.240 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.299 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.300 2 DEBUG nova.virt.disk.api [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Checking if we can resize image /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.301 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.361 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.362 2 DEBUG nova.virt.disk.api [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Cannot resize image /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.362 2 DEBUG nova.objects.instance [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lazy-loading 'migration_context' on Instance uuid ff783766-0cd8-4198-9a57-fe6f744ac4a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.374 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.375 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Ensure instance console log exists: /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.375 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.376 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.376 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.377 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.382 2 WARNING nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.386 2 DEBUG nova.virt.libvirt.host [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.387 2 DEBUG nova.virt.libvirt.host [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.389 2 DEBUG nova.virt.libvirt.host [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.390 2 DEBUG nova.virt.libvirt.host [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.392 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.392 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.393 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.393 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.393 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.393 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.393 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.394 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.394 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.394 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.394 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.394 2 DEBUG nova.virt.hardware [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.398 2 DEBUG nova.objects.instance [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lazy-loading 'pci_devices' on Instance uuid ff783766-0cd8-4198-9a57-fe6f744ac4a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.416 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <uuid>ff783766-0cd8-4198-9a57-fe6f744ac4a2</uuid>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <name>instance-00000011</name>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <nova:name>tempest-DeleteServersAdminTestJSON-server-1241421842</nova:name>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:18:54</nova:creationTime>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:18:54 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:18:54 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:18:54 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:18:54 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:18:54 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:18:54 compute-1 nova_compute[192795]:         <nova:user uuid="12e2106a93c04344a4634369773c1e19">tempest-DeleteServersAdminTestJSON-208471385-project-member</nova:user>
Sep 30 21:18:54 compute-1 nova_compute[192795]:         <nova:project uuid="ef5729988a5c4ca3994dd0fb6bacca6f">tempest-DeleteServersAdminTestJSON-208471385</nova:project>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <system>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <entry name="serial">ff783766-0cd8-4198-9a57-fe6f744ac4a2</entry>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <entry name="uuid">ff783766-0cd8-4198-9a57-fe6f744ac4a2</entry>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </system>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <os>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   </os>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <features>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   </features>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk.config"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/console.log" append="off"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <video>
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </video>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:18:54 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:18:54 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:18:54 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:18:54 compute-1 nova_compute[192795]: </domain>
Sep 30 21:18:54 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.477 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.478 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.479 2 INFO nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Using config drive
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.729 2 INFO nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Creating config drive at /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk.config
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.739 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpue2s2nkc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:54 compute-1 nova_compute[192795]: 2025-09-30 21:18:54.870 2 DEBUG oslo_concurrency.processutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpue2s2nkc" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:54 compute-1 systemd-machined[152783]: New machine qemu-10-instance-00000011.
Sep 30 21:18:54 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-00000011.
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.725 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267136.7242637, ff783766-0cd8-4198-9a57-fe6f744ac4a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.725 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] VM Resumed (Lifecycle Event)
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.729 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.730 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.735 2 INFO nova.virt.libvirt.driver [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Instance spawned successfully.
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.736 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.750 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.755 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.758 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.758 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.759 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.759 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.760 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.760 2 DEBUG nova.virt.libvirt.driver [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.783 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.783 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267136.725786, ff783766-0cd8-4198-9a57-fe6f744ac4a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.783 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] VM Started (Lifecycle Event)
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.822 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.827 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.853 2 INFO nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Took 2.85 seconds to spawn the instance on the hypervisor.
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.854 2 DEBUG nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.859 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:18:56 compute-1 nova_compute[192795]: 2025-09-30 21:18:56.987 2 INFO nova.compute.manager [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Took 3.43 seconds to build instance.
Sep 30 21:18:57 compute-1 nova_compute[192795]: 2025-09-30 21:18:57.011 2 DEBUG oslo_concurrency.lockutils [None req-a2078960-776c-4cc2-ae92-2d716c209c2a 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.826 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.827 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.827 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.828 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.829 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.844 2 INFO nova.compute.manager [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Terminating instance
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.857 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "refresh_cache-ff783766-0cd8-4198-9a57-fe6f744ac4a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.857 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquired lock "refresh_cache-ff783766-0cd8-4198-9a57-fe6f744ac4a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:18:58 compute-1 nova_compute[192795]: 2025-09-30 21:18:58.858 2 DEBUG nova.network.neutron [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.026 2 DEBUG nova.network.neutron [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.544 2 DEBUG nova.network.neutron [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.569 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Releasing lock "refresh_cache-ff783766-0cd8-4198-9a57-fe6f744ac4a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.570 2 DEBUG nova.compute.manager [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:18:59 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000011.scope: Deactivated successfully.
Sep 30 21:18:59 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000011.scope: Consumed 4.528s CPU time.
Sep 30 21:18:59 compute-1 systemd-machined[152783]: Machine qemu-10-instance-00000011 terminated.
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.840 2 INFO nova.virt.libvirt.driver [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Instance destroyed successfully.
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.841 2 DEBUG nova.objects.instance [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lazy-loading 'resources' on Instance uuid ff783766-0cd8-4198-9a57-fe6f744ac4a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.861 2 INFO nova.virt.libvirt.driver [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Deleting instance files /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2_del
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.862 2 INFO nova.virt.libvirt.driver [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Deletion of /var/lib/nova/instances/ff783766-0cd8-4198-9a57-fe6f744ac4a2_del complete
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.946 2 INFO nova.compute.manager [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.947 2 DEBUG oslo.service.loopingcall [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.948 2 DEBUG nova.compute.manager [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:18:59 compute-1 nova_compute[192795]: 2025-09-30 21:18:59.948 2 DEBUG nova.network.neutron [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.216 2 DEBUG nova.network.neutron [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.242 2 DEBUG nova.network.neutron [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.266 2 INFO nova.compute.manager [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Took 0.32 seconds to deallocate network for instance.
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.353 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.353 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.433 2 DEBUG nova.compute.provider_tree [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.451 2 DEBUG nova.scheduler.client.report [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.476 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.532 2 INFO nova.scheduler.client.report [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Deleted allocations for instance ff783766-0cd8-4198-9a57-fe6f744ac4a2
Sep 30 21:19:00 compute-1 nova_compute[192795]: 2025-09-30 21:19:00.645 2 DEBUG oslo_concurrency.lockutils [None req-4aea6d66-f7a2-48b5-9cd0-2b0275d60304 12e2106a93c04344a4634369773c1e19 ef5729988a5c4ca3994dd0fb6bacca6f - - default default] Lock "ff783766-0cd8-4198-9a57-fe6f744ac4a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:01 compute-1 nova_compute[192795]: 2025-09-30 21:19:01.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:02 compute-1 podman[223643]: 2025-09-30 21:19:02.238518371 +0000 UTC m=+0.072481804 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:19:03 compute-1 nova_compute[192795]: 2025-09-30 21:19:03.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:04 compute-1 podman[223662]: 2025-09-30 21:19:04.226196187 +0000 UTC m=+0.071637482 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Sep 30 21:19:06 compute-1 podman[223683]: 2025-09-30 21:19:06.220242544 +0000 UTC m=+0.062621789 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:19:06 compute-1 nova_compute[192795]: 2025-09-30 21:19:06.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:08 compute-1 nova_compute[192795]: 2025-09-30 21:19:08.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.014 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "7987f03a-80f7-420b-ae35-d3256dd9c382" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.014 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "7987f03a-80f7-420b-ae35-d3256dd9c382" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.039 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.200 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.201 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.209 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.209 2 INFO nova.compute.claims [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.398 2 DEBUG nova.compute.provider_tree [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.416 2 DEBUG nova.scheduler.client.report [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.443 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.444 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.511 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.512 2 DEBUG nova.network.neutron [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.545 2 INFO nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.592 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.733 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.734 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.734 2 INFO nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Creating image(s)
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.735 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "/var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.735 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "/var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.736 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "/var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.748 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.803 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.804 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.805 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.817 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.875 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.876 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.908 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.909 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:10 compute-1 nova_compute[192795]: 2025-09-30 21:19:10.910 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.001 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.002 2 DEBUG nova.virt.disk.api [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Checking if we can resize image /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.003 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.055 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.056 2 DEBUG nova.virt.disk.api [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Cannot resize image /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.057 2 DEBUG nova.objects.instance [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lazy-loading 'migration_context' on Instance uuid 7987f03a-80f7-420b-ae35-d3256dd9c382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.105 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.106 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Ensure instance console log exists: /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.106 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.107 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.107 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.255 2 DEBUG nova.network.neutron [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.256 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.257 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.261 2 WARNING nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.265 2 DEBUG nova.virt.libvirt.host [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.266 2 DEBUG nova.virt.libvirt.host [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.273 2 DEBUG nova.virt.libvirt.host [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.273 2 DEBUG nova.virt.libvirt.host [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.274 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.274 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.275 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.275 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.275 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.276 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.276 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.276 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.276 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.276 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.277 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.277 2 DEBUG nova.virt.hardware [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.281 2 DEBUG nova.objects.instance [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7987f03a-80f7-420b-ae35-d3256dd9c382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.303 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <uuid>7987f03a-80f7-420b-ae35-d3256dd9c382</uuid>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <name>instance-00000013</name>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerDiagnosticsTest-server-276869958</nova:name>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:19:11</nova:creationTime>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:19:11 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:19:11 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:19:11 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:19:11 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:19:11 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:19:11 compute-1 nova_compute[192795]:         <nova:user uuid="98f4997d636346abb1eca7276d3e56b4">tempest-ServerDiagnosticsTest-15936370-project-member</nova:user>
Sep 30 21:19:11 compute-1 nova_compute[192795]:         <nova:project uuid="5f732bb5a97f4af38b9aae51d05afcf4">tempest-ServerDiagnosticsTest-15936370</nova:project>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <system>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <entry name="serial">7987f03a-80f7-420b-ae35-d3256dd9c382</entry>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <entry name="uuid">7987f03a-80f7-420b-ae35-d3256dd9c382</entry>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </system>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <os>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   </os>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <features>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   </features>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk.config"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/console.log" append="off"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <video>
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </video>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:19:11 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:19:11 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:19:11 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:19:11 compute-1 nova_compute[192795]: </domain>
Sep 30 21:19:11 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.368 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.368 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.369 2 INFO nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Using config drive
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.780 2 INFO nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Creating config drive at /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk.config
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.786 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpckyt8xkg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:11 compute-1 nova_compute[192795]: 2025-09-30 21:19:11.935 2 DEBUG oslo_concurrency.processutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpckyt8xkg" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:12 compute-1 systemd-machined[152783]: New machine qemu-11-instance-00000013.
Sep 30 21:19:12 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-00000013.
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.945 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267152.944959, 7987f03a-80f7-420b-ae35-d3256dd9c382 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.946 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] VM Resumed (Lifecycle Event)
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.950 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.951 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.954 2 INFO nova.virt.libvirt.driver [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Instance spawned successfully.
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.955 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.978 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.987 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.991 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.991 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.992 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.993 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.994 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:12 compute-1 nova_compute[192795]: 2025-09-30 21:19:12.995 2 DEBUG nova.virt.libvirt.driver [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.009 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.010 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267152.9461308, 7987f03a-80f7-420b-ae35-d3256dd9c382 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.010 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] VM Started (Lifecycle Event)
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.031 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.039 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.064 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.103 2 INFO nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Took 2.37 seconds to spawn the instance on the hypervisor.
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.104 2 DEBUG nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:13 compute-1 sshd-session[223750]: Invalid user pos from 167.71.248.239 port 45814
Sep 30 21:19:13 compute-1 sshd-session[223750]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:19:13 compute-1 sshd-session[223750]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.350 2 INFO nova.compute.manager [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Took 3.24 seconds to build instance.
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.377 2 DEBUG oslo_concurrency.lockutils [None req-3061a3d4-97fa-47c7-9603-6bd5ecf03245 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "7987f03a-80f7-420b-ae35-d3256dd9c382" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:13 compute-1 nova_compute[192795]: 2025-09-30 21:19:13.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:14 compute-1 podman[223752]: 2025-09-30 21:19:14.235778452 +0000 UTC m=+0.073769189 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:19:14 compute-1 nova_compute[192795]: 2025-09-30 21:19:14.837 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267139.8362134, ff783766-0cd8-4198-9a57-fe6f744ac4a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:14 compute-1 nova_compute[192795]: 2025-09-30 21:19:14.840 2 INFO nova.compute.manager [-] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] VM Stopped (Lifecycle Event)
Sep 30 21:19:15 compute-1 sshd-session[223750]: Failed password for invalid user pos from 167.71.248.239 port 45814 ssh2
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.066 2 DEBUG nova.compute.manager [None req-188c3f49-7ed8-4f34-9d64-064d1ab51239 720432fc1ed846a9b553865361f656b4 a8c96db3ca5a4050ae7086e9ac7f330c - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.071 2 INFO nova.compute.manager [None req-188c3f49-7ed8-4f34-9d64-064d1ab51239 720432fc1ed846a9b553865361f656b4 a8c96db3ca5a4050ae7086e9ac7f330c - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Retrieving diagnostics
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.162 2 DEBUG nova.compute.manager [None req-8a31f784-682d-4150-bf24-129b67d00575 - - - - - -] [instance: ff783766-0cd8-4198-9a57-fe6f744ac4a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.332 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "7987f03a-80f7-420b-ae35-d3256dd9c382" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.333 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "7987f03a-80f7-420b-ae35-d3256dd9c382" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.334 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "7987f03a-80f7-420b-ae35-d3256dd9c382-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.334 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "7987f03a-80f7-420b-ae35-d3256dd9c382-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.335 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "7987f03a-80f7-420b-ae35-d3256dd9c382-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.352 2 INFO nova.compute.manager [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Terminating instance
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.371 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "refresh_cache-7987f03a-80f7-420b-ae35-d3256dd9c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.372 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquired lock "refresh_cache-7987f03a-80f7-420b-ae35-d3256dd9c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.372 2 DEBUG nova.network.neutron [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:19:16 compute-1 sshd-session[223750]: Connection closed by invalid user pos 167.71.248.239 port 45814 [preauth]
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:16 compute-1 nova_compute[192795]: 2025-09-30 21:19:16.754 2 DEBUG nova.network.neutron [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.261 2 DEBUG nova.network.neutron [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.279 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Releasing lock "refresh_cache-7987f03a-80f7-420b-ae35-d3256dd9c382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.281 2 DEBUG nova.compute.manager [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:19:17 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000013.scope: Deactivated successfully.
Sep 30 21:19:17 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000013.scope: Consumed 5.165s CPU time.
Sep 30 21:19:17 compute-1 systemd-machined[152783]: Machine qemu-11-instance-00000013 terminated.
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.551 2 INFO nova.virt.libvirt.driver [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Instance destroyed successfully.
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.552 2 DEBUG nova.objects.instance [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lazy-loading 'resources' on Instance uuid 7987f03a-80f7-420b-ae35-d3256dd9c382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.568 2 INFO nova.virt.libvirt.driver [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Deleting instance files /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382_del
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.569 2 INFO nova.virt.libvirt.driver [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Deletion of /var/lib/nova/instances/7987f03a-80f7-420b-ae35-d3256dd9c382_del complete
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.658 2 INFO nova.compute.manager [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.659 2 DEBUG oslo.service.loopingcall [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.660 2 DEBUG nova.compute.manager [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.660 2 DEBUG nova.network.neutron [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.931 2 DEBUG nova.network.neutron [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.950 2 DEBUG nova.network.neutron [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:19:17 compute-1 nova_compute[192795]: 2025-09-30 21:19:17.967 2 INFO nova.compute.manager [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Took 0.31 seconds to deallocate network for instance.
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.065 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.066 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.174 2 DEBUG nova.compute.provider_tree [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.206 2 DEBUG nova.scheduler.client.report [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.237 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.264 2 INFO nova.scheduler.client.report [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Deleted allocations for instance 7987f03a-80f7-420b-ae35-d3256dd9c382
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.365 2 DEBUG oslo_concurrency.lockutils [None req-36619068-3a2d-4c2b-b7ed-8e91ca7d0376 98f4997d636346abb1eca7276d3e56b4 5f732bb5a97f4af38b9aae51d05afcf4 - - default default] Lock "7987f03a-80f7-420b-ae35-d3256dd9c382" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:18 compute-1 nova_compute[192795]: 2025-09-30 21:19:18.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:21 compute-1 podman[223782]: 2025-09-30 21:19:21.257684607 +0000 UTC m=+0.088019952 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:19:21 compute-1 podman[223781]: 2025-09-30 21:19:21.262495707 +0000 UTC m=+0.097285242 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd)
Sep 30 21:19:21 compute-1 podman[223822]: 2025-09-30 21:19:21.475575718 +0000 UTC m=+0.167894145 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:19:21 compute-1 nova_compute[192795]: 2025-09-30 21:19:21.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:23 compute-1 nova_compute[192795]: 2025-09-30 21:19:23.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:24 compute-1 podman[223850]: 2025-09-30 21:19:24.257871063 +0000 UTC m=+0.089486052 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Sep 30 21:19:26 compute-1 nova_compute[192795]: 2025-09-30 21:19:26.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:27 compute-1 nova_compute[192795]: 2025-09-30 21:19:27.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:27 compute-1 nova_compute[192795]: 2025-09-30 21:19:27.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:19:28 compute-1 nova_compute[192795]: 2025-09-30 21:19:28.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:31 compute-1 ovn_controller[94902]: 2025-09-30T21:19:31Z|00121|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 21:19:31 compute-1 nova_compute[192795]: 2025-09-30 21:19:31.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.550 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267157.5476573, 7987f03a-80f7-420b-ae35-d3256dd9c382 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.551 2 INFO nova.compute.manager [-] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] VM Stopped (Lifecycle Event)
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.596 2 DEBUG nova.compute.manager [None req-e225a76b-dd41-4b27-871b-e946a77f7f13 - - - - - -] [instance: 7987f03a-80f7-420b-ae35-d3256dd9c382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.656 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "7738fa4c-805c-475a-92c2-219468edc691" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.657 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "7738fa4c-805c-475a-92c2-219468edc691" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.677 2 DEBUG nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.716 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.821 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.822 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.830 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:19:32 compute-1 nova_compute[192795]: 2025-09-30 21:19:32.831 2 INFO nova.compute.claims [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.145 2 DEBUG nova.compute.provider_tree [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.167 2 DEBUG nova.scheduler.client.report [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.209 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.210 2 DEBUG nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:19:33 compute-1 podman[223872]: 2025-09-30 21:19:33.234093575 +0000 UTC m=+0.066092161 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.278 2 DEBUG nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.313 2 INFO nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.339 2 DEBUG nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:33 compute-1 nova_compute[192795]: 2025-09-30 21:19:33.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:34 compute-1 nova_compute[192795]: 2025-09-30 21:19:34.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:34 compute-1 nova_compute[192795]: 2025-09-30 21:19:34.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.215 2 DEBUG nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.216 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.217 2 INFO nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Creating image(s)
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.218 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "/var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.218 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "/var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.219 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "/var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.244 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:35 compute-1 podman[223891]: 2025-09-30 21:19:35.273588587 +0000 UTC m=+0.106688115 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.274 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.324 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.325 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.326 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.350 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.430 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.431 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.483 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.484 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.485 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.549 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.551 2 DEBUG nova.virt.disk.api [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Checking if we can resize image /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.552 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.627 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.629 2 DEBUG nova.virt.disk.api [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Cannot resize image /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.629 2 DEBUG nova.objects.instance [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lazy-loading 'migration_context' on Instance uuid 7738fa4c-805c-475a-92c2-219468edc691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.667 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.668 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Ensure instance console log exists: /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.669 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.669 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.670 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.673 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.680 2 WARNING nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.695 2 DEBUG nova.virt.libvirt.host [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.696 2 DEBUG nova.virt.libvirt.host [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.699 2 DEBUG nova.virt.libvirt.host [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.700 2 DEBUG nova.virt.libvirt.host [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.702 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.703 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.704 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.705 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.705 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.706 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.706 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.707 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.708 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.708 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.709 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.709 2 DEBUG nova.virt.hardware [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.718 2 DEBUG nova.objects.instance [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7738fa4c-805c-475a-92c2-219468edc691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.744 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <uuid>7738fa4c-805c-475a-92c2-219468edc691</uuid>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <name>instance-00000014</name>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerDiagnosticsV248Test-server-1658679336</nova:name>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:19:35</nova:creationTime>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:19:35 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:19:35 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:19:35 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:19:35 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:19:35 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:19:35 compute-1 nova_compute[192795]:         <nova:user uuid="b17b6801df2c43bc95b83132dc22d974">tempest-ServerDiagnosticsV248Test-1769896207-project-member</nova:user>
Sep 30 21:19:35 compute-1 nova_compute[192795]:         <nova:project uuid="32b6ff8e31dd4de0864d848bd8017886">tempest-ServerDiagnosticsV248Test-1769896207</nova:project>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <system>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <entry name="serial">7738fa4c-805c-475a-92c2-219468edc691</entry>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <entry name="uuid">7738fa4c-805c-475a-92c2-219468edc691</entry>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </system>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <os>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   </os>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <features>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   </features>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk.config"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/console.log" append="off"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <video>
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </video>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:19:35 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:19:35 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:19:35 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:19:35 compute-1 nova_compute[192795]: </domain>
Sep 30 21:19:35 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.816 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.816 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:19:35 compute-1 nova_compute[192795]: 2025-09-30 21:19:35.817 2 INFO nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Using config drive
Sep 30 21:19:36 compute-1 nova_compute[192795]: 2025-09-30 21:19:36.347 2 INFO nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Creating config drive at /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk.config
Sep 30 21:19:36 compute-1 nova_compute[192795]: 2025-09-30 21:19:36.355 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp53hpy0pp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:36 compute-1 nova_compute[192795]: 2025-09-30 21:19:36.491 2 DEBUG oslo_concurrency.processutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp53hpy0pp" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:36 compute-1 systemd-machined[152783]: New machine qemu-12-instance-00000014.
Sep 30 21:19:36 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-00000014.
Sep 30 21:19:36 compute-1 nova_compute[192795]: 2025-09-30 21:19:36.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:36 compute-1 podman[223939]: 2025-09-30 21:19:36.695855229 +0000 UTC m=+0.099613125 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.351 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267177.3506634, 7738fa4c-805c-475a-92c2-219468edc691 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.352 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] VM Resumed (Lifecycle Event)
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.357 2 DEBUG nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.358 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.364 2 INFO nova.virt.libvirt.driver [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Instance spawned successfully.
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.365 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.391 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.396 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.409 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.410 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.410 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.415 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.416 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.417 2 DEBUG nova.virt.libvirt.driver [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.461 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.462 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267177.3530664, 7738fa4c-805c-475a-92c2-219468edc691 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.462 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] VM Started (Lifecycle Event)
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.493 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.498 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.517 2 INFO nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Took 2.30 seconds to spawn the instance on the hypervisor.
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.517 2 DEBUG nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.542 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.642 2 INFO nova.compute.manager [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Took 4.88 seconds to build instance.
Sep 30 21:19:37 compute-1 nova_compute[192795]: 2025-09-30 21:19:37.666 2 DEBUG oslo_concurrency.lockutils [None req-cea5c925-8b7f-409e-b59e-fede49a2d433 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "7738fa4c-805c-475a-92c2-219468edc691" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:38 compute-1 nova_compute[192795]: 2025-09-30 21:19:38.279 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:38 compute-1 nova_compute[192795]: 2025-09-30 21:19:38.280 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:38 compute-1 nova_compute[192795]: 2025-09-30 21:19:38.280 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:38 compute-1 nova_compute[192795]: 2025-09-30 21:19:38.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:19:38.680 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:19:38.680 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:19:38.681 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:38 compute-1 nova_compute[192795]: 2025-09-30 21:19:38.914 2 DEBUG nova.compute.manager [None req-dcb3ea24-5316-4d2d-8af5-94c7b7c36dd4 7ec8a3417ec94f12884ca9ea29e18820 b8541d7587a342da980451042b5c27ac - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:38 compute-1 nova_compute[192795]: 2025-09-30 21:19:38.918 2 INFO nova.compute.manager [None req-dcb3ea24-5316-4d2d-8af5-94c7b7c36dd4 7ec8a3417ec94f12884ca9ea29e18820 b8541d7587a342da980451042b5c27ac - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Retrieving diagnostics
Sep 30 21:19:39 compute-1 nova_compute[192795]: 2025-09-30 21:19:39.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:39 compute-1 nova_compute[192795]: 2025-09-30 21:19:39.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:39 compute-1 nova_compute[192795]: 2025-09-30 21:19:39.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:19:39 compute-1 nova_compute[192795]: 2025-09-30 21:19:39.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:40 compute-1 nova_compute[192795]: 2025-09-30 21:19:40.745 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:40 compute-1 nova_compute[192795]: 2025-09-30 21:19:40.745 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:19:40 compute-1 nova_compute[192795]: 2025-09-30 21:19:40.746 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:19:41 compute-1 nova_compute[192795]: 2025-09-30 21:19:41.185 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-7738fa4c-805c-475a-92c2-219468edc691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:19:41 compute-1 nova_compute[192795]: 2025-09-30 21:19:41.186 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-7738fa4c-805c-475a-92c2-219468edc691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:19:41 compute-1 nova_compute[192795]: 2025-09-30 21:19:41.186 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:19:41 compute-1 nova_compute[192795]: 2025-09-30 21:19:41.187 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7738fa4c-805c-475a-92c2-219468edc691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:19:41 compute-1 nova_compute[192795]: 2025-09-30 21:19:41.492 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:19:41 compute-1 nova_compute[192795]: 2025-09-30 21:19:41.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.120 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.137 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-7738fa4c-805c-475a-92c2-219468edc691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.138 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.138 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.183 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.184 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.184 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.185 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.313 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.401 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.403 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.497 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.719 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.721 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5658MB free_disk=73.46171951293945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.722 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.722 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.889 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 7738fa4c-805c-475a-92c2-219468edc691 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.890 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:19:42 compute-1 nova_compute[192795]: 2025-09-30 21:19:42.890 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:19:43 compute-1 nova_compute[192795]: 2025-09-30 21:19:43.399 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:43 compute-1 nova_compute[192795]: 2025-09-30 21:19:43.427 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:43 compute-1 nova_compute[192795]: 2025-09-30 21:19:43.485 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:19:43 compute-1 nova_compute[192795]: 2025-09-30 21:19:43.486 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:43 compute-1 nova_compute[192795]: 2025-09-30 21:19:43.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:45 compute-1 podman[223989]: 2025-09-30 21:19:45.235212951 +0000 UTC m=+0.076136403 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid)
Sep 30 21:19:46 compute-1 nova_compute[192795]: 2025-09-30 21:19:46.429 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:46 compute-1 nova_compute[192795]: 2025-09-30 21:19:46.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:48 compute-1 nova_compute[192795]: 2025-09-30 21:19:48.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:19:49.231 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:19:49 compute-1 nova_compute[192795]: 2025-09-30 21:19:49.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:19:49.235 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:19:50 compute-1 nova_compute[192795]: 2025-09-30 21:19:50.688 2 DEBUG nova.compute.manager [None req-7ac8d86f-0650-4515-bf85-9e21bf225902 7ec8a3417ec94f12884ca9ea29e18820 b8541d7587a342da980451042b5c27ac - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:50 compute-1 nova_compute[192795]: 2025-09-30 21:19:50.694 2 INFO nova.compute.manager [None req-7ac8d86f-0650-4515-bf85-9e21bf225902 7ec8a3417ec94f12884ca9ea29e18820 b8541d7587a342da980451042b5c27ac - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Retrieving diagnostics
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.263 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "7738fa4c-805c-475a-92c2-219468edc691" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.264 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "7738fa4c-805c-475a-92c2-219468edc691" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.264 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "7738fa4c-805c-475a-92c2-219468edc691-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.264 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "7738fa4c-805c-475a-92c2-219468edc691-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.265 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "7738fa4c-805c-475a-92c2-219468edc691-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.283 2 INFO nova.compute.manager [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Terminating instance
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.292 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "refresh_cache-7738fa4c-805c-475a-92c2-219468edc691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.293 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquired lock "refresh_cache-7738fa4c-805c-475a-92c2-219468edc691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.293 2 DEBUG nova.network.neutron [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:51 compute-1 nova_compute[192795]: 2025-09-30 21:19:51.920 2 DEBUG nova.network.neutron [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:19:52 compute-1 podman[224026]: 2025-09-30 21:19:52.240956281 +0000 UTC m=+0.072639469 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:19:52 compute-1 podman[224024]: 2025-09-30 21:19:52.268544664 +0000 UTC m=+0.102121112 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Sep 30 21:19:52 compute-1 podman[224025]: 2025-09-30 21:19:52.312241691 +0000 UTC m=+0.144087373 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250923)
Sep 30 21:19:52 compute-1 nova_compute[192795]: 2025-09-30 21:19:52.699 2 DEBUG nova.network.neutron [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:19:52 compute-1 nova_compute[192795]: 2025-09-30 21:19:52.722 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Releasing lock "refresh_cache-7738fa4c-805c-475a-92c2-219468edc691" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:19:52 compute-1 nova_compute[192795]: 2025-09-30 21:19:52.723 2 DEBUG nova.compute.manager [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:19:52 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000014.scope: Deactivated successfully.
Sep 30 21:19:52 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000014.scope: Consumed 12.378s CPU time.
Sep 30 21:19:52 compute-1 systemd-machined[152783]: Machine qemu-12-instance-00000014 terminated.
Sep 30 21:19:52 compute-1 nova_compute[192795]: 2025-09-30 21:19:52.979 2 INFO nova.virt.libvirt.driver [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Instance destroyed successfully.
Sep 30 21:19:52 compute-1 nova_compute[192795]: 2025-09-30 21:19:52.980 2 DEBUG nova.objects.instance [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lazy-loading 'resources' on Instance uuid 7738fa4c-805c-475a-92c2-219468edc691 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.008 2 INFO nova.virt.libvirt.driver [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Deleting instance files /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691_del
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.008 2 INFO nova.virt.libvirt.driver [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Deletion of /var/lib/nova/instances/7738fa4c-805c-475a-92c2-219468edc691_del complete
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.125 2 INFO nova.compute.manager [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.125 2 DEBUG oslo.service.loopingcall [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.126 2 DEBUG nova.compute.manager [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.126 2 DEBUG nova.network.neutron [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.390 2 DEBUG nova.network.neutron [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.411 2 DEBUG nova.network.neutron [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.427 2 INFO nova.compute.manager [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Took 0.30 seconds to deallocate network for instance.
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.545 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.546 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.656 2 DEBUG nova.compute.provider_tree [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.675 2 DEBUG nova.scheduler.client.report [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.695 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.746 2 INFO nova.scheduler.client.report [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Deleted allocations for instance 7738fa4c-805c-475a-92c2-219468edc691
Sep 30 21:19:53 compute-1 nova_compute[192795]: 2025-09-30 21:19:53.858 2 DEBUG oslo_concurrency.lockutils [None req-cece30a3-9b0d-4701-867a-8a7dc012b6c4 b17b6801df2c43bc95b83132dc22d974 32b6ff8e31dd4de0864d848bd8017886 - - default default] Lock "7738fa4c-805c-475a-92c2-219468edc691" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:55 compute-1 podman[224102]: 2025-09-30 21:19:55.22158163 +0000 UTC m=+0.064935300 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:19:56 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 21:19:56 compute-1 nova_compute[192795]: 2025-09-30 21:19:56.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:19:57.238 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:19:58 compute-1 nova_compute[192795]: 2025-09-30 21:19:58.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:01 compute-1 nova_compute[192795]: 2025-09-30 21:20:01.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:03 compute-1 nova_compute[192795]: 2025-09-30 21:20:03.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:04 compute-1 podman[224123]: 2025-09-30 21:20:04.232541128 +0000 UTC m=+0.066635196 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:20:06 compute-1 podman[224143]: 2025-09-30 21:20:06.231873938 +0000 UTC m=+0.076155974 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.574 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.574 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.599 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.776 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.777 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.784 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.784 2 INFO nova.compute.claims [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.930 2 DEBUG nova.compute.provider_tree [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.953 2 DEBUG nova.scheduler.client.report [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.969 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:06 compute-1 nova_compute[192795]: 2025-09-30 21:20:06.970 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.022 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.023 2 DEBUG nova.network.neutron [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.047 2 INFO nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.065 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.202 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.204 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.204 2 INFO nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Creating image(s)
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.205 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "/var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.205 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "/var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.206 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "/var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.227 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:07 compute-1 podman[224164]: 2025-09-30 21:20:07.246868756 +0000 UTC m=+0.068529328 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.295 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.297 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.298 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.318 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.376 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.378 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.416 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.418 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.418 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.477 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.479 2 DEBUG nova.virt.disk.api [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Checking if we can resize image /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.480 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.544 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.546 2 DEBUG nova.virt.disk.api [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Cannot resize image /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.547 2 DEBUG nova.objects.instance [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 006cbd78-97fd-45ed-8b2d-3c41d589ba2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.588 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.588 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Ensure instance console log exists: /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.589 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.590 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.590 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.705 2 DEBUG nova.network.neutron [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.706 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.708 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.714 2 WARNING nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.722 2 DEBUG nova.virt.libvirt.host [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.723 2 DEBUG nova.virt.libvirt.host [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.727 2 DEBUG nova.virt.libvirt.host [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.728 2 DEBUG nova.virt.libvirt.host [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.730 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.731 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.732 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.732 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.733 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.733 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.734 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.734 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.735 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.735 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.736 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.736 2 DEBUG nova.virt.hardware [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.744 2 DEBUG nova.objects.instance [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 006cbd78-97fd-45ed-8b2d-3c41d589ba2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.760 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <uuid>006cbd78-97fd-45ed-8b2d-3c41d589ba2a</uuid>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <name>instance-00000017</name>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerDiagnosticsNegativeTest-server-1900493310</nova:name>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:20:07</nova:creationTime>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:20:07 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:20:07 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:20:07 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:20:07 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:20:07 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:20:07 compute-1 nova_compute[192795]:         <nova:user uuid="dd668e509eac477298d320ca0db0acd0">tempest-ServerDiagnosticsNegativeTest-1933776125-project-member</nova:user>
Sep 30 21:20:07 compute-1 nova_compute[192795]:         <nova:project uuid="c074d2c6daa24577bf28c7ba07aeb4d0">tempest-ServerDiagnosticsNegativeTest-1933776125</nova:project>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <system>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <entry name="serial">006cbd78-97fd-45ed-8b2d-3c41d589ba2a</entry>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <entry name="uuid">006cbd78-97fd-45ed-8b2d-3c41d589ba2a</entry>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </system>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <os>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   </os>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <features>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   </features>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk.config"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/console.log" append="off"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <video>
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </video>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:20:07 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:20:07 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:20:07 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:20:07 compute-1 nova_compute[192795]: </domain>
Sep 30 21:20:07 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.818 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.819 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.820 2 INFO nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Using config drive
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.980 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267192.9774473, 7738fa4c-805c-475a-92c2-219468edc691 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:07 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.981 2 INFO nova.compute.manager [-] [instance: 7738fa4c-805c-475a-92c2-219468edc691] VM Stopped (Lifecycle Event)
Sep 30 21:20:08 compute-1 nova_compute[192795]: 2025-09-30 21:20:07.999 2 DEBUG nova.compute.manager [None req-13f69d5d-7e65-4fa7-aa32-8b73c9a8d320 - - - - - -] [instance: 7738fa4c-805c-475a-92c2-219468edc691] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:08 compute-1 nova_compute[192795]: 2025-09-30 21:20:08.226 2 INFO nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Creating config drive at /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk.config
Sep 30 21:20:08 compute-1 nova_compute[192795]: 2025-09-30 21:20:08.232 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7ssz59c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:08 compute-1 nova_compute[192795]: 2025-09-30 21:20:08.361 2 DEBUG oslo_concurrency.processutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7ssz59c" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:08 compute-1 systemd-machined[152783]: New machine qemu-13-instance-00000017.
Sep 30 21:20:08 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-00000017.
Sep 30 21:20:08 compute-1 nova_compute[192795]: 2025-09-30 21:20:08.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.579 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267209.5787532, 006cbd78-97fd-45ed-8b2d-3c41d589ba2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.581 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] VM Resumed (Lifecycle Event)
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.584 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.585 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.588 2 INFO nova.virt.libvirt.driver [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Instance spawned successfully.
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.588 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.611 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.617 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.619 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.620 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.621 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.622 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.623 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.624 2 DEBUG nova.virt.libvirt.driver [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.648 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.649 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267209.583409, 006cbd78-97fd-45ed-8b2d-3c41d589ba2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.650 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] VM Started (Lifecycle Event)
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.677 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.682 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.714 2 INFO nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Took 2.51 seconds to spawn the instance on the hypervisor.
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.714 2 DEBUG nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.716 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.814 2 INFO nova.compute.manager [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Took 3.11 seconds to build instance.
Sep 30 21:20:09 compute-1 nova_compute[192795]: 2025-09-30 21:20:09.834 2 DEBUG oslo_concurrency.lockutils [None req-63f1d0ca-5f18-44a9-b93e-8ff650309b14 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.598 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.599 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.599 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.599 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.600 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.604 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.612 2 INFO nova.compute.manager [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Terminating instance
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.624 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "refresh_cache-006cbd78-97fd-45ed-8b2d-3c41d589ba2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.624 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquired lock "refresh_cache-006cbd78-97fd-45ed-8b2d-3c41d589ba2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.624 2 DEBUG nova.network.neutron [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.661 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid 006cbd78-97fd-45ed-8b2d-3c41d589ba2a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.662 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:11 compute-1 nova_compute[192795]: 2025-09-30 21:20:11.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.215 2 DEBUG nova.network.neutron [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.544 2 DEBUG nova.network.neutron [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.561 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Releasing lock "refresh_cache-006cbd78-97fd-45ed-8b2d-3c41d589ba2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.562 2 DEBUG nova.compute.manager [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:20:12 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000017.scope: Deactivated successfully.
Sep 30 21:20:12 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000017.scope: Consumed 4.064s CPU time.
Sep 30 21:20:12 compute-1 systemd-machined[152783]: Machine qemu-13-instance-00000017 terminated.
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.810 2 INFO nova.virt.libvirt.driver [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Instance destroyed successfully.
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.811 2 DEBUG nova.objects.instance [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lazy-loading 'resources' on Instance uuid 006cbd78-97fd-45ed-8b2d-3c41d589ba2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.842 2 INFO nova.virt.libvirt.driver [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Deleting instance files /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a_del
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.843 2 INFO nova.virt.libvirt.driver [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Deletion of /var/lib/nova/instances/006cbd78-97fd-45ed-8b2d-3c41d589ba2a_del complete
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.948 2 INFO nova.compute.manager [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Took 0.39 seconds to destroy the instance on the hypervisor.
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.949 2 DEBUG oslo.service.loopingcall [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.950 2 DEBUG nova.compute.manager [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:20:12 compute-1 nova_compute[192795]: 2025-09-30 21:20:12.950 2 DEBUG nova.network.neutron [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.340 2 DEBUG nova.network.neutron [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.364 2 DEBUG nova.network.neutron [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.380 2 INFO nova.compute.manager [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Took 0.43 seconds to deallocate network for instance.
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.469 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.470 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.565 2 DEBUG nova.compute.provider_tree [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.581 2 DEBUG nova.scheduler.client.report [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.621 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.662 2 INFO nova.scheduler.client.report [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Deleted allocations for instance 006cbd78-97fd-45ed-8b2d-3c41d589ba2a
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.769 2 DEBUG oslo_concurrency.lockutils [None req-c920a474-cfbb-4eb2-8e0c-a884b3826746 dd668e509eac477298d320ca0db0acd0 c074d2c6daa24577bf28c7ba07aeb4d0 - - default default] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.770 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.770 2 INFO nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] During sync_power_state the instance has a pending task (deleting). Skip.
Sep 30 21:20:13 compute-1 nova_compute[192795]: 2025-09-30 21:20:13.770 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "006cbd78-97fd-45ed-8b2d-3c41d589ba2a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:16 compute-1 podman[224239]: 2025-09-30 21:20:16.221102964 +0000 UTC m=+0.055189248 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:20:16 compute-1 nova_compute[192795]: 2025-09-30 21:20:16.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:18 compute-1 nova_compute[192795]: 2025-09-30 21:20:18.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:21 compute-1 nova_compute[192795]: 2025-09-30 21:20:21.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:23 compute-1 podman[224259]: 2025-09-30 21:20:23.271551569 +0000 UTC m=+0.100168821 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:20:23 compute-1 podman[224261]: 2025-09-30 21:20:23.286274175 +0000 UTC m=+0.103069808 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:20:23 compute-1 podman[224260]: 2025-09-30 21:20:23.335480381 +0000 UTC m=+0.168765478 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:20:23 compute-1 nova_compute[192795]: 2025-09-30 21:20:23.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:26 compute-1 podman[224324]: 2025-09-30 21:20:26.215947182 +0000 UTC m=+0.062601957 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:20:26 compute-1 nova_compute[192795]: 2025-09-30 21:20:26.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:27 compute-1 nova_compute[192795]: 2025-09-30 21:20:27.808 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267212.8072236, 006cbd78-97fd-45ed-8b2d-3c41d589ba2a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:27 compute-1 nova_compute[192795]: 2025-09-30 21:20:27.809 2 INFO nova.compute.manager [-] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] VM Stopped (Lifecycle Event)
Sep 30 21:20:27 compute-1 nova_compute[192795]: 2025-09-30 21:20:27.827 2 DEBUG nova.compute.manager [None req-b009c8dc-6e57-459f-a6fc-2cd80d0e8894 - - - - - -] [instance: 006cbd78-97fd-45ed-8b2d-3c41d589ba2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:28 compute-1 nova_compute[192795]: 2025-09-30 21:20:28.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:31 compute-1 nova_compute[192795]: 2025-09-30 21:20:31.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:33 compute-1 nova_compute[192795]: 2025-09-30 21:20:33.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:33 compute-1 nova_compute[192795]: 2025-09-30 21:20:33.746 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:35 compute-1 podman[224344]: 2025-09-30 21:20:35.237170072 +0000 UTC m=+0.068100492 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:20:35 compute-1 nova_compute[192795]: 2025-09-30 21:20:35.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:36 compute-1 nova_compute[192795]: 2025-09-30 21:20:36.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:37 compute-1 podman[224363]: 2025-09-30 21:20:37.222477812 +0000 UTC m=+0.065004689 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Sep 30 21:20:37 compute-1 nova_compute[192795]: 2025-09-30 21:20:37.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:37 compute-1 nova_compute[192795]: 2025-09-30 21:20:37.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:38 compute-1 podman[224385]: 2025-09-30 21:20:38.229370712 +0000 UTC m=+0.076290434 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:20:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:38.681 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:38.681 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:38.681 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:38 compute-1 nova_compute[192795]: 2025-09-30 21:20:38.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:38 compute-1 nova_compute[192795]: 2025-09-30 21:20:38.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:39 compute-1 nova_compute[192795]: 2025-09-30 21:20:39.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:39 compute-1 nova_compute[192795]: 2025-09-30 21:20:39.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.718 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.718 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.719 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.719 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.906 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.908 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5781MB free_disk=73.46240234375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.908 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.908 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.969 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.969 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:20:40 compute-1 nova_compute[192795]: 2025-09-30 21:20:40.986 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:41 compute-1 nova_compute[192795]: 2025-09-30 21:20:41.006 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:41 compute-1 nova_compute[192795]: 2025-09-30 21:20:41.041 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:20:41 compute-1 nova_compute[192795]: 2025-09-30 21:20:41.041 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:41 compute-1 nova_compute[192795]: 2025-09-30 21:20:41.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.042 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.721 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.899 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.900 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:42 compute-1 nova_compute[192795]: 2025-09-30 21:20:42.913 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.010 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.011 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.017 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.018 2 INFO nova.compute.claims [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.228 2 DEBUG nova.compute.provider_tree [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.249 2 DEBUG nova.scheduler.client.report [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.275 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.276 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.349 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.350 2 DEBUG nova.network.neutron [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.371 2 INFO nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.404 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.569 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.571 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.572 2 INFO nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Creating image(s)
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.573 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "/var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.573 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "/var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.575 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "/var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.599 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.641 2 DEBUG nova.policy [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c6ecca7d29aa4330a4f67feebfe7800b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '999d8c5597404e69b40454f7b06550ca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.662 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.663 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.663 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.674 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.742 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.742 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.784 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.786 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.786 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.850 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.853 2 DEBUG nova.virt.disk.api [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Checking if we can resize image /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.854 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.924 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.927 2 DEBUG nova.virt.disk.api [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Cannot resize image /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.927 2 DEBUG nova.objects.instance [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lazy-loading 'migration_context' on Instance uuid 2ef41935-4444-49ed-a38d-8e6b47bf533f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.954 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.954 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Ensure instance console log exists: /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.955 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.955 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:43 compute-1 nova_compute[192795]: 2025-09-30 21:20:43.956 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:20:44.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:20:44 compute-1 nova_compute[192795]: 2025-09-30 21:20:44.803 2 DEBUG nova.network.neutron [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Successfully created port: ab1b6629-5049-4538-a364-1f9bda4b642d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:20:45 compute-1 unix_chkpwd[224427]: password check failed for user (root)
Sep 30 21:20:45 compute-1 sshd-session[224425]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=114.66.3.37  user=root
Sep 30 21:20:46 compute-1 nova_compute[192795]: 2025-09-30 21:20:46.209 2 DEBUG nova.network.neutron [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Successfully updated port: ab1b6629-5049-4538-a364-1f9bda4b642d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:20:46 compute-1 nova_compute[192795]: 2025-09-30 21:20:46.225 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "refresh_cache-2ef41935-4444-49ed-a38d-8e6b47bf533f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:46 compute-1 nova_compute[192795]: 2025-09-30 21:20:46.226 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquired lock "refresh_cache-2ef41935-4444-49ed-a38d-8e6b47bf533f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:46 compute-1 nova_compute[192795]: 2025-09-30 21:20:46.226 2 DEBUG nova.network.neutron [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:20:46 compute-1 nova_compute[192795]: 2025-09-30 21:20:46.434 2 DEBUG nova.network.neutron [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:46 compute-1 nova_compute[192795]: 2025-09-30 21:20:46.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:47 compute-1 podman[224428]: 2025-09-30 21:20:47.240780653 +0000 UTC m=+0.074977050 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid)
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.351 2 DEBUG nova.network.neutron [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Updating instance_info_cache with network_info: [{"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.378 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Releasing lock "refresh_cache-2ef41935-4444-49ed-a38d-8e6b47bf533f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.379 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Instance network_info: |[{"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.380 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Start _get_guest_xml network_info=[{"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.385 2 WARNING nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.389 2 DEBUG nova.virt.libvirt.host [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.390 2 DEBUG nova.virt.libvirt.host [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.393 2 DEBUG nova.virt.libvirt.host [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.393 2 DEBUG nova.virt.libvirt.host [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.396 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.396 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.396 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.396 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.397 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.397 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.397 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.397 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.398 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.398 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.398 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.398 2 DEBUG nova.virt.hardware [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.403 2 DEBUG nova.virt.libvirt.vif [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:20:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1946324342',display_name='tempest-ImagesOneServerTestJSON-server-1946324342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1946324342',id=27,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='999d8c5597404e69b40454f7b06550ca',ramdisk_id='',reservation_id='r-63jd1dd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-337579414',owner_user_name='tempest-ImagesOneServerTestJ
SON-337579414-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:20:43Z,user_data=None,user_id='c6ecca7d29aa4330a4f67feebfe7800b',uuid=2ef41935-4444-49ed-a38d-8e6b47bf533f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.403 2 DEBUG nova.network.os_vif_util [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Converting VIF {"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.404 2 DEBUG nova.network.os_vif_util [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:43:4d,bridge_name='br-int',has_traffic_filtering=True,id=ab1b6629-5049-4538-a364-1f9bda4b642d,network=Network(04360fd2-50cb-46cf-95d6-c54f98289e71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1b6629-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.405 2 DEBUG nova.objects.instance [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ef41935-4444-49ed-a38d-8e6b47bf533f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.427 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <uuid>2ef41935-4444-49ed-a38d-8e6b47bf533f</uuid>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <name>instance-0000001b</name>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <nova:name>tempest-ImagesOneServerTestJSON-server-1946324342</nova:name>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:20:47</nova:creationTime>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:user uuid="c6ecca7d29aa4330a4f67feebfe7800b">tempest-ImagesOneServerTestJSON-337579414-project-member</nova:user>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:project uuid="999d8c5597404e69b40454f7b06550ca">tempest-ImagesOneServerTestJSON-337579414</nova:project>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         <nova:port uuid="ab1b6629-5049-4538-a364-1f9bda4b642d">
Sep 30 21:20:47 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <system>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <entry name="serial">2ef41935-4444-49ed-a38d-8e6b47bf533f</entry>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <entry name="uuid">2ef41935-4444-49ed-a38d-8e6b47bf533f</entry>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </system>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <os>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   </os>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <features>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   </features>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk.config"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:7d:43:4d"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <target dev="tapab1b6629-50"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/console.log" append="off"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <video>
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </video>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:20:47 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:20:47 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:20:47 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:20:47 compute-1 nova_compute[192795]: </domain>
Sep 30 21:20:47 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.429 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Preparing to wait for external event network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.429 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.429 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.429 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.430 2 DEBUG nova.virt.libvirt.vif [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:20:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1946324342',display_name='tempest-ImagesOneServerTestJSON-server-1946324342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1946324342',id=27,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='999d8c5597404e69b40454f7b06550ca',ramdisk_id='',reservation_id='r-63jd1dd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-337579414',owner_user_name='tempest-ImagesOneServerTestJSON-337579414-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:20:43Z,user_data=None,user_id='c6ecca7d29aa4330a4f67feebfe7800b',uuid=2ef41935-4444-49ed-a38d-8e6b47bf533f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.430 2 DEBUG nova.network.os_vif_util [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Converting VIF {"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.431 2 DEBUG nova.network.os_vif_util [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:43:4d,bridge_name='br-int',has_traffic_filtering=True,id=ab1b6629-5049-4538-a364-1f9bda4b642d,network=Network(04360fd2-50cb-46cf-95d6-c54f98289e71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1b6629-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.431 2 DEBUG os_vif [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:43:4d,bridge_name='br-int',has_traffic_filtering=True,id=ab1b6629-5049-4538-a364-1f9bda4b642d,network=Network(04360fd2-50cb-46cf-95d6-c54f98289e71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1b6629-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.432 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.433 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab1b6629-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab1b6629-50, col_values=(('external_ids', {'iface-id': 'ab1b6629-5049-4538-a364-1f9bda4b642d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:43:4d', 'vm-uuid': '2ef41935-4444-49ed-a38d-8e6b47bf533f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:47 compute-1 NetworkManager[51724]: <info>  [1759267247.4389] manager: (tapab1b6629-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.446 2 INFO os_vif [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:43:4d,bridge_name='br-int',has_traffic_filtering=True,id=ab1b6629-5049-4538-a364-1f9bda4b642d,network=Network(04360fd2-50cb-46cf-95d6-c54f98289e71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1b6629-50')
Sep 30 21:20:47 compute-1 sshd-session[224425]: Failed password for root from 114.66.3.37 port 52686 ssh2
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.487 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.487 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.487 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] No VIF found with MAC fa:16:3e:7d:43:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.488 2 INFO nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Using config drive
Sep 30 21:20:47 compute-1 sshd-session[224425]: Received disconnect from 114.66.3.37 port 52686:11:  [preauth]
Sep 30 21:20:47 compute-1 sshd-session[224425]: Disconnected from authenticating user root 114.66.3.37 port 52686 [preauth]
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.852 2 INFO nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Creating config drive at /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk.config
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.858 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7p6guhmt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.881 2 DEBUG nova.compute.manager [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received event network-changed-ab1b6629-5049-4538-a364-1f9bda4b642d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.882 2 DEBUG nova.compute.manager [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Refreshing instance network info cache due to event network-changed-ab1b6629-5049-4538-a364-1f9bda4b642d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.882 2 DEBUG oslo_concurrency.lockutils [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2ef41935-4444-49ed-a38d-8e6b47bf533f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.882 2 DEBUG oslo_concurrency.lockutils [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2ef41935-4444-49ed-a38d-8e6b47bf533f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.883 2 DEBUG nova.network.neutron [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Refreshing network info cache for port ab1b6629-5049-4538-a364-1f9bda4b642d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:20:47 compute-1 nova_compute[192795]: 2025-09-30 21:20:47.984 2 DEBUG oslo_concurrency.processutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7p6guhmt" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:48 compute-1 kernel: tapab1b6629-50: entered promiscuous mode
Sep 30 21:20:48 compute-1 NetworkManager[51724]: <info>  [1759267248.0699] manager: (tapab1b6629-50): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Sep 30 21:20:48 compute-1 ovn_controller[94902]: 2025-09-30T21:20:48Z|00122|binding|INFO|Claiming lport ab1b6629-5049-4538-a364-1f9bda4b642d for this chassis.
Sep 30 21:20:48 compute-1 ovn_controller[94902]: 2025-09-30T21:20:48Z|00123|binding|INFO|ab1b6629-5049-4538-a364-1f9bda4b642d: Claiming fa:16:3e:7d:43:4d 10.100.0.5
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.086 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:43:4d 10.100.0.5'], port_security=['fa:16:3e:7d:43:4d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ef41935-4444-49ed-a38d-8e6b47bf533f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04360fd2-50cb-46cf-95d6-c54f98289e71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '999d8c5597404e69b40454f7b06550ca', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed3f0e07-adb9-4920-8ba9-d9f73ecfee63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1ac9190-2989-4b07-9f73-9ac89a5050c0, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=ab1b6629-5049-4538-a364-1f9bda4b642d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.087 103861 INFO neutron.agent.ovn.metadata.agent [-] Port ab1b6629-5049-4538-a364-1f9bda4b642d in datapath 04360fd2-50cb-46cf-95d6-c54f98289e71 bound to our chassis
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.088 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04360fd2-50cb-46cf-95d6-c54f98289e71
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.101 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[df06d46f-feff-4956-9c9e-4f1dc62c2252]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.102 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap04360fd2-51 in ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.105 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap04360fd2-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.105 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a84c44-7850-40bf-ac51-0dc4e9c96d0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.105 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[57d878e3-0796-4bcd-82d0-a8a5bb409e09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 systemd-udevd[224468]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.127 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5cd577-b23c-48c5-bcd0-33e625356d5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 NetworkManager[51724]: <info>  [1759267248.1399] device (tapab1b6629-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:20:48 compute-1 NetworkManager[51724]: <info>  [1759267248.1408] device (tapab1b6629-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:20:48 compute-1 systemd-machined[152783]: New machine qemu-14-instance-0000001b.
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.155 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8126cc61-0bb1-46d5-9634-a8585d9dde48]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_controller[94902]: 2025-09-30T21:20:48Z|00124|binding|INFO|Setting lport ab1b6629-5049-4538-a364-1f9bda4b642d ovn-installed in OVS
Sep 30 21:20:48 compute-1 ovn_controller[94902]: 2025-09-30T21:20:48Z|00125|binding|INFO|Setting lport ab1b6629-5049-4538-a364-1f9bda4b642d up in Southbound
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-0000001b.
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.196 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e77c1ea0-ae79-4807-b21c-b8c0d65839b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 NetworkManager[51724]: <info>  [1759267248.2044] manager: (tap04360fd2-50): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.205 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[82b9185c-4e10-44eb-935f-44c62d25faf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.245 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[047e0f6a-6296-400e-9092-524db1e6f5a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.255 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ec94e23f-e89f-4878-9dfe-b8981b65be14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 NetworkManager[51724]: <info>  [1759267248.2947] device (tap04360fd2-50): carrier: link connected
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.301 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3334bab0-c5cc-44de-97d4-185861793ee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.320 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[04cc274d-b3dc-4c13-b05b-157448369886]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04360fd2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:4b:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394396, 'reachable_time': 34472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224500, 'error': None, 'target': 'ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.341 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b3811560-88bb-461a-9d4b-cbda0156e149]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed3:4b97'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 394396, 'tstamp': 394396}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224501, 'error': None, 'target': 'ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.360 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ad67aece-b8b5-4ed9-b7e4-a8a752f54cd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04360fd2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:4b:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394396, 'reachable_time': 34472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224502, 'error': None, 'target': 'ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.408 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[acbbb1ee-df08-4ee4-9637-0dd8d98a8d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.464 2 DEBUG nova.compute.manager [req-7ca06b50-1061-4d31-8511-e1462b5e4df5 req-e6b5ce36-f89b-4e26-aaf9-8ca806528581 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received event network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.465 2 DEBUG oslo_concurrency.lockutils [req-7ca06b50-1061-4d31-8511-e1462b5e4df5 req-e6b5ce36-f89b-4e26-aaf9-8ca806528581 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.466 2 DEBUG oslo_concurrency.lockutils [req-7ca06b50-1061-4d31-8511-e1462b5e4df5 req-e6b5ce36-f89b-4e26-aaf9-8ca806528581 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.466 2 DEBUG oslo_concurrency.lockutils [req-7ca06b50-1061-4d31-8511-e1462b5e4df5 req-e6b5ce36-f89b-4e26-aaf9-8ca806528581 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.466 2 DEBUG nova.compute.manager [req-7ca06b50-1061-4d31-8511-e1462b5e4df5 req-e6b5ce36-f89b-4e26-aaf9-8ca806528581 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Processing event network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.491 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3a413308-a2bf-4161-aaf1-9b75a0655be2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.493 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04360fd2-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.493 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.494 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04360fd2-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 kernel: tap04360fd2-50: entered promiscuous mode
Sep 30 21:20:48 compute-1 NetworkManager[51724]: <info>  [1759267248.4988] manager: (tap04360fd2-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.505 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04360fd2-50, col_values=(('external_ids', {'iface-id': '63f22b22-a3b6-42b7-9894-402bd7e692ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:48 compute-1 ovn_controller[94902]: 2025-09-30T21:20:48Z|00126|binding|INFO|Releasing lport 63f22b22-a3b6-42b7-9894-402bd7e692ea from this chassis (sb_readonly=0)
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 nova_compute[192795]: 2025-09-30 21:20:48.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.520 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/04360fd2-50cb-46cf-95d6-c54f98289e71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/04360fd2-50cb-46cf-95d6-c54f98289e71.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.521 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[adb28a15-42f6-47be-b1fb-22fc73e39729]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.522 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-04360fd2-50cb-46cf-95d6-c54f98289e71
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/04360fd2-50cb-46cf-95d6-c54f98289e71.pid.haproxy
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 04360fd2-50cb-46cf-95d6-c54f98289e71
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:20:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:48.523 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71', 'env', 'PROCESS_TAG=haproxy-04360fd2-50cb-46cf-95d6-c54f98289e71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/04360fd2-50cb-46cf-95d6-c54f98289e71.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:20:48 compute-1 podman[224537]: 2025-09-30 21:20:48.947944553 +0000 UTC m=+0.070301641 container create 3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:20:48 compute-1 systemd[1]: Started libpod-conmon-3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6.scope.
Sep 30 21:20:49 compute-1 podman[224537]: 2025-09-30 21:20:48.9078975 +0000 UTC m=+0.030254628 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:20:49 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:20:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20b793f2c6555d44b960f70d5538238bf7192e42a92f8c737a7abdcfee5815cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:20:49 compute-1 podman[224537]: 2025-09-30 21:20:49.054983358 +0000 UTC m=+0.177340486 container init 3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:20:49 compute-1 podman[224537]: 2025-09-30 21:20:49.06060635 +0000 UTC m=+0.182963448 container start 3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:20:49 compute-1 neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71[224552]: [NOTICE]   (224556) : New worker (224558) forked
Sep 30 21:20:49 compute-1 neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71[224552]: [NOTICE]   (224556) : Loading success.
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.162 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267249.1618538, 2ef41935-4444-49ed-a38d-8e6b47bf533f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.163 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] VM Started (Lifecycle Event)
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.165 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.169 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.172 2 INFO nova.virt.libvirt.driver [-] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Instance spawned successfully.
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.172 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.179 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.184 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.195 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.196 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.196 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.197 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.197 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.197 2 DEBUG nova.virt.libvirt.driver [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.263 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.263 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267249.1621108, 2ef41935-4444-49ed-a38d-8e6b47bf533f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.264 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] VM Paused (Lifecycle Event)
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.341 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.346 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267249.1681151, 2ef41935-4444-49ed-a38d-8e6b47bf533f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.346 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] VM Resumed (Lifecycle Event)
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.418 2 INFO nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Took 5.85 seconds to spawn the instance on the hypervisor.
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.419 2 DEBUG nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.423 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.438 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.478 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.814 2 INFO nova.compute.manager [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Took 6.84 seconds to build instance.
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.896 2 DEBUG nova.network.neutron [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Updated VIF entry in instance network info cache for port ab1b6629-5049-4538-a364-1f9bda4b642d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.897 2 DEBUG nova.network.neutron [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Updating instance_info_cache with network_info: [{"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.919 2 DEBUG oslo_concurrency.lockutils [None req-3e59119d-8601-494b-8af7-49bc099c5cb3 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:49 compute-1 nova_compute[192795]: 2025-09-30 21:20:49.959 2 DEBUG oslo_concurrency.lockutils [req-fcfd5ab4-94b3-4e76-a6c7-111e5a064aca req-8f1a2ba4-eb08-46ee-b481-6eb35e08f0a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2ef41935-4444-49ed-a38d-8e6b47bf533f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:50 compute-1 nova_compute[192795]: 2025-09-30 21:20:50.702 2 DEBUG nova.compute.manager [req-f38c47d9-7c03-42f2-8036-575559e4d6d1 req-9e4ded1a-88ab-41d2-95e1-049f2b04d285 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received event network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:20:50 compute-1 nova_compute[192795]: 2025-09-30 21:20:50.704 2 DEBUG oslo_concurrency.lockutils [req-f38c47d9-7c03-42f2-8036-575559e4d6d1 req-9e4ded1a-88ab-41d2-95e1-049f2b04d285 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:50 compute-1 nova_compute[192795]: 2025-09-30 21:20:50.704 2 DEBUG oslo_concurrency.lockutils [req-f38c47d9-7c03-42f2-8036-575559e4d6d1 req-9e4ded1a-88ab-41d2-95e1-049f2b04d285 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:50 compute-1 nova_compute[192795]: 2025-09-30 21:20:50.704 2 DEBUG oslo_concurrency.lockutils [req-f38c47d9-7c03-42f2-8036-575559e4d6d1 req-9e4ded1a-88ab-41d2-95e1-049f2b04d285 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:50 compute-1 nova_compute[192795]: 2025-09-30 21:20:50.705 2 DEBUG nova.compute.manager [req-f38c47d9-7c03-42f2-8036-575559e4d6d1 req-9e4ded1a-88ab-41d2-95e1-049f2b04d285 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] No waiting events found dispatching network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:20:50 compute-1 nova_compute[192795]: 2025-09-30 21:20:50.705 2 WARNING nova.compute.manager [req-f38c47d9-7c03-42f2-8036-575559e4d6d1 req-9e4ded1a-88ab-41d2-95e1-049f2b04d285 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received unexpected event network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d for instance with vm_state active and task_state None.
Sep 30 21:20:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:50.972 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:20:50 compute-1 nova_compute[192795]: 2025-09-30 21:20:50.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:50.977 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.071 2 DEBUG nova.compute.manager [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.151 2 INFO nova.compute.manager [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] instance snapshotting
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.481 2 INFO nova.virt.libvirt.driver [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Beginning live snapshot process
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:51 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.784 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.873 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.875 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.966 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:51 compute-1 nova_compute[192795]: 2025-09-30 21:20:51.978 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.037 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.038 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpx_6fxnt4/5e6de67c229d463d937095a0d0032fdd.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.081 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpx_6fxnt4/5e6de67c229d463d937095a0d0032fdd.delta 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.082 2 INFO nova.virt.libvirt.driver [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.141 2 DEBUG nova.virt.libvirt.guest [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.144 2 INFO nova.virt.libvirt.driver [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.194 2 DEBUG nova.privsep.utils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.195 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpx_6fxnt4/5e6de67c229d463d937095a0d0032fdd.delta /var/lib/nova/instances/snapshots/tmpx_6fxnt4/5e6de67c229d463d937095a0d0032fdd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.404 2 DEBUG oslo_concurrency.processutils [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpx_6fxnt4/5e6de67c229d463d937095a0d0032fdd.delta /var/lib/nova/instances/snapshots/tmpx_6fxnt4/5e6de67c229d463d937095a0d0032fdd" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.406 2 INFO nova.virt.libvirt.driver [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Snapshot extracted, beginning image upload
Sep 30 21:20:52 compute-1 nova_compute[192795]: 2025-09-30 21:20:52.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:54 compute-1 podman[224597]: 2025-09-30 21:20:54.236462031 +0000 UTC m=+0.064910846 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:20:54 compute-1 podman[224595]: 2025-09-30 21:20:54.252740501 +0000 UTC m=+0.086374586 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Sep 30 21:20:54 compute-1 podman[224596]: 2025-09-30 21:20:54.334540742 +0000 UTC m=+0.167602981 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:20:54 compute-1 nova_compute[192795]: 2025-09-30 21:20:54.958 2 INFO nova.virt.libvirt.driver [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Snapshot image upload complete
Sep 30 21:20:54 compute-1 nova_compute[192795]: 2025-09-30 21:20:54.959 2 INFO nova.compute.manager [None req-820dbf62-2247-4671-b5be-22778cefaab1 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Took 3.79 seconds to snapshot the instance on the hypervisor.
Sep 30 21:20:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:20:54.980 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:56 compute-1 nova_compute[192795]: 2025-09-30 21:20:56.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:57 compute-1 podman[224660]: 2025-09-30 21:20:57.255340942 +0000 UTC m=+0.082486491 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:20:57 compute-1 nova_compute[192795]: 2025-09-30 21:20:57.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:58 compute-1 nova_compute[192795]: 2025-09-30 21:20:58.844 2 DEBUG nova.compute.manager [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:58 compute-1 nova_compute[192795]: 2025-09-30 21:20:58.944 2 INFO nova.compute.manager [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] instance snapshotting
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.226 2 INFO nova.virt.libvirt.driver [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Beginning live snapshot process
Sep 30 21:20:59 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.421 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.517 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.520 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.617 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f/disk --force-share --output=json -f qcow2" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.644 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.730 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.732 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpv7rcqooz/3754f10be25f409fb526388a4c06ac87.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.783 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpv7rcqooz/3754f10be25f409fb526388a4c06ac87.delta 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.784 2 INFO nova.virt.libvirt.driver [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.849 2 DEBUG nova.virt.libvirt.guest [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.852 2 INFO nova.virt.libvirt.driver [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.895 2 DEBUG nova.privsep.utils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:20:59 compute-1 nova_compute[192795]: 2025-09-30 21:20:59.896 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpv7rcqooz/3754f10be25f409fb526388a4c06ac87.delta /var/lib/nova/instances/snapshots/tmpv7rcqooz/3754f10be25f409fb526388a4c06ac87 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:00 compute-1 nova_compute[192795]: 2025-09-30 21:21:00.084 2 DEBUG oslo_concurrency.processutils [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpv7rcqooz/3754f10be25f409fb526388a4c06ac87.delta /var/lib/nova/instances/snapshots/tmpv7rcqooz/3754f10be25f409fb526388a4c06ac87" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:00 compute-1 nova_compute[192795]: 2025-09-30 21:21:00.086 2 INFO nova.virt.libvirt.driver [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Snapshot extracted, beginning image upload
Sep 30 21:21:01 compute-1 nova_compute[192795]: 2025-09-30 21:21:01.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:02 compute-1 ovn_controller[94902]: 2025-09-30T21:21:02Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:43:4d 10.100.0.5
Sep 30 21:21:02 compute-1 ovn_controller[94902]: 2025-09-30T21:21:02Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:43:4d 10.100.0.5
Sep 30 21:21:02 compute-1 nova_compute[192795]: 2025-09-30 21:21:02.332 2 INFO nova.virt.libvirt.driver [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Snapshot image upload complete
Sep 30 21:21:02 compute-1 nova_compute[192795]: 2025-09-30 21:21:02.333 2 INFO nova.compute.manager [None req-4a96cd4b-0f19-4247-adfc-1a58d4a1c80a c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Took 3.37 seconds to snapshot the instance on the hypervisor.
Sep 30 21:21:02 compute-1 nova_compute[192795]: 2025-09-30 21:21:02.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:06 compute-1 podman[224719]: 2025-09-30 21:21:06.251722816 +0000 UTC m=+0.076509969 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.610 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.611 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.612 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.612 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.613 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.627 2 INFO nova.compute.manager [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Terminating instance
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.642 2 DEBUG nova.compute.manager [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:21:06 compute-1 kernel: tapab1b6629-50 (unregistering): left promiscuous mode
Sep 30 21:21:06 compute-1 NetworkManager[51724]: <info>  [1759267266.6752] device (tapab1b6629-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:21:06 compute-1 ovn_controller[94902]: 2025-09-30T21:21:06Z|00127|binding|INFO|Releasing lport ab1b6629-5049-4538-a364-1f9bda4b642d from this chassis (sb_readonly=0)
Sep 30 21:21:06 compute-1 ovn_controller[94902]: 2025-09-30T21:21:06Z|00128|binding|INFO|Setting lport ab1b6629-5049-4538-a364-1f9bda4b642d down in Southbound
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:06 compute-1 ovn_controller[94902]: 2025-09-30T21:21:06Z|00129|binding|INFO|Removing iface tapab1b6629-50 ovn-installed in OVS
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:06.700 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:43:4d 10.100.0.5'], port_security=['fa:16:3e:7d:43:4d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ef41935-4444-49ed-a38d-8e6b47bf533f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04360fd2-50cb-46cf-95d6-c54f98289e71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '999d8c5597404e69b40454f7b06550ca', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed3f0e07-adb9-4920-8ba9-d9f73ecfee63', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1ac9190-2989-4b07-9f73-9ac89a5050c0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=ab1b6629-5049-4538-a364-1f9bda4b642d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:06.702 103861 INFO neutron.agent.ovn.metadata.agent [-] Port ab1b6629-5049-4538-a364-1f9bda4b642d in datapath 04360fd2-50cb-46cf-95d6-c54f98289e71 unbound from our chassis
Sep 30 21:21:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:06.706 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04360fd2-50cb-46cf-95d6-c54f98289e71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:21:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:06.707 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[78bda292-8111-4736-940f-0e2e3fb9440f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:06.708 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71 namespace which is not needed anymore
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:06 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Sep 30 21:21:06 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001b.scope: Consumed 13.603s CPU time.
Sep 30 21:21:06 compute-1 systemd-machined[152783]: Machine qemu-14-instance-0000001b terminated.
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:06 compute-1 neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71[224552]: [NOTICE]   (224556) : haproxy version is 2.8.14-c23fe91
Sep 30 21:21:06 compute-1 neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71[224552]: [NOTICE]   (224556) : path to executable is /usr/sbin/haproxy
Sep 30 21:21:06 compute-1 neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71[224552]: [WARNING]  (224556) : Exiting Master process...
Sep 30 21:21:06 compute-1 neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71[224552]: [ALERT]    (224556) : Current worker (224558) exited with code 143 (Terminated)
Sep 30 21:21:06 compute-1 neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71[224552]: [WARNING]  (224556) : All workers exited. Exiting... (0)
Sep 30 21:21:06 compute-1 systemd[1]: libpod-3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6.scope: Deactivated successfully.
Sep 30 21:21:06 compute-1 podman[224764]: 2025-09-30 21:21:06.934680119 +0000 UTC m=+0.089330695 container died 3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.952 2 INFO nova.virt.libvirt.driver [-] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Instance destroyed successfully.
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.953 2 DEBUG nova.objects.instance [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lazy-loading 'resources' on Instance uuid 2ef41935-4444-49ed-a38d-8e6b47bf533f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.970 2 DEBUG nova.virt.libvirt.vif [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:20:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1946324342',display_name='tempest-ImagesOneServerTestJSON-server-1946324342',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1946324342',id=27,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:20:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='999d8c5597404e69b40454f7b06550ca',ramdisk_id='',reservation_id='r-63jd1dd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-337579414',owner_user_name='tempest-ImagesOneServerTestJSON-337579414-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:02Z,user_data=None,user_id='c6ecca7d29aa4330a4f67feebfe7800b',uuid=2ef41935-4444-49ed-a38d-8e6b47bf533f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.971 2 DEBUG nova.network.os_vif_util [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Converting VIF {"id": "ab1b6629-5049-4538-a364-1f9bda4b642d", "address": "fa:16:3e:7d:43:4d", "network": {"id": "04360fd2-50cb-46cf-95d6-c54f98289e71", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-977818009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "999d8c5597404e69b40454f7b06550ca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab1b6629-50", "ovs_interfaceid": "ab1b6629-5049-4538-a364-1f9bda4b642d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.972 2 DEBUG nova.network.os_vif_util [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:43:4d,bridge_name='br-int',has_traffic_filtering=True,id=ab1b6629-5049-4538-a364-1f9bda4b642d,network=Network(04360fd2-50cb-46cf-95d6-c54f98289e71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1b6629-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.973 2 DEBUG os_vif [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:43:4d,bridge_name='br-int',has_traffic_filtering=True,id=ab1b6629-5049-4538-a364-1f9bda4b642d,network=Network(04360fd2-50cb-46cf-95d6-c54f98289e71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1b6629-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:06 compute-1 nova_compute[192795]: 2025-09-30 21:21:06.975 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab1b6629-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6-userdata-shm.mount: Deactivated successfully.
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-20b793f2c6555d44b960f70d5538238bf7192e42a92f8c737a7abdcfee5815cf-merged.mount: Deactivated successfully.
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.030 2 INFO os_vif [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:43:4d,bridge_name='br-int',has_traffic_filtering=True,id=ab1b6629-5049-4538-a364-1f9bda4b642d,network=Network(04360fd2-50cb-46cf-95d6-c54f98289e71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab1b6629-50')
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.031 2 INFO nova.virt.libvirt.driver [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Deleting instance files /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f_del
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.032 2 INFO nova.virt.libvirt.driver [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Deletion of /var/lib/nova/instances/2ef41935-4444-49ed-a38d-8e6b47bf533f_del complete
Sep 30 21:21:07 compute-1 podman[224764]: 2025-09-30 21:21:07.035430832 +0000 UTC m=+0.190081438 container cleanup 3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 21:21:07 compute-1 systemd[1]: libpod-conmon-3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6.scope: Deactivated successfully.
Sep 30 21:21:07 compute-1 podman[224811]: 2025-09-30 21:21:07.120178224 +0000 UTC m=+0.057171737 container remove 3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.129 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dde27b1f-707c-43d8-85f6-0ea63b1fc65f]: (4, ('Tue Sep 30 09:21:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71 (3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6)\n3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6\nTue Sep 30 09:21:07 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71 (3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6)\n3eb0c9e312bdb7bebc80591096e90c279a0a0b97defd065f783cb958db7463d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.133 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[abafe222-b40d-4741-a3b3-bb513fb56c92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.134 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04360fd2-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:07 compute-1 kernel: tap04360fd2-50: left promiscuous mode
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.150 2 INFO nova.compute.manager [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Took 0.51 seconds to destroy the instance on the hypervisor.
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.151 2 DEBUG oslo.service.loopingcall [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.154 2 DEBUG nova.compute.manager [-] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.155 2 DEBUG nova.network.neutron [-] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.154 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d22f2e-ea3d-41e7-a25e-409a3d18b772]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.191 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0037c8fb-88cc-4c40-a8b8-44bc540e9456]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.197 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f1ea56-2e4a-43df-9085-99106feef7e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.223 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[24e05ba9-2049-4731-a6e4-61918496e78c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 394386, 'reachable_time': 33588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224826, 'error': None, 'target': 'ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.226 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-04360fd2-50cb-46cf-95d6-c54f98289e71 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:21:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:07.227 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[ce72801a-8451-4c8f-a8e0-3834cec54fb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:07 compute-1 systemd[1]: run-netns-ovnmeta\x2d04360fd2\x2d50cb\x2d46cf\x2d95d6\x2dc54f98289e71.mount: Deactivated successfully.
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.917 2 DEBUG nova.network.neutron [-] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.941 2 INFO nova.compute.manager [-] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Took 0.79 seconds to deallocate network for instance.
Sep 30 21:21:07 compute-1 nova_compute[192795]: 2025-09-30 21:21:07.998 2 DEBUG nova.compute.manager [req-3d4be007-f755-4945-b227-0e8cae26a013 req-a2904122-1627-445e-ba73-3f41083ccc0c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received event network-vif-deleted-ab1b6629-5049-4538-a364-1f9bda4b642d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.010 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.011 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.094 2 DEBUG nova.compute.provider_tree [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.110 2 DEBUG nova.scheduler.client.report [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.136 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.166 2 INFO nova.scheduler.client.report [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Deleted allocations for instance 2ef41935-4444-49ed-a38d-8e6b47bf533f
Sep 30 21:21:08 compute-1 podman[224827]: 2025-09-30 21:21:08.245607759 +0000 UTC m=+0.086958882 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, version=9.6)
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.262 2 DEBUG oslo_concurrency.lockutils [None req-a00e2039-77b0-4f09-83a7-86a7d99eb247 c6ecca7d29aa4330a4f67feebfe7800b 999d8c5597404e69b40454f7b06550ca - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:08 compute-1 podman[224848]: 2025-09-30 21:21:08.349854996 +0000 UTC m=+0.076010995 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.542 2 DEBUG nova.compute.manager [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received event network-vif-unplugged-ab1b6629-5049-4538-a364-1f9bda4b642d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.542 2 DEBUG oslo_concurrency.lockutils [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.543 2 DEBUG oslo_concurrency.lockutils [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.543 2 DEBUG oslo_concurrency.lockutils [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.544 2 DEBUG nova.compute.manager [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] No waiting events found dispatching network-vif-unplugged-ab1b6629-5049-4538-a364-1f9bda4b642d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.544 2 WARNING nova.compute.manager [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received unexpected event network-vif-unplugged-ab1b6629-5049-4538-a364-1f9bda4b642d for instance with vm_state deleted and task_state None.
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.544 2 DEBUG nova.compute.manager [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received event network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.545 2 DEBUG oslo_concurrency.lockutils [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.545 2 DEBUG oslo_concurrency.lockutils [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.546 2 DEBUG oslo_concurrency.lockutils [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2ef41935-4444-49ed-a38d-8e6b47bf533f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.546 2 DEBUG nova.compute.manager [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] No waiting events found dispatching network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:08 compute-1 nova_compute[192795]: 2025-09-30 21:21:08.546 2 WARNING nova.compute.manager [req-d9470f57-2d23-49aa-a8de-15ea4f856b88 req-64ec4da2-5f23-4629-9f60-6e68670e5b0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Received unexpected event network-vif-plugged-ab1b6629-5049-4538-a364-1f9bda4b642d for instance with vm_state deleted and task_state None.
Sep 30 21:21:10 compute-1 nova_compute[192795]: 2025-09-30 21:21:10.928 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:10 compute-1 nova_compute[192795]: 2025-09-30 21:21:10.929 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:10 compute-1 nova_compute[192795]: 2025-09-30 21:21:10.949 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.119 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.120 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.128 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.128 2 INFO nova.compute.claims [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.283 2 DEBUG nova.compute.provider_tree [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.318 2 DEBUG nova.scheduler.client.report [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.453 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.454 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.565 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.566 2 DEBUG nova.network.neutron [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.584 2 INFO nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.600 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.765 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.767 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.767 2 INFO nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Creating image(s)
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.768 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.769 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.770 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.794 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.889 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.891 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.892 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.918 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.982 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:11 compute-1 nova_compute[192795]: 2025-09-30 21:21:11.983 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.042 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.043 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.044 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.077 2 DEBUG nova.network.neutron [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.078 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.141 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.142 2 DEBUG nova.virt.disk.api [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Checking if we can resize image /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.143 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.239 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.241 2 DEBUG nova.virt.disk.api [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Cannot resize image /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.243 2 DEBUG nova.objects.instance [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'migration_context' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.258 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.258 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Ensure instance console log exists: /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.259 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.259 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.260 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.261 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.268 2 WARNING nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.273 2 DEBUG nova.virt.libvirt.host [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.274 2 DEBUG nova.virt.libvirt.host [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.279 2 DEBUG nova.virt.libvirt.host [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.279 2 DEBUG nova.virt.libvirt.host [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.281 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.281 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.282 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.282 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.282 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.282 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.283 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.283 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.284 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.284 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.284 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.285 2 DEBUG nova.virt.hardware [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.292 2 DEBUG nova.objects.instance [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'pci_devices' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.318 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <uuid>456a6c22-b801-4d95-aa63-be64cd8e4b53</uuid>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <name>instance-0000001e</name>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <nova:name>tempest-MigrationsAdminTest-server-936283323</nova:name>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:21:12</nova:creationTime>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:21:12 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:21:12 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:21:12 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:21:12 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:12 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:21:12 compute-1 nova_compute[192795]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:21:12 compute-1 nova_compute[192795]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <system>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <entry name="serial">456a6c22-b801-4d95-aa63-be64cd8e4b53</entry>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <entry name="uuid">456a6c22-b801-4d95-aa63-be64cd8e4b53</entry>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </system>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <os>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   </os>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <features>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   </features>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/console.log" append="off"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <video>
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </video>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:21:12 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:21:12 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:21:12 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:21:12 compute-1 nova_compute[192795]: </domain>
Sep 30 21:21:12 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.392 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.393 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.394 2 INFO nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Using config drive
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.628 2 INFO nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Creating config drive at /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.639 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7mgaxmu3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.786 2 DEBUG oslo_concurrency.processutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7mgaxmu3" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:12 compute-1 nova_compute[192795]: 2025-09-30 21:21:12.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:12 compute-1 systemd-machined[152783]: New machine qemu-15-instance-0000001e.
Sep 30 21:21:12 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-0000001e.
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.832 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.833 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.834 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267273.8301535, 456a6c22-b801-4d95-aa63-be64cd8e4b53 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.835 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] VM Resumed (Lifecycle Event)
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.845 2 INFO nova.virt.libvirt.driver [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance spawned successfully.
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.846 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.874 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.884 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.892 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.893 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.893 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.894 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.895 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.895 2 DEBUG nova.virt.libvirt.driver [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.927 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.928 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267273.8401856, 456a6c22-b801-4d95-aa63-be64cd8e4b53 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.928 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] VM Started (Lifecycle Event)
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.956 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.962 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.998 2 INFO nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Took 2.23 seconds to spawn the instance on the hypervisor.
Sep 30 21:21:13 compute-1 nova_compute[192795]: 2025-09-30 21:21:13.999 2 DEBUG nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:14 compute-1 nova_compute[192795]: 2025-09-30 21:21:14.001 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:14 compute-1 nova_compute[192795]: 2025-09-30 21:21:14.093 2 INFO nova.compute.manager [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Took 3.08 seconds to build instance.
Sep 30 21:21:14 compute-1 nova_compute[192795]: 2025-09-30 21:21:14.114 2 DEBUG oslo_concurrency.lockutils [None req-0200099b-3376-49eb-9e38-c9e984260f41 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.617 2 DEBUG nova.compute.manager [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.762 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.763 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.790 2 DEBUG nova.objects.instance [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'pci_requests' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.814 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.814 2 INFO nova.compute.claims [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.815 2 DEBUG nova.objects.instance [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'resources' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.826 2 DEBUG nova.objects.instance [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'pci_devices' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.875 2 INFO nova.compute.resource_tracker [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updating resource usage from migration f9a0f17b-d4d0-402b-be52-d0d84fecdb35
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.890 2 DEBUG nova.scheduler.client.report [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.908 2 DEBUG nova.scheduler.client.report [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.909 2 DEBUG nova.compute.provider_tree [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.926 2 DEBUG nova.scheduler.client.report [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:21:16 compute-1 nova_compute[192795]: 2025-09-30 21:21:16.953 2 DEBUG nova.scheduler.client.report [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.020 2 DEBUG nova.compute.provider_tree [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.036 2 DEBUG nova.scheduler.client.report [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.071 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.072 2 INFO nova.compute.manager [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Migrating
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.134 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.134 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.135 2 DEBUG nova.network.neutron [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.334 2 DEBUG nova.network.neutron [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.672 2 DEBUG nova.network.neutron [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.694 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.875 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:21:17 compute-1 nova_compute[192795]: 2025-09-30 21:21:17.880 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:21:18 compute-1 podman[224916]: 2025-09-30 21:21:18.244865544 +0000 UTC m=+0.082738228 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.641 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.642 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.662 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.764 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.765 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.776 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.776 2 INFO nova.compute.claims [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.975 2 DEBUG nova.compute.provider_tree [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:19 compute-1 nova_compute[192795]: 2025-09-30 21:21:19.992 2 DEBUG nova.scheduler.client.report [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.022 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.023 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.077 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.078 2 DEBUG nova.network.neutron [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.108 2 INFO nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.130 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.309 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.311 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.312 2 INFO nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Creating image(s)
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.313 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.313 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.314 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.327 2 DEBUG nova.policy [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c536ea061a32492a8c5e6bf941d1c9f3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47d2c796445c4dd3affc8594502f04be', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.330 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.392 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.394 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.394 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.405 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.464 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.465 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.505 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.506 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.506 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.567 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.570 2 DEBUG nova.virt.disk.api [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Checking if we can resize image /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.571 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.640 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.641 2 DEBUG nova.virt.disk.api [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Cannot resize image /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.642 2 DEBUG nova.objects.instance [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'migration_context' on Instance uuid 04fadc55-8de2-49b0-a4db-9cc05bd5d036 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.664 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.664 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Ensure instance console log exists: /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.665 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.666 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:20 compute-1 nova_compute[192795]: 2025-09-30 21:21:20.666 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:21 compute-1 nova_compute[192795]: 2025-09-30 21:21:21.296 2 DEBUG nova.network.neutron [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Successfully created port: a5a8cb9d-f903-4595-a2d2-d2bacf341918 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:21:21 compute-1 nova_compute[192795]: 2025-09-30 21:21:21.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:21 compute-1 nova_compute[192795]: 2025-09-30 21:21:21.949 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267266.9470332, 2ef41935-4444-49ed-a38d-8e6b47bf533f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:21 compute-1 nova_compute[192795]: 2025-09-30 21:21:21.949 2 INFO nova.compute.manager [-] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] VM Stopped (Lifecycle Event)
Sep 30 21:21:21 compute-1 nova_compute[192795]: 2025-09-30 21:21:21.997 2 DEBUG nova.compute.manager [None req-542b6e38-b4e3-415c-9817-6cb022069c32 - - - - - -] [instance: 2ef41935-4444-49ed-a38d-8e6b47bf533f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.321 2 DEBUG nova.network.neutron [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Successfully updated port: a5a8cb9d-f903-4595-a2d2-d2bacf341918 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.340 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.340 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.341 2 DEBUG nova.network.neutron [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.481 2 DEBUG nova.compute.manager [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.482 2 DEBUG nova.compute.manager [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing instance network info cache due to event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.482 2 DEBUG oslo_concurrency.lockutils [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:22 compute-1 nova_compute[192795]: 2025-09-30 21:21:22.573 2 DEBUG nova.network.neutron [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.492 2 DEBUG nova.network.neutron [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.552 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.553 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Instance network_info: |[{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.554 2 DEBUG oslo_concurrency.lockutils [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.554 2 DEBUG nova.network.neutron [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.560 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Start _get_guest_xml network_info=[{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.568 2 WARNING nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.574 2 DEBUG nova.virt.libvirt.host [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.575 2 DEBUG nova.virt.libvirt.host [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.581 2 DEBUG nova.virt.libvirt.host [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.582 2 DEBUG nova.virt.libvirt.host [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.584 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.584 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.585 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.585 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.586 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.586 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.587 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.587 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.588 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.588 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.588 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.589 2 DEBUG nova.virt.hardware [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.596 2 DEBUG nova.virt.libvirt.vif [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-984303008',display_name='tempest-tempest.common.compute-instance-984303008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-984303008',id=32,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-zvzpoyu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=04fadc55-8de2-49b0-a4db-9cc05bd5d036,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.597 2 DEBUG nova.network.os_vif_util [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.598 2 DEBUG nova.network.os_vif_util [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:7d:eb,bridge_name='br-int',has_traffic_filtering=True,id=a5a8cb9d-f903-4595-a2d2-d2bacf341918,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5a8cb9d-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.600 2 DEBUG nova.objects.instance [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'pci_devices' on Instance uuid 04fadc55-8de2-49b0-a4db-9cc05bd5d036 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.622 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <uuid>04fadc55-8de2-49b0-a4db-9cc05bd5d036</uuid>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <name>instance-00000020</name>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <nova:name>tempest-tempest.common.compute-instance-984303008</nova:name>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:21:23</nova:creationTime>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         <nova:port uuid="a5a8cb9d-f903-4595-a2d2-d2bacf341918">
Sep 30 21:21:23 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <system>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <entry name="serial">04fadc55-8de2-49b0-a4db-9cc05bd5d036</entry>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <entry name="uuid">04fadc55-8de2-49b0-a4db-9cc05bd5d036</entry>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </system>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <os>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   </os>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <features>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   </features>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.config"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:db:7d:eb"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <target dev="tapa5a8cb9d-f9"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/console.log" append="off"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <video>
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </video>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:21:23 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:21:23 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:21:23 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:21:23 compute-1 nova_compute[192795]: </domain>
Sep 30 21:21:23 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.625 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Preparing to wait for external event network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.626 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.626 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.626 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.628 2 DEBUG nova.virt.libvirt.vif [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-984303008',display_name='tempest-tempest.common.compute-instance-984303008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-984303008',id=32,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-zvzpoyu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=04fadc55-8de2-49b0-a4db-9cc05bd5d036,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.629 2 DEBUG nova.network.os_vif_util [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.630 2 DEBUG nova.network.os_vif_util [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:7d:eb,bridge_name='br-int',has_traffic_filtering=True,id=a5a8cb9d-f903-4595-a2d2-d2bacf341918,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5a8cb9d-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.631 2 DEBUG os_vif [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:7d:eb,bridge_name='br-int',has_traffic_filtering=True,id=a5a8cb9d-f903-4595-a2d2-d2bacf341918,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5a8cb9d-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.633 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.633 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5a8cb9d-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa5a8cb9d-f9, col_values=(('external_ids', {'iface-id': 'a5a8cb9d-f903-4595-a2d2-d2bacf341918', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:7d:eb', 'vm-uuid': '04fadc55-8de2-49b0-a4db-9cc05bd5d036'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-1 NetworkManager[51724]: <info>  [1759267283.6463] manager: (tapa5a8cb9d-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.663 2 INFO os_vif [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:7d:eb,bridge_name='br-int',has_traffic_filtering=True,id=a5a8cb9d-f903-4595-a2d2-d2bacf341918,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5a8cb9d-f9')
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.968 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.968 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.968 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No VIF found with MAC fa:16:3e:db:7d:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:21:23 compute-1 nova_compute[192795]: 2025-09-30 21:21:23.969 2 INFO nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Using config drive
Sep 30 21:21:24 compute-1 nova_compute[192795]: 2025-09-30 21:21:24.404 2 INFO nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Creating config drive at /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.config
Sep 30 21:21:24 compute-1 nova_compute[192795]: 2025-09-30 21:21:24.415 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt0h4tw87 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:24 compute-1 nova_compute[192795]: 2025-09-30 21:21:24.547 2 DEBUG oslo_concurrency.processutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt0h4tw87" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:24 compute-1 kernel: tapa5a8cb9d-f9: entered promiscuous mode
Sep 30 21:21:24 compute-1 NetworkManager[51724]: <info>  [1759267284.6408] manager: (tapa5a8cb9d-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Sep 30 21:21:24 compute-1 nova_compute[192795]: 2025-09-30 21:21:24.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:24 compute-1 ovn_controller[94902]: 2025-09-30T21:21:24Z|00130|binding|INFO|Claiming lport a5a8cb9d-f903-4595-a2d2-d2bacf341918 for this chassis.
Sep 30 21:21:24 compute-1 ovn_controller[94902]: 2025-09-30T21:21:24Z|00131|binding|INFO|a5a8cb9d-f903-4595-a2d2-d2bacf341918: Claiming fa:16:3e:db:7d:eb 10.100.0.13
Sep 30 21:21:24 compute-1 systemd-udevd[225005]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.725 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:7d:eb 10.100.0.13'], port_security=['fa:16:3e:db:7d:eb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '04fadc55-8de2-49b0-a4db-9cc05bd5d036', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69ef8d59-f66f-4bdf-9235-591ecebd585e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=a5a8cb9d-f903-4595-a2d2-d2bacf341918) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.726 103861 INFO neutron.agent.ovn.metadata.agent [-] Port a5a8cb9d-f903-4595-a2d2-d2bacf341918 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 bound to our chassis
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.728 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:24 compute-1 NetworkManager[51724]: <info>  [1759267284.7401] device (tapa5a8cb9d-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:21:24 compute-1 NetworkManager[51724]: <info>  [1759267284.7410] device (tapa5a8cb9d-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:21:24 compute-1 systemd-machined[152783]: New machine qemu-16-instance-00000020.
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.746 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fd79cabd-4c7e-487e-904f-ac733357e508]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.748 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap29d3fdc6-d1 in ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.756 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap29d3fdc6-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.756 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd235a4-18ee-401a-948b-8cbcc55da5ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.757 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[93f9f08f-a46c-477e-8d56-9ac7982e6fd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 nova_compute[192795]: 2025-09-30 21:21:24.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:24 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-00000020.
Sep 30 21:21:24 compute-1 nova_compute[192795]: 2025-09-30 21:21:24.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:24 compute-1 ovn_controller[94902]: 2025-09-30T21:21:24Z|00132|binding|INFO|Setting lport a5a8cb9d-f903-4595-a2d2-d2bacf341918 ovn-installed in OVS
Sep 30 21:21:24 compute-1 ovn_controller[94902]: 2025-09-30T21:21:24Z|00133|binding|INFO|Setting lport a5a8cb9d-f903-4595-a2d2-d2bacf341918 up in Southbound
Sep 30 21:21:24 compute-1 nova_compute[192795]: 2025-09-30 21:21:24.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.774 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[8330f478-1a8f-4dca-96ff-377e94191387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 podman[224967]: 2025-09-30 21:21:24.77770596 +0000 UTC m=+0.149070871 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:21:24 compute-1 podman[224965]: 2025-09-30 21:21:24.784978356 +0000 UTC m=+0.161072635 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.797 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5b51bb0d-829a-4bb5-ab10-08428e19b0ca]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 podman[224966]: 2025-09-30 21:21:24.824284569 +0000 UTC m=+0.201230351 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.828 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[489f7a6f-29c7-4e89-afcf-5bc2ebde07fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 NetworkManager[51724]: <info>  [1759267284.8354] manager: (tap29d3fdc6-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.835 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7849bf-399e-4244-bdc8-184382785247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.873 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c46778cd-6f90-40bf-bc8e-c8b934f58dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.878 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[aae81170-8346-4e2c-9897-a6f505ae0faa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 NetworkManager[51724]: <info>  [1759267284.9031] device (tap29d3fdc6-d0): carrier: link connected
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.910 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6640dcd9-1f5b-4bf9-933d-9f1a4d69a00c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.929 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b22cb81e-36e7-42c0-bfec-c5a09e941131]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398057, 'reachable_time': 32431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225071, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.946 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[baba8d81-8461-4623-aca1-450734684d43]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:f7fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398057, 'tstamp': 398057}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225072, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.965 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[53e17126-23c1-4240-a79c-d63985472511]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398057, 'reachable_time': 32431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225073, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:24.999 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bf7f5cd5-8856-4a2c-87c8-db4d6ac403ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.085 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[94f7a646-afb5-41e3-920d-d7b7eedd7c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.088 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.088 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.089 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29d3fdc6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:25 compute-1 NetworkManager[51724]: <info>  [1759267285.0931] manager: (tap29d3fdc6-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Sep 30 21:21:25 compute-1 kernel: tap29d3fdc6-d0: entered promiscuous mode
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.101 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap29d3fdc6-d0, col_values=(('external_ids', {'iface-id': 'ad63e4cf-251e-40e7-aea0-9713eaa58a32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:25 compute-1 ovn_controller[94902]: 2025-09-30T21:21:25Z|00134|binding|INFO|Releasing lport ad63e4cf-251e-40e7-aea0-9713eaa58a32 from this chassis (sb_readonly=0)
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.105 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/29d3fdc6-d8e1-4032-8f0c-e91da2912153.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/29d3fdc6-d8e1-4032-8f0c-e91da2912153.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.106 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7576a55d-b5a7-4f99-896f-957b44417fb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.107 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/29d3fdc6-d8e1-4032-8f0c-e91da2912153.pid.haproxy
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:21:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:25.109 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'env', 'PROCESS_TAG=haproxy-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/29d3fdc6-d8e1-4032-8f0c-e91da2912153.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.490 2 DEBUG nova.compute.manager [req-3e7ee2ae-2fd0-4008-a175-8ebe40ce7ba4 req-4a99e0d9-4c53-47ad-9093-9181e974e503 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.491 2 DEBUG oslo_concurrency.lockutils [req-3e7ee2ae-2fd0-4008-a175-8ebe40ce7ba4 req-4a99e0d9-4c53-47ad-9093-9181e974e503 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.491 2 DEBUG oslo_concurrency.lockutils [req-3e7ee2ae-2fd0-4008-a175-8ebe40ce7ba4 req-4a99e0d9-4c53-47ad-9093-9181e974e503 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.492 2 DEBUG oslo_concurrency.lockutils [req-3e7ee2ae-2fd0-4008-a175-8ebe40ce7ba4 req-4a99e0d9-4c53-47ad-9093-9181e974e503 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.492 2 DEBUG nova.compute.manager [req-3e7ee2ae-2fd0-4008-a175-8ebe40ce7ba4 req-4a99e0d9-4c53-47ad-9093-9181e974e503 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Processing event network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.564 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.566 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267285.5636263, 04fadc55-8de2-49b0-a4db-9cc05bd5d036 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.566 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] VM Started (Lifecycle Event)
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.572 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.579 2 INFO nova.virt.libvirt.driver [-] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Instance spawned successfully.
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.579 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.601 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.605 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:25 compute-1 podman[225121]: 2025-09-30 21:21:25.614174463 +0000 UTC m=+0.124429415 container create 1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:21:25 compute-1 podman[225121]: 2025-09-30 21:21:25.524282382 +0000 UTC m=+0.034537444 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.648 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.651 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.651 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.652 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.652 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.653 2 DEBUG nova.virt.libvirt.driver [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:25 compute-1 systemd[1]: Started libpod-conmon-1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37.scope.
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.670 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.671 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267285.5647147, 04fadc55-8de2-49b0-a4db-9cc05bd5d036 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.671 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] VM Paused (Lifecycle Event)
Sep 30 21:21:25 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:21:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9577bea85ffe8826fa980364590004bff0c77d86803c9f4de8789086ec5d636/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:21:25 compute-1 podman[225121]: 2025-09-30 21:21:25.703744054 +0000 UTC m=+0.213999026 container init 1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.714 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:25 compute-1 podman[225121]: 2025-09-30 21:21:25.715851231 +0000 UTC m=+0.226106183 container start 1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.721 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267285.571373, 04fadc55-8de2-49b0-a4db-9cc05bd5d036 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.722 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] VM Resumed (Lifecycle Event)
Sep 30 21:21:25 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [NOTICE]   (225145) : New worker (225147) forked
Sep 30 21:21:25 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [NOTICE]   (225145) : Loading success.
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.775 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.787 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.815 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.820 2 INFO nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Took 5.51 seconds to spawn the instance on the hypervisor.
Sep 30 21:21:25 compute-1 nova_compute[192795]: 2025-09-30 21:21:25.821 2 DEBUG nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:26 compute-1 nova_compute[192795]: 2025-09-30 21:21:26.100 2 INFO nova.compute.manager [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Took 6.37 seconds to build instance.
Sep 30 21:21:26 compute-1 nova_compute[192795]: 2025-09-30 21:21:26.159 2 DEBUG oslo_concurrency.lockutils [None req-f5325f77-b93b-4f97-a696-0b32c08f9c5e c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:26 compute-1 nova_compute[192795]: 2025-09-30 21:21:26.169 2 DEBUG nova.network.neutron [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updated VIF entry in instance network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:26 compute-1 nova_compute[192795]: 2025-09-30 21:21:26.170 2 DEBUG nova.network.neutron [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:26 compute-1 nova_compute[192795]: 2025-09-30 21:21:26.223 2 DEBUG oslo_concurrency.lockutils [req-9de1cfb7-5e80-4872-ac3b-59102ce0d0b8 req-08a30d7c-04e9-4263-911d-7646bc0a1813 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:26 compute-1 nova_compute[192795]: 2025-09-30 21:21:26.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:27 compute-1 nova_compute[192795]: 2025-09-30 21:21:27.635 2 DEBUG nova.compute.manager [req-ca378bd8-958e-4beb-abcc-fd779dac7793 req-d131c536-67c4-4e08-96b0-52bcf127f854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:27 compute-1 nova_compute[192795]: 2025-09-30 21:21:27.636 2 DEBUG oslo_concurrency.lockutils [req-ca378bd8-958e-4beb-abcc-fd779dac7793 req-d131c536-67c4-4e08-96b0-52bcf127f854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:27 compute-1 nova_compute[192795]: 2025-09-30 21:21:27.636 2 DEBUG oslo_concurrency.lockutils [req-ca378bd8-958e-4beb-abcc-fd779dac7793 req-d131c536-67c4-4e08-96b0-52bcf127f854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:27 compute-1 nova_compute[192795]: 2025-09-30 21:21:27.637 2 DEBUG oslo_concurrency.lockutils [req-ca378bd8-958e-4beb-abcc-fd779dac7793 req-d131c536-67c4-4e08-96b0-52bcf127f854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:27 compute-1 nova_compute[192795]: 2025-09-30 21:21:27.637 2 DEBUG nova.compute.manager [req-ca378bd8-958e-4beb-abcc-fd779dac7793 req-d131c536-67c4-4e08-96b0-52bcf127f854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] No waiting events found dispatching network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:27 compute-1 nova_compute[192795]: 2025-09-30 21:21:27.637 2 WARNING nova.compute.manager [req-ca378bd8-958e-4beb-abcc-fd779dac7793 req-d131c536-67c4-4e08-96b0-52bcf127f854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received unexpected event network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 for instance with vm_state active and task_state None.
Sep 30 21:21:27 compute-1 nova_compute[192795]: 2025-09-30 21:21:27.951 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:21:28 compute-1 podman[225156]: 2025-09-30 21:21:28.236321988 +0000 UTC m=+0.071523994 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:21:28 compute-1 nova_compute[192795]: 2025-09-30 21:21:28.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:29 compute-1 NetworkManager[51724]: <info>  [1759267289.1268] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:29 compute-1 NetworkManager[51724]: <info>  [1759267289.1277] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:29 compute-1 ovn_controller[94902]: 2025-09-30T21:21:29Z|00135|binding|INFO|Releasing lport ad63e4cf-251e-40e7-aea0-9713eaa58a32 from this chassis (sb_readonly=0)
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.542 2 DEBUG nova.compute.manager [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.543 2 DEBUG nova.compute.manager [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing instance network info cache due to event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.545 2 DEBUG oslo_concurrency.lockutils [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.545 2 DEBUG oslo_concurrency.lockutils [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:29 compute-1 nova_compute[192795]: 2025-09-30 21:21:29.545 2 DEBUG nova.network.neutron [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:30 compute-1 nova_compute[192795]: 2025-09-30 21:21:30.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:30 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Sep 30 21:21:30 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001e.scope: Consumed 13.380s CPU time.
Sep 30 21:21:30 compute-1 systemd-machined[152783]: Machine qemu-15-instance-0000001e terminated.
Sep 30 21:21:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:30.606 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:30.608 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:21:30 compute-1 nova_compute[192795]: 2025-09-30 21:21:30.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:30 compute-1 nova_compute[192795]: 2025-09-30 21:21:30.968 2 INFO nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance shutdown successfully after 13 seconds.
Sep 30 21:21:30 compute-1 nova_compute[192795]: 2025-09-30 21:21:30.976 2 INFO nova.virt.libvirt.driver [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance destroyed successfully.
Sep 30 21:21:30 compute-1 nova_compute[192795]: 2025-09-30 21:21:30.983 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.092 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.095 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.153 2 DEBUG nova.network.neutron [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updated VIF entry in instance network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.154 2 DEBUG nova.network.neutron [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.157 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.158 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_resize/disk /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.185 2 DEBUG oslo_concurrency.lockutils [req-57339e6f-f2c2-42eb-bb22-598dd4291ac8 req-6176f0be-6823-48d8-91a4-1964a5fa7dd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.186 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "cp -r /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_resize/disk /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.187 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_resize/disk.config /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.217 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "cp -r /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_resize/disk.config /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.219 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_resize/disk.info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.250 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "cp -r /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_resize/disk.info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.401 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "456a6c22-b801-4d95-aa63-be64cd8e4b53-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.402 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.403 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.588 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.589 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.590 2 DEBUG nova.network.neutron [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:31 compute-1 nova_compute[192795]: 2025-09-30 21:21:31.776 2 DEBUG nova.network.neutron [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.020 2 DEBUG nova.network.neutron [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.062 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.227 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.230 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.230 2 INFO nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Creating image(s)
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.232 2 DEBUG nova.objects.instance [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'trusted_certs' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.251 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.307 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.310 2 DEBUG nova.virt.disk.api [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Checking if we can resize image /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.310 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.388 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.390 2 DEBUG nova.virt.disk.api [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Cannot resize image /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.407 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.407 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Ensure instance console log exists: /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.408 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.408 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.409 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.411 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.416 2 WARNING nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.421 2 DEBUG nova.virt.libvirt.host [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.422 2 DEBUG nova.virt.libvirt.host [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.426 2 DEBUG nova.virt.libvirt.host [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.426 2 DEBUG nova.virt.libvirt.host [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.427 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.428 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.428 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.428 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.429 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.429 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.429 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.429 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.430 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.430 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.430 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.431 2 DEBUG nova.virt.hardware [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.431 2 DEBUG nova.objects.instance [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'vcpu_model' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.450 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.507 2 DEBUG oslo_concurrency.processutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.508 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.509 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.509 2 DEBUG oslo_concurrency.lockutils [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.512 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <uuid>456a6c22-b801-4d95-aa63-be64cd8e4b53</uuid>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <name>instance-0000001e</name>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <nova:name>tempest-MigrationsAdminTest-server-936283323</nova:name>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:21:32</nova:creationTime>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:21:32 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:21:32 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:21:32 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:21:32 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:32 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:21:32 compute-1 nova_compute[192795]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:21:32 compute-1 nova_compute[192795]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <system>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <entry name="serial">456a6c22-b801-4d95-aa63-be64cd8e4b53</entry>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <entry name="uuid">456a6c22-b801-4d95-aa63-be64cd8e4b53</entry>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </system>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <os>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   </os>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <features>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   </features>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.config"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/console.log" append="off"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <video>
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </video>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:21:32 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:21:32 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:21:32 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:21:32 compute-1 nova_compute[192795]: </domain>
Sep 30 21:21:32 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.588 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.601 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:32 compute-1 nova_compute[192795]: 2025-09-30 21:21:32.603 2 INFO nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Using config drive
Sep 30 21:21:32 compute-1 systemd-machined[152783]: New machine qemu-17-instance-0000001e.
Sep 30 21:21:32 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-0000001e.
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.378 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 456a6c22-b801-4d95-aa63-be64cd8e4b53 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.380 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267293.3785398, 456a6c22-b801-4d95-aa63-be64cd8e4b53 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.381 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] VM Resumed (Lifecycle Event)
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.383 2 DEBUG nova.compute.manager [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.389 2 INFO nova.virt.libvirt.driver [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance running successfully.
Sep 30 21:21:33 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.392 2 DEBUG nova.virt.libvirt.guest [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.393 2 DEBUG nova.virt.libvirt.driver [None req-15cd319f-d027-483b-919c-ca8c13ac4bc8 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.400 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.404 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.444 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.444 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267293.3796036, 456a6c22-b801-4d95-aa63-be64cd8e4b53 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.445 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] VM Started (Lifecycle Event)
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.471 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.475 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.716 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:33 compute-1 nova_compute[192795]: 2025-09-30 21:21:33.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:34.611 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:35 compute-1 nova_compute[192795]: 2025-09-30 21:21:35.276 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:35 compute-1 nova_compute[192795]: 2025-09-30 21:21:35.277 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:35 compute-1 nova_compute[192795]: 2025-09-30 21:21:35.278 2 DEBUG nova.compute.manager [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Going to confirm migration 8 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Sep 30 21:21:35 compute-1 nova_compute[192795]: 2025-09-30 21:21:35.313 2 DEBUG nova.objects.instance [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'info_cache' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:36 compute-1 nova_compute[192795]: 2025-09-30 21:21:36.264 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:36 compute-1 nova_compute[192795]: 2025-09-30 21:21:36.265 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:36 compute-1 nova_compute[192795]: 2025-09-30 21:21:36.265 2 DEBUG nova.network.neutron [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:36 compute-1 nova_compute[192795]: 2025-09-30 21:21:36.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:37 compute-1 nova_compute[192795]: 2025-09-30 21:21:37.249 2 DEBUG nova.network.neutron [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:37 compute-1 podman[225236]: 2025-09-30 21:21:37.267612716 +0000 UTC m=+0.090510788 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:21:37 compute-1 nova_compute[192795]: 2025-09-30 21:21:37.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:37 compute-1 ovn_controller[94902]: 2025-09-30T21:21:37Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:7d:eb 10.100.0.13
Sep 30 21:21:37 compute-1 ovn_controller[94902]: 2025-09-30T21:21:37Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:7d:eb 10.100.0.13
Sep 30 21:21:37 compute-1 nova_compute[192795]: 2025-09-30 21:21:37.887 2 DEBUG nova.network.neutron [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:37 compute-1 nova_compute[192795]: 2025-09-30 21:21:37.908 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:37 compute-1 nova_compute[192795]: 2025-09-30 21:21:37.909 2 DEBUG nova.objects.instance [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'migration_context' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:37 compute-1 nova_compute[192795]: 2025-09-30 21:21:37.924 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:37 compute-1 nova_compute[192795]: 2025-09-30 21:21:37.924 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:38 compute-1 nova_compute[192795]: 2025-09-30 21:21:38.039 2 DEBUG nova.compute.provider_tree [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:38 compute-1 nova_compute[192795]: 2025-09-30 21:21:38.055 2 DEBUG nova.scheduler.client.report [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:38 compute-1 nova_compute[192795]: 2025-09-30 21:21:38.125 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:38 compute-1 nova_compute[192795]: 2025-09-30 21:21:38.286 2 INFO nova.scheduler.client.report [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Deleted allocation for migration f9a0f17b-d4d0-402b-be52-d0d84fecdb35
Sep 30 21:21:38 compute-1 nova_compute[192795]: 2025-09-30 21:21:38.386 2 DEBUG oslo_concurrency.lockutils [None req-c9339f70-2a4a-40de-a64e-f894dfde3561 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:38 compute-1 nova_compute[192795]: 2025-09-30 21:21:38.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:38.682 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:38.683 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:38.684 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:38 compute-1 nova_compute[192795]: 2025-09-30 21:21:38.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:39 compute-1 podman[225260]: 2025-09-30 21:21:39.228566267 +0000 UTC m=+0.063333073 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:21:39 compute-1 podman[225259]: 2025-09-30 21:21:39.24938566 +0000 UTC m=+0.092564173 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:21:39 compute-1 nova_compute[192795]: 2025-09-30 21:21:39.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:39 compute-1 nova_compute[192795]: 2025-09-30 21:21:39.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.742 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.744 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.744 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.744 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.835 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.891 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.893 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.945 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:40 compute-1 nova_compute[192795]: 2025-09-30 21:21:40.953 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.009 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.010 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.091 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.236 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.239 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5395MB free_disk=73.4073257446289GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.239 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.240 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.366 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 456a6c22-b801-4d95-aa63-be64cd8e4b53 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.366 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 04fadc55-8de2-49b0-a4db-9cc05bd5d036 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.367 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.367 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.438 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.489 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.530 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.531 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:41 compute-1 nova_compute[192795]: 2025-09-30 21:21:41.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:42 compute-1 ovn_controller[94902]: 2025-09-30T21:21:42Z|00136|binding|INFO|Releasing lport ad63e4cf-251e-40e7-aea0-9713eaa58a32 from this chassis (sb_readonly=0)
Sep 30 21:21:42 compute-1 nova_compute[192795]: 2025-09-30 21:21:42.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:43 compute-1 nova_compute[192795]: 2025-09-30 21:21:43.531 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:43 compute-1 nova_compute[192795]: 2025-09-30 21:21:43.532 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:21:43 compute-1 nova_compute[192795]: 2025-09-30 21:21:43.532 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:21:43 compute-1 nova_compute[192795]: 2025-09-30 21:21:43.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:44 compute-1 nova_compute[192795]: 2025-09-30 21:21:44.253 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:44 compute-1 nova_compute[192795]: 2025-09-30 21:21:44.254 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:44 compute-1 nova_compute[192795]: 2025-09-30 21:21:44.255 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:21:44 compute-1 nova_compute[192795]: 2025-09-30 21:21:44.255 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:44 compute-1 nova_compute[192795]: 2025-09-30 21:21:44.472 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.280 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.307 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.307 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.308 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.344 2 DEBUG nova.compute.manager [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.345 2 DEBUG nova.compute.manager [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing instance network info cache due to event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.345 2 DEBUG oslo_concurrency.lockutils [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.345 2 DEBUG oslo_concurrency.lockutils [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:45 compute-1 nova_compute[192795]: 2025-09-30 21:21:45.346 2 DEBUG nova.network.neutron [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.240 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "857a84cb-03ec-4e88-a3e8-da80fda2c446" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.241 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.258 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.361 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.361 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.367 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.368 2 INFO nova.compute.claims [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.558 2 DEBUG nova.compute.provider_tree [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.578 2 DEBUG nova.scheduler.client.report [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.601 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.602 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.680 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.681 2 DEBUG nova.network.neutron [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.710 2 INFO nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.733 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.902 2 DEBUG nova.network.neutron [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updated VIF entry in instance network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.903 2 DEBUG nova.network.neutron [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.908 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.909 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.909 2 INFO nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Creating image(s)
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.910 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.910 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.911 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.924 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:46 compute-1 nova_compute[192795]: 2025-09-30 21:21:46.952 2 DEBUG oslo_concurrency.lockutils [req-d984f2dc-e9df-48af-b7cc-f85b1294c770 req-9079a20b-69c2-4037-82d4-8740df20421c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.007 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.008 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.009 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.025 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.089 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.090 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.130 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.131 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.132 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.204 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.205 2 DEBUG nova.virt.disk.api [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Checking if we can resize image /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.205 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.268 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.269 2 DEBUG nova.virt.disk.api [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Cannot resize image /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.270 2 DEBUG nova.objects.instance [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'migration_context' on Instance uuid 857a84cb-03ec-4e88-a3e8-da80fda2c446 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.304 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.305 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Ensure instance console log exists: /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.306 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.306 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.307 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.365 2 DEBUG nova.network.neutron [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.366 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.368 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.375 2 WARNING nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.380 2 DEBUG nova.virt.libvirt.host [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.381 2 DEBUG nova.virt.libvirt.host [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.385 2 DEBUG nova.virt.libvirt.host [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.386 2 DEBUG nova.virt.libvirt.host [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.387 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.388 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:21:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3263a425-4ab0-4c30-af21-557885cc2a70',id=28,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-344349667',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.390 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.390 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.391 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.391 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.392 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.392 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.393 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.393 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.393 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.394 2 DEBUG nova.virt.hardware [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.399 2 DEBUG nova.objects.instance [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'pci_devices' on Instance uuid 857a84cb-03ec-4e88-a3e8-da80fda2c446 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.414 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <uuid>857a84cb-03ec-4e88-a3e8-da80fda2c446</uuid>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <name>instance-00000023</name>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <nova:name>tempest-MigrationsAdminTest-server-70232550</nova:name>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:21:47</nova:creationTime>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <nova:flavor name="tempest-test_resize_flavor_-344349667">
Sep 30 21:21:47 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:21:47 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:21:47 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:21:47 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:47 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:21:47 compute-1 nova_compute[192795]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:21:47 compute-1 nova_compute[192795]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <system>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <entry name="serial">857a84cb-03ec-4e88-a3e8-da80fda2c446</entry>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <entry name="uuid">857a84cb-03ec-4e88-a3e8-da80fda2c446</entry>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </system>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <os>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   </os>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <features>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   </features>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/console.log" append="off"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <video>
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </video>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:21:47 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:21:47 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:21:47 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:21:47 compute-1 nova_compute[192795]: </domain>
Sep 30 21:21:47 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.471 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.471 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.472 2 INFO nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Using config drive
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.795 2 INFO nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Creating config drive at /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.800 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpegt7sicq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:47 compute-1 nova_compute[192795]: 2025-09-30 21:21:47.931 2 DEBUG oslo_concurrency.processutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpegt7sicq" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:48 compute-1 systemd-machined[152783]: New machine qemu-18-instance-00000023.
Sep 30 21:21:48 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-00000023.
Sep 30 21:21:48 compute-1 podman[225367]: 2025-09-30 21:21:48.366386394 +0000 UTC m=+0.080175418 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.718 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267308.717524, 857a84cb-03ec-4e88-a3e8-da80fda2c446 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.718 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] VM Resumed (Lifecycle Event)
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.721 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.722 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.725 2 INFO nova.virt.libvirt.driver [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance spawned successfully.
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.726 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.768 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.772 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.773 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.773 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.775 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.776 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.776 2 DEBUG nova.virt.libvirt.driver [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.779 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.819 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.819 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267308.7189796, 857a84cb-03ec-4e88-a3e8-da80fda2c446 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.819 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] VM Started (Lifecycle Event)
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.840 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.844 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.858 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.874 2 INFO nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Took 1.97 seconds to spawn the instance on the hypervisor.
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.875 2 DEBUG nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:48 compute-1 nova_compute[192795]: 2025-09-30 21:21:48.993 2 INFO nova.compute.manager [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Took 2.66 seconds to build instance.
Sep 30 21:21:49 compute-1 nova_compute[192795]: 2025-09-30 21:21:49.028 2 DEBUG oslo_concurrency.lockutils [None req-e8946f3b-b668-4bb8-b314-dbcbd5b0258f f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:50 compute-1 nova_compute[192795]: 2025-09-30 21:21:50.465 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:50 compute-1 nova_compute[192795]: 2025-09-30 21:21:50.536 2 DEBUG oslo_concurrency.lockutils [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "interface-04fadc55-8de2-49b0-a4db-9cc05bd5d036-5008167c-4cca-4768-963e-ab0119180625" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:50 compute-1 nova_compute[192795]: 2025-09-30 21:21:50.537 2 DEBUG oslo_concurrency.lockutils [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-04fadc55-8de2-49b0-a4db-9cc05bd5d036-5008167c-4cca-4768-963e-ab0119180625" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:50 compute-1 nova_compute[192795]: 2025-09-30 21:21:50.538 2 DEBUG nova.objects.instance [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'flavor' on Instance uuid 04fadc55-8de2-49b0-a4db-9cc05bd5d036 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.004 2 DEBUG nova.compute.manager [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.005 2 DEBUG nova.compute.manager [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing instance network info cache due to event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.005 2 DEBUG oslo_concurrency.lockutils [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.005 2 DEBUG oslo_concurrency.lockutils [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.005 2 DEBUG nova.network.neutron [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.279 2 DEBUG nova.objects.instance [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'pci_requests' on Instance uuid 04fadc55-8de2-49b0-a4db-9cc05bd5d036 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.305 2 DEBUG nova.network.neutron [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.750 2 DEBUG nova.policy [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c536ea061a32492a8c5e6bf941d1c9f3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47d2c796445c4dd3affc8594502f04be', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:21:51 compute-1 nova_compute[192795]: 2025-09-30 21:21:51.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.281 2 DEBUG nova.network.neutron [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updated VIF entry in instance network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.282 2 DEBUG nova.network.neutron [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.319 2 DEBUG oslo_concurrency.lockutils [req-69eaa29b-89d1-4b73-82f7-7f642bd05981 req-e133d852-6042-4f53-8d13-b3d85b63b0b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.586 2 DEBUG nova.network.neutron [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Successfully updated port: 5008167c-4cca-4768-963e-ab0119180625 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.619 2 DEBUG oslo_concurrency.lockutils [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.619 2 DEBUG oslo_concurrency.lockutils [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.619 2 DEBUG nova.network.neutron [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.800 2 WARNING nova.network.neutron [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] 29d3fdc6-d8e1-4032-8f0c-e91da2912153 already exists in list: networks containing: ['29d3fdc6-d8e1-4032-8f0c-e91da2912153']. ignoring it
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.933 2 DEBUG nova.compute.manager [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-changed-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.934 2 DEBUG nova.compute.manager [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing instance network info cache due to event network-changed-5008167c-4cca-4768-963e-ab0119180625. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:52 compute-1 nova_compute[192795]: 2025-09-30 21:21:52.935 2 DEBUG oslo_concurrency.lockutils [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:53 compute-1 nova_compute[192795]: 2025-09-30 21:21:53.352 2 DEBUG oslo_concurrency.lockutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:53 compute-1 nova_compute[192795]: 2025-09-30 21:21:53.353 2 DEBUG oslo_concurrency.lockutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:53 compute-1 nova_compute[192795]: 2025-09-30 21:21:53.353 2 DEBUG nova.network.neutron [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:53 compute-1 nova_compute[192795]: 2025-09-30 21:21:53.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:54 compute-1 nova_compute[192795]: 2025-09-30 21:21:54.306 2 DEBUG nova.network.neutron [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:54 compute-1 nova_compute[192795]: 2025-09-30 21:21:54.641 2 DEBUG nova.network.neutron [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:54 compute-1 nova_compute[192795]: 2025-09-30 21:21:54.653 2 DEBUG oslo_concurrency.lockutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:54 compute-1 nova_compute[192795]: 2025-09-30 21:21:54.767 2 DEBUG nova.virt.libvirt.driver [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:21:54 compute-1 nova_compute[192795]: 2025-09-30 21:21:54.767 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Creating file /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/9344ac6528ee48bab93e955e482a3dea.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:21:54 compute-1 nova_compute[192795]: 2025-09-30 21:21:54.768 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/9344ac6528ee48bab93e955e482a3dea.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:55 compute-1 podman[225393]: 2025-09-30 21:21:55.257425014 +0000 UTC m=+0.076698415 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:21:55 compute-1 podman[225391]: 2025-09-30 21:21:55.257878086 +0000 UTC m=+0.090365343 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:21:55 compute-1 podman[225392]: 2025-09-30 21:21:55.278810062 +0000 UTC m=+0.102350338 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:21:55 compute-1 nova_compute[192795]: 2025-09-30 21:21:55.364 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/9344ac6528ee48bab93e955e482a3dea.tmp" returned: 1 in 0.596s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:55 compute-1 nova_compute[192795]: 2025-09-30 21:21:55.365 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/9344ac6528ee48bab93e955e482a3dea.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:21:55 compute-1 nova_compute[192795]: 2025-09-30 21:21:55.365 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Creating directory /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:21:55 compute-1 nova_compute[192795]: 2025-09-30 21:21:55.366 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:55 compute-1 nova_compute[192795]: 2025-09-30 21:21:55.616 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446" returned: 0 in 0.250s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:55 compute-1 nova_compute[192795]: 2025-09-30 21:21:55.622 2 DEBUG nova.virt.libvirt.driver [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:21:56 compute-1 nova_compute[192795]: 2025-09-30 21:21:56.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.341 2 DEBUG nova.network.neutron [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.364 2 DEBUG oslo_concurrency.lockutils [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.365 2 DEBUG oslo_concurrency.lockutils [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.366 2 DEBUG nova.network.neutron [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing network info cache for port 5008167c-4cca-4768-963e-ab0119180625 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.369 2 DEBUG nova.virt.libvirt.vif [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-984303008',display_name='tempest-tempest.common.compute-instance-984303008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-984303008',id=32,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-zvzpoyu0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=04fadc55-8de2-49b0-a4db-9cc05bd5d036,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.370 2 DEBUG nova.network.os_vif_util [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.370 2 DEBUG nova.network.os_vif_util [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.371 2 DEBUG os_vif [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.372 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.372 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.376 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5008167c-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.376 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5008167c-4c, col_values=(('external_ids', {'iface-id': '5008167c-4cca-4768-963e-ab0119180625', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:a2:3f', 'vm-uuid': '04fadc55-8de2-49b0-a4db-9cc05bd5d036'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:57 compute-1 NetworkManager[51724]: <info>  [1759267317.3809] manager: (tap5008167c-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.392 2 INFO os_vif [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c')
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.393 2 DEBUG nova.virt.libvirt.vif [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-984303008',display_name='tempest-tempest.common.compute-instance-984303008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-984303008',id=32,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-zvzpoyu0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=04fadc55-8de2-49b0-a4db-9cc05bd5d036,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.393 2 DEBUG nova.network.os_vif_util [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.394 2 DEBUG nova.network.os_vif_util [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.396 2 DEBUG nova.virt.libvirt.guest [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] attach device xml: <interface type="ethernet">
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <mac address="fa:16:3e:3c:a2:3f"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <model type="virtio"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <mtu size="1442"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <target dev="tap5008167c-4c"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]: </interface>
Sep 30 21:21:57 compute-1 nova_compute[192795]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Sep 30 21:21:57 compute-1 kernel: tap5008167c-4c: entered promiscuous mode
Sep 30 21:21:57 compute-1 ovn_controller[94902]: 2025-09-30T21:21:57Z|00137|binding|INFO|Claiming lport 5008167c-4cca-4768-963e-ab0119180625 for this chassis.
Sep 30 21:21:57 compute-1 ovn_controller[94902]: 2025-09-30T21:21:57Z|00138|binding|INFO|5008167c-4cca-4768-963e-ab0119180625: Claiming fa:16:3e:3c:a2:3f 10.100.0.11
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 NetworkManager[51724]: <info>  [1759267317.4297] manager: (tap5008167c-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.428 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:a2:3f 10.100.0.11'], port_security=['fa:16:3e:3c:a2:3f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '04fadc55-8de2-49b0-a4db-9cc05bd5d036', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0583c73e-88bf-4029-98b4-7475adfa8c7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=5008167c-4cca-4768-963e-ab0119180625) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.429 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 5008167c-4cca-4768-963e-ab0119180625 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 bound to our chassis
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.430 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:57 compute-1 ovn_controller[94902]: 2025-09-30T21:21:57Z|00139|binding|INFO|Setting lport 5008167c-4cca-4768-963e-ab0119180625 ovn-installed in OVS
Sep 30 21:21:57 compute-1 ovn_controller[94902]: 2025-09-30T21:21:57Z|00140|binding|INFO|Setting lport 5008167c-4cca-4768-963e-ab0119180625 up in Southbound
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.453 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[769a82d7-0711-4fd7-925b-c647a85dcb00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:57 compute-1 systemd-udevd[225470]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:21:57 compute-1 NetworkManager[51724]: <info>  [1759267317.4846] device (tap5008167c-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:21:57 compute-1 NetworkManager[51724]: <info>  [1759267317.4864] device (tap5008167c-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.494 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c232a7-8950-479f-a3e5-0bf424c4c174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.499 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[1984cdc8-24e9-4fda-ae93-83614a6dd30c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.539 2 DEBUG nova.virt.libvirt.driver [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.540 2 DEBUG nova.virt.libvirt.driver [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.540 2 DEBUG nova.virt.libvirt.driver [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No VIF found with MAC fa:16:3e:db:7d:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.540 2 DEBUG nova.virt.libvirt.driver [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No VIF found with MAC fa:16:3e:3c:a2:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.547 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5589dc74-3775-4af1-8106-2d648bdb6675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.568 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e46e7a4c-c990-4d5e-a7f5-db209bbd20c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398057, 'reachable_time': 32431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225476, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.571 2 DEBUG nova.virt.libvirt.guest [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <nova:name>tempest-tempest.common.compute-instance-984303008</nova:name>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <nova:creationTime>2025-09-30 21:21:57</nova:creationTime>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <nova:flavor name="m1.nano">
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:memory>128</nova:memory>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:disk>1</nova:disk>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:swap>0</nova:swap>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   </nova:flavor>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <nova:owner>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   </nova:owner>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   <nova:ports>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:port uuid="a5a8cb9d-f903-4595-a2d2-d2bacf341918">
Sep 30 21:21:57 compute-1 nova_compute[192795]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     </nova:port>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     <nova:port uuid="5008167c-4cca-4768-963e-ab0119180625">
Sep 30 21:21:57 compute-1 nova_compute[192795]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:21:57 compute-1 nova_compute[192795]:     </nova:port>
Sep 30 21:21:57 compute-1 nova_compute[192795]:   </nova:ports>
Sep 30 21:21:57 compute-1 nova_compute[192795]: </nova:instance>
Sep 30 21:21:57 compute-1 nova_compute[192795]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.587 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[be58849d-c81f-4ce0-8d19-e717a438dd76]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398070, 'tstamp': 398070}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225477, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398074, 'tstamp': 398074}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225477, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.589 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.593 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29d3fdc6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.593 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.594 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap29d3fdc6-d0, col_values=(('external_ids', {'iface-id': 'ad63e4cf-251e-40e7-aea0-9713eaa58a32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:57.594 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:57 compute-1 nova_compute[192795]: 2025-09-30 21:21:57.601 2 DEBUG oslo_concurrency.lockutils [None req-1565c156-188b-460a-92c6-05f020e5d378 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-04fadc55-8de2-49b0-a4db-9cc05bd5d036-5008167c-4cca-4768-963e-ab0119180625" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.045 2 DEBUG nova.compute.manager [req-eada69c2-ab09-4c25-8127-27b59711d3dc req-f6a10db1-cf82-4c8f-a4a4-e9c3a7a702a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.046 2 DEBUG oslo_concurrency.lockutils [req-eada69c2-ab09-4c25-8127-27b59711d3dc req-f6a10db1-cf82-4c8f-a4a4-e9c3a7a702a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.047 2 DEBUG oslo_concurrency.lockutils [req-eada69c2-ab09-4c25-8127-27b59711d3dc req-f6a10db1-cf82-4c8f-a4a4-e9c3a7a702a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.047 2 DEBUG oslo_concurrency.lockutils [req-eada69c2-ab09-4c25-8127-27b59711d3dc req-f6a10db1-cf82-4c8f-a4a4-e9c3a7a702a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.048 2 DEBUG nova.compute.manager [req-eada69c2-ab09-4c25-8127-27b59711d3dc req-f6a10db1-cf82-4c8f-a4a4-e9c3a7a702a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] No waiting events found dispatching network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.048 2 WARNING nova.compute.manager [req-eada69c2-ab09-4c25-8127-27b59711d3dc req-f6a10db1-cf82-4c8f-a4a4-e9c3a7a702a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received unexpected event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.935 2 DEBUG oslo_concurrency.lockutils [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "interface-04fadc55-8de2-49b0-a4db-9cc05bd5d036-5008167c-4cca-4768-963e-ab0119180625" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.936 2 DEBUG oslo_concurrency.lockutils [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-04fadc55-8de2-49b0-a4db-9cc05bd5d036-5008167c-4cca-4768-963e-ab0119180625" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:58 compute-1 nova_compute[192795]: 2025-09-30 21:21:58.957 2 DEBUG nova.objects.instance [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'flavor' on Instance uuid 04fadc55-8de2-49b0-a4db-9cc05bd5d036 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.005 2 DEBUG nova.virt.libvirt.vif [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-984303008',display_name='tempest-tempest.common.compute-instance-984303008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-984303008',id=32,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-zvzpoyu0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=04fadc55-8de2-49b0-a4db-9cc05bd5d036,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.007 2 DEBUG nova.network.os_vif_util [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.008 2 DEBUG nova.network.os_vif_util [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.012 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.016 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.019 2 DEBUG nova.virt.libvirt.driver [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Attempting to detach device tap5008167c-4c from instance 04fadc55-8de2-49b0-a4db-9cc05bd5d036 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.019 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] detach device xml: <interface type="ethernet">
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <mac address="fa:16:3e:3c:a2:3f"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <model type="virtio"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <mtu size="1442"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <target dev="tap5008167c-4c"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]: </interface>
Sep 30 21:21:59 compute-1 nova_compute[192795]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.033 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.037 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface>not found in domain: <domain type='kvm' id='16'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <name>instance-00000020</name>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <uuid>04fadc55-8de2-49b0-a4db-9cc05bd5d036</uuid>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:name>tempest-tempest.common.compute-instance-984303008</nova:name>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:creationTime>2025-09-30 21:21:57</nova:creationTime>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:flavor name="m1.nano">
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:memory>128</nova:memory>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:disk>1</nova:disk>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:swap>0</nova:swap>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:flavor>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:owner>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:owner>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:ports>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:port uuid="a5a8cb9d-f903-4595-a2d2-d2bacf341918">
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </nova:port>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:port uuid="5008167c-4cca-4768-963e-ab0119180625">
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </nova:port>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:ports>
Sep 30 21:21:59 compute-1 nova_compute[192795]: </nova:instance>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <memory unit='KiB'>131072</memory>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <currentMemory unit='KiB'>131072</currentMemory>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <vcpu placement='static'>1</vcpu>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <resource>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <partition>/machine</partition>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </resource>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <sysinfo type='smbios'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <system>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='manufacturer'>RDO</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='product'>OpenStack Compute</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='serial'>04fadc55-8de2-49b0-a4db-9cc05bd5d036</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='uuid'>04fadc55-8de2-49b0-a4db-9cc05bd5d036</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='family'>Virtual Machine</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </system>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <os>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <boot dev='hd'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <smbios mode='sysinfo'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </os>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <features>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <vmcoreinfo state='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </features>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <cpu mode='custom' match='exact' check='full'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <model fallback='forbid'>Nehalem</model>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <feature policy='require' name='x2apic'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <feature policy='require' name='hypervisor'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <feature policy='require' name='vme'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <clock offset='utc'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <timer name='pit' tickpolicy='delay'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <timer name='rtc' tickpolicy='catchup'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <timer name='hpet' present='no'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <on_poweroff>destroy</on_poweroff>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <on_reboot>restart</on_reboot>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <on_crash>destroy</on_crash>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <disk type='file' device='disk'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <driver name='qemu' type='qcow2' cache='none'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk' index='2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <backingStore type='file' index='3'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <format type='raw'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <source file='/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <backingStore/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       </backingStore>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target dev='vda' bus='virtio'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='virtio-disk0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <disk type='file' device='cdrom'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <driver name='qemu' type='raw' cache='none'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.config' index='1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <backingStore/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target dev='sda' bus='sata'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <readonly/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='sata0-0-0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='0' model='pcie-root'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pcie.0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='1' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='1' port='0x10'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='2' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='2' port='0x11'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='3' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='3' port='0x12'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='4' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='4' port='0x13'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='5' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='5' port='0x14'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='6' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='6' port='0x15'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='7' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='7' port='0x16'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='8' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='8' port='0x17'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.8'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='9' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='9' port='0x18'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.9'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='10' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='10' port='0x19'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.10'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='11' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='11' port='0x1a'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.11'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='12' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='12' port='0x1b'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.12'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='13' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='13' port='0x1c'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.13'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='14' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='14' port='0x1d'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.14'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='15' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='15' port='0x1e'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.15'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='16' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='16' port='0x1f'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.16'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='17' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='17' port='0x20'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.17'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='18' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='18' port='0x21'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.18'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='19' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='19' port='0x22'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.19'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='20' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='20' port='0x23'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.20'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='21' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='21' port='0x24'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.21'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='22' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='22' port='0x25'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.22'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='23' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='23' port='0x26'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.23'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='24' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='24' port='0x27'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.24'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='25' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='25' port='0x28'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.25'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-pci-bridge'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.26'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='usb' index='0' model='piix3-uhci'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='usb'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='sata' index='0'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='ide'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <interface type='ethernet'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <mac address='fa:16:3e:db:7d:eb'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target dev='tapa5a8cb9d-f9'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model type='virtio'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <driver name='vhost' rx_queue_size='512'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <mtu size='1442'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='net0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <interface type='ethernet'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <mac address='fa:16:3e:3c:a2:3f'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target dev='tap5008167c-4c'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model type='virtio'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <driver name='vhost' rx_queue_size='512'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <mtu size='1442'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='net1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <serial type='pty'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source path='/dev/pts/1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <log file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/console.log' append='off'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target type='isa-serial' port='0'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <model name='isa-serial'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       </target>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='serial0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <console type='pty' tty='/dev/pts/1'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source path='/dev/pts/1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <log file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/console.log' append='off'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target type='serial' port='0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='serial0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </console>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <input type='tablet' bus='usb'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='input0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='usb' bus='0' port='1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </input>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <input type='mouse' bus='ps2'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='input1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </input>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <input type='keyboard' bus='ps2'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='input2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </input>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <listen type='address' address='::0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </graphics>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <audio id='1' type='none'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <video>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model type='virtio' heads='1' primary='yes'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='video0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </video>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <watchdog model='itco' action='reset'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='watchdog0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </watchdog>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <memballoon model='virtio'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <stats period='10'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='balloon0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <rng model='virtio'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <backend model='random'>/dev/urandom</backend>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='rng0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <label>system_u:system_r:svirt_t:s0:c831,c965</label>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c831,c965</imagelabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </seclabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <label>+107:+107</label>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <imagelabel>+107:+107</imagelabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </seclabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]: </domain>
Sep 30 21:21:59 compute-1 nova_compute[192795]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.039 2 INFO nova.virt.libvirt.driver [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully detached device tap5008167c-4c from instance 04fadc55-8de2-49b0-a4db-9cc05bd5d036 from the persistent domain config.
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.040 2 DEBUG nova.virt.libvirt.driver [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] (1/8): Attempting to detach device tap5008167c-4c with device alias net1 from instance 04fadc55-8de2-49b0-a4db-9cc05bd5d036 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.040 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] detach device xml: <interface type="ethernet">
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <mac address="fa:16:3e:3c:a2:3f"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <model type="virtio"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <mtu size="1442"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <target dev="tap5008167c-4c"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]: </interface>
Sep 30 21:21:59 compute-1 nova_compute[192795]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Sep 30 21:21:59 compute-1 kernel: tap5008167c-4c (unregistering): left promiscuous mode
Sep 30 21:21:59 compute-1 NetworkManager[51724]: <info>  [1759267319.1698] device (tap5008167c-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.171 2 DEBUG nova.virt.libvirt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Received event <DeviceRemovedEvent: 1759267319.1709132, 04fadc55-8de2-49b0-a4db-9cc05bd5d036 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.176 2 DEBUG nova.virt.libvirt.driver [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Start waiting for the detach event from libvirt for device tap5008167c-4c with device alias net1 for instance 04fadc55-8de2-49b0-a4db-9cc05bd5d036 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.177 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:21:59 compute-1 ovn_controller[94902]: 2025-09-30T21:21:59Z|00141|binding|INFO|Releasing lport 5008167c-4cca-4768-963e-ab0119180625 from this chassis (sb_readonly=0)
Sep 30 21:21:59 compute-1 ovn_controller[94902]: 2025-09-30T21:21:59Z|00142|binding|INFO|Setting lport 5008167c-4cca-4768-963e-ab0119180625 down in Southbound
Sep 30 21:21:59 compute-1 ovn_controller[94902]: 2025-09-30T21:21:59Z|00143|binding|INFO|Removing iface tap5008167c-4c ovn-installed in OVS
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.184 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface>not found in domain: <domain type='kvm' id='16'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <name>instance-00000020</name>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <uuid>04fadc55-8de2-49b0-a4db-9cc05bd5d036</uuid>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:name>tempest-tempest.common.compute-instance-984303008</nova:name>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:creationTime>2025-09-30 21:21:57</nova:creationTime>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:flavor name="m1.nano">
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:memory>128</nova:memory>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:disk>1</nova:disk>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:swap>0</nova:swap>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:flavor>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:owner>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:owner>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:ports>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:port uuid="a5a8cb9d-f903-4595-a2d2-d2bacf341918">
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </nova:port>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:port uuid="5008167c-4cca-4768-963e-ab0119180625">
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </nova:port>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:ports>
Sep 30 21:21:59 compute-1 nova_compute[192795]: </nova:instance>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <memory unit='KiB'>131072</memory>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <currentMemory unit='KiB'>131072</currentMemory>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <vcpu placement='static'>1</vcpu>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <resource>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <partition>/machine</partition>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </resource>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <sysinfo type='smbios'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <system>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='manufacturer'>RDO</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='product'>OpenStack Compute</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='serial'>04fadc55-8de2-49b0-a4db-9cc05bd5d036</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='uuid'>04fadc55-8de2-49b0-a4db-9cc05bd5d036</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <entry name='family'>Virtual Machine</entry>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </system>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <os>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <boot dev='hd'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <smbios mode='sysinfo'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </os>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <features>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <vmcoreinfo state='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </features>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <cpu mode='custom' match='exact' check='full'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <model fallback='forbid'>Nehalem</model>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <feature policy='require' name='x2apic'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <feature policy='require' name='hypervisor'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <feature policy='require' name='vme'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <clock offset='utc'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <timer name='pit' tickpolicy='delay'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <timer name='rtc' tickpolicy='catchup'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <timer name='hpet' present='no'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <on_poweroff>destroy</on_poweroff>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <on_reboot>restart</on_reboot>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <on_crash>destroy</on_crash>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <disk type='file' device='disk'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <driver name='qemu' type='qcow2' cache='none'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk' index='2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <backingStore type='file' index='3'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <format type='raw'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <source file='/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <backingStore/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       </backingStore>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target dev='vda' bus='virtio'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='virtio-disk0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <disk type='file' device='cdrom'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <driver name='qemu' type='raw' cache='none'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/disk.config' index='1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <backingStore/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target dev='sda' bus='sata'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <readonly/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='sata0-0-0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='0' model='pcie-root'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pcie.0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='1' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='1' port='0x10'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='2' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='2' port='0x11'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='3' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='3' port='0x12'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='4' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='4' port='0x13'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='5' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='5' port='0x14'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='6' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='6' port='0x15'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='7' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='7' port='0x16'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='8' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='8' port='0x17'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.8'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='9' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='9' port='0x18'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.9'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='10' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='10' port='0x19'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.10'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='11' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='11' port='0x1a'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.11'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='12' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='12' port='0x1b'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.12'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='13' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='13' port='0x1c'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.13'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='14' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='14' port='0x1d'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.14'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='15' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='15' port='0x1e'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.15'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='16' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='16' port='0x1f'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.16'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='17' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='17' port='0x20'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.17'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='18' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='18' port='0x21'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.18'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='19' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='19' port='0x22'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.19'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='20' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='20' port='0x23'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.20'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='21' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='21' port='0x24'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.21'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='22' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='22' port='0x25'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.22'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='23' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='23' port='0x26'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.23'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='24' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='24' port='0x27'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.24'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='25' model='pcie-root-port'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-root-port'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target chassis='25' port='0x28'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.25'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model name='pcie-pci-bridge'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='pci.26'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='usb' index='0' model='piix3-uhci'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='usb'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <controller type='sata' index='0'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='ide'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </controller>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <interface type='ethernet'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <mac address='fa:16:3e:db:7d:eb'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target dev='tapa5a8cb9d-f9'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model type='virtio'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <driver name='vhost' rx_queue_size='512'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <mtu size='1442'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='net0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <serial type='pty'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source path='/dev/pts/1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <log file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/console.log' append='off'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target type='isa-serial' port='0'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:         <model name='isa-serial'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       </target>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='serial0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <console type='pty' tty='/dev/pts/1'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <source path='/dev/pts/1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <log file='/var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036/console.log' append='off'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <target type='serial' port='0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='serial0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </console>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <input type='tablet' bus='usb'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='input0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='usb' bus='0' port='1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </input>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <input type='mouse' bus='ps2'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='input1'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </input>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <input type='keyboard' bus='ps2'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='input2'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </input>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <listen type='address' address='::0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </graphics>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <audio id='1' type='none'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <video>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <model type='virtio' heads='1' primary='yes'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='video0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </video>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <watchdog model='itco' action='reset'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='watchdog0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </watchdog>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <memballoon model='virtio'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <stats period='10'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='balloon0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <rng model='virtio'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <backend model='random'>/dev/urandom</backend>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <alias name='rng0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <label>system_u:system_r:svirt_t:s0:c831,c965</label>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c831,c965</imagelabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </seclabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <label>+107:+107</label>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <imagelabel>+107:+107</imagelabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </seclabel>
Sep 30 21:21:59 compute-1 nova_compute[192795]: </domain>
Sep 30 21:21:59 compute-1 nova_compute[192795]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.184 2 INFO nova.virt.libvirt.driver [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully detached device tap5008167c-4c from instance 04fadc55-8de2-49b0-a4db-9cc05bd5d036 from the live domain config.
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.185 2 DEBUG nova.virt.libvirt.vif [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-984303008',display_name='tempest-tempest.common.compute-instance-984303008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-984303008',id=32,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-zvzpoyu0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=04fadc55-8de2-49b0-a4db-9cc05bd5d036,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.185 2 DEBUG nova.network.os_vif_util [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.186 2 DEBUG nova.network.os_vif_util [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.186 2 DEBUG os_vif [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.188 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5008167c-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.215 2 INFO os_vif [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c')
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.216 2 DEBUG nova.virt.libvirt.guest [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:name>tempest-tempest.common.compute-instance-984303008</nova:name>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:creationTime>2025-09-30 21:21:59</nova:creationTime>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:flavor name="m1.nano">
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:memory>128</nova:memory>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:disk>1</nova:disk>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:swap>0</nova:swap>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:flavor>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:owner>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:owner>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   <nova:ports>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     <nova:port uuid="a5a8cb9d-f903-4595-a2d2-d2bacf341918">
Sep 30 21:21:59 compute-1 nova_compute[192795]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:21:59 compute-1 nova_compute[192795]:     </nova:port>
Sep 30 21:21:59 compute-1 nova_compute[192795]:   </nova:ports>
Sep 30 21:21:59 compute-1 nova_compute[192795]: </nova:instance>
Sep 30 21:21:59 compute-1 nova_compute[192795]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.217 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:a2:3f 10.100.0.11'], port_security=['fa:16:3e:3c:a2:3f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '04fadc55-8de2-49b0-a4db-9cc05bd5d036', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0583c73e-88bf-4029-98b4-7475adfa8c7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=5008167c-4cca-4768-963e-ab0119180625) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.219 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 5008167c-4cca-4768-963e-ab0119180625 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 unbound from our chassis
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.221 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.245 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ef86763f-5aa6-4389-aaa0-d21d57b6681a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.291 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb89944-869b-4f07-8584-256791241160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.295 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ce81af-aa0f-4120-acd5-fe1d60d3fe97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:59 compute-1 podman[225478]: 2025-09-30 21:21:59.327891203 +0000 UTC m=+0.150521890 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm)
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.328 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[28b53ef3-eae3-4a64-bbe2-4d8db328d39f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.360 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c06bcff8-1ef3-42df-858a-770bb74e3b9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398057, 'reachable_time': 32431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225503, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.379 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[22b80f08-8407-41d8-8f56-07893836120b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398070, 'tstamp': 398070}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225504, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398074, 'tstamp': 398074}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225504, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.381 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.420 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29d3fdc6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.420 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.420 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap29d3fdc6-d0, col_values=(('external_ids', {'iface-id': 'ad63e4cf-251e-40e7-aea0-9713eaa58a32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:21:59.421 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.659 2 DEBUG nova.network.neutron [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updated VIF entry in instance network info cache for port 5008167c-4cca-4768-963e-ab0119180625. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.660 2 DEBUG nova.network.neutron [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:59 compute-1 nova_compute[192795]: 2025-09-30 21:21:59.684 2 DEBUG oslo_concurrency.lockutils [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.188 2 DEBUG nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.189 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.189 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.190 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.190 2 DEBUG nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] No waiting events found dispatching network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.190 2 WARNING nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received unexpected event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.191 2 DEBUG nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-unplugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.191 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.192 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.192 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.192 2 DEBUG nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] No waiting events found dispatching network-vif-unplugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.193 2 WARNING nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received unexpected event network-vif-unplugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.193 2 DEBUG nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.194 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.194 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.194 2 DEBUG oslo_concurrency.lockutils [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.195 2 DEBUG nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] No waiting events found dispatching network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.195 2 WARNING nova.compute.manager [req-7f0b5802-fa4f-4008-b8b9-0f664b65993a req-469a1cd9-ec39-4c3d-83d6-3209ac9400b8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received unexpected event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.703 2 DEBUG oslo_concurrency.lockutils [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.704 2 DEBUG oslo_concurrency.lockutils [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:00 compute-1 nova_compute[192795]: 2025-09-30 21:22:00.704 2 DEBUG nova.network.neutron [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:01 compute-1 nova_compute[192795]: 2025-09-30 21:22:01.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:02 compute-1 nova_compute[192795]: 2025-09-30 21:22:02.033 2 INFO nova.network.neutron [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Port 5008167c-4cca-4768-963e-ab0119180625 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Sep 30 21:22:02 compute-1 nova_compute[192795]: 2025-09-30 21:22:02.034 2 DEBUG nova.network.neutron [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:02 compute-1 nova_compute[192795]: 2025-09-30 21:22:02.056 2 DEBUG oslo_concurrency.lockutils [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:02 compute-1 nova_compute[192795]: 2025-09-30 21:22:02.094 2 DEBUG oslo_concurrency.lockutils [None req-00850d66-e9a3-468a-982a-736fe428deef c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-04fadc55-8de2-49b0-a4db-9cc05bd5d036-5008167c-4cca-4768-963e-ab0119180625" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:02 compute-1 nova_compute[192795]: 2025-09-30 21:22:02.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:03 compute-1 nova_compute[192795]: 2025-09-30 21:22:03.050 2 DEBUG nova.compute.manager [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:03 compute-1 nova_compute[192795]: 2025-09-30 21:22:03.051 2 DEBUG nova.compute.manager [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing instance network info cache due to event network-changed-a5a8cb9d-f903-4595-a2d2-d2bacf341918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:22:03 compute-1 nova_compute[192795]: 2025-09-30 21:22:03.052 2 DEBUG oslo_concurrency.lockutils [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:03 compute-1 nova_compute[192795]: 2025-09-30 21:22:03.053 2 DEBUG oslo_concurrency.lockutils [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:03 compute-1 nova_compute[192795]: 2025-09-30 21:22:03.053 2 DEBUG nova.network.neutron [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Refreshing network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:22:04 compute-1 nova_compute[192795]: 2025-09-30 21:22:04.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:05 compute-1 nova_compute[192795]: 2025-09-30 21:22:05.681 2 DEBUG nova.virt.libvirt.driver [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:22:06 compute-1 nova_compute[192795]: 2025-09-30 21:22:06.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:07 compute-1 nova_compute[192795]: 2025-09-30 21:22:07.806 2 DEBUG nova.network.neutron [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updated VIF entry in instance network info cache for port a5a8cb9d-f903-4595-a2d2-d2bacf341918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:22:07 compute-1 nova_compute[192795]: 2025-09-30 21:22:07.807 2 DEBUG nova.network.neutron [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [{"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:07 compute-1 nova_compute[192795]: 2025-09-30 21:22:07.848 2 DEBUG oslo_concurrency.lockutils [req-7aed89d0-d210-45dd-865f-a48989cf3ee8 req-ca4d3acc-d1d4-4b16-9924-d085572792dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-04fadc55-8de2-49b0-a4db-9cc05bd5d036" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:07 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000023.scope: Deactivated successfully.
Sep 30 21:22:07 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000023.scope: Consumed 13.273s CPU time.
Sep 30 21:22:07 compute-1 systemd-machined[152783]: Machine qemu-18-instance-00000023 terminated.
Sep 30 21:22:07 compute-1 podman[225525]: 2025-09-30 21:22:07.938210352 +0000 UTC m=+0.068682478 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.701 2 INFO nova.virt.libvirt.driver [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance shutdown successfully after 13 seconds.
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.707 2 INFO nova.virt.libvirt.driver [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance destroyed successfully.
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.710 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.768 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.769 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.829 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.832 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Copying file /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk to 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:22:08 compute-1 nova_compute[192795]: 2025-09-30 21:22:08.832 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:09 compute-1 nova_compute[192795]: 2025-09-30 21:22:09.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:09 compute-1 nova_compute[192795]: 2025-09-30 21:22:09.808 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "scp -r /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk" returned: 0 in 0.975s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:09 compute-1 nova_compute[192795]: 2025-09-30 21:22:09.809 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Copying file /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:22:09 compute-1 nova_compute[192795]: 2025-09-30 21:22:09.810 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk.config 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:10 compute-1 nova_compute[192795]: 2025-09-30 21:22:10.104 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "scp -C -r /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk.config 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config" returned: 0 in 0.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:10 compute-1 nova_compute[192795]: 2025-09-30 21:22:10.106 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Copying file /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:22:10 compute-1 nova_compute[192795]: 2025-09-30 21:22:10.106 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk.info 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:10 compute-1 podman[225569]: 2025-09-30 21:22:10.24969432 +0000 UTC m=+0.072663156 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:22:10 compute-1 podman[225568]: 2025-09-30 21:22:10.27267475 +0000 UTC m=+0.094318780 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Sep 30 21:22:10 compute-1 nova_compute[192795]: 2025-09-30 21:22:10.390 2 DEBUG oslo_concurrency.processutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "scp -C -r /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_resize/disk.info 192.168.122.102:/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info" returned: 0 in 0.284s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:10 compute-1 nova_compute[192795]: 2025-09-30 21:22:10.500 2 DEBUG oslo_concurrency.lockutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "857a84cb-03ec-4e88-a3e8-da80fda2c446-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:10 compute-1 nova_compute[192795]: 2025-09-30 21:22:10.501 2 DEBUG oslo_concurrency.lockutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:10 compute-1 nova_compute[192795]: 2025-09-30 21:22:10.501 2 DEBUG oslo_concurrency.lockutils [None req-0011f9b5-0829-4cbe-b706-f94fc3ea9b23 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:11 compute-1 nova_compute[192795]: 2025-09-30 21:22:11.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:13.646 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:13 compute-1 nova_compute[192795]: 2025-09-30 21:22:13.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:13.648 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:22:14 compute-1 nova_compute[192795]: 2025-09-30 21:22:14.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:15 compute-1 nova_compute[192795]: 2025-09-30 21:22:15.506 2 INFO nova.compute.manager [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Swapping old allocation on dict_keys(['e551d5b4-e9f6-409e-b2a1-508a20c11333']) held by migration cf2ef652-a24f-48fe-b2b3-8961d1356a44 for instance
Sep 30 21:22:15 compute-1 nova_compute[192795]: 2025-09-30 21:22:15.550 2 DEBUG nova.scheduler.client.report [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Overwriting current allocation {'allocations': {'43534f47-0785-4193-83a7-711a984caccc': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 16}}, 'project_id': '7c15359849554c2382315de9f52125af', 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'consumer_generation': 1} on consumer 857a84cb-03ec-4e88-a3e8-da80fda2c446 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Sep 30 21:22:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:15.653 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:15 compute-1 nova_compute[192795]: 2025-09-30 21:22:15.791 2 DEBUG oslo_concurrency.lockutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:15 compute-1 nova_compute[192795]: 2025-09-30 21:22:15.792 2 DEBUG oslo_concurrency.lockutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:15 compute-1 nova_compute[192795]: 2025-09-30 21:22:15.792 2 DEBUG nova.network.neutron [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:15 compute-1 nova_compute[192795]: 2025-09-30 21:22:15.969 2 DEBUG nova.network.neutron [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.444 2 DEBUG nova.network.neutron [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.460 2 DEBUG oslo_concurrency.lockutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.460 2 DEBUG nova.virt.libvirt.driver [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.472 2 DEBUG nova.virt.libvirt.driver [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.478 2 WARNING nova.virt.libvirt.driver [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.487 2 DEBUG nova.virt.libvirt.host [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.488 2 DEBUG nova.virt.libvirt.host [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.494 2 DEBUG nova.virt.libvirt.host [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.495 2 DEBUG nova.virt.libvirt.host [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.496 2 DEBUG nova.virt.libvirt.driver [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.496 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:21:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3263a425-4ab0-4c30-af21-557885cc2a70',id=28,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-344349667',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.497 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.497 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.497 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.497 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.498 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.498 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.498 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.498 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.498 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.499 2 DEBUG nova.virt.hardware [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.499 2 DEBUG nova.objects.instance [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'vcpu_model' on Instance uuid 857a84cb-03ec-4e88-a3e8-da80fda2c446 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.520 2 DEBUG oslo_concurrency.processutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.614 2 DEBUG oslo_concurrency.processutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.615 2 DEBUG oslo_concurrency.lockutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.616 2 DEBUG oslo_concurrency.lockutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.617 2 DEBUG oslo_concurrency.lockutils [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.619 2 DEBUG nova.virt.libvirt.driver [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <uuid>857a84cb-03ec-4e88-a3e8-da80fda2c446</uuid>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <name>instance-00000023</name>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <nova:name>tempest-MigrationsAdminTest-server-70232550</nova:name>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:22:16</nova:creationTime>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <nova:flavor name="tempest-test_resize_flavor_-344349667">
Sep 30 21:22:16 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:22:16 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:22:16 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:22:16 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:16 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:22:16 compute-1 nova_compute[192795]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:22:16 compute-1 nova_compute[192795]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <system>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <entry name="serial">857a84cb-03ec-4e88-a3e8-da80fda2c446</entry>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <entry name="uuid">857a84cb-03ec-4e88-a3e8-da80fda2c446</entry>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </system>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <os>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   </os>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <features>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   </features>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.config"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/console.log" append="off"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <video>
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </video>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:22:16 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:22:16 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:22:16 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:22:16 compute-1 nova_compute[192795]: </domain>
Sep 30 21:22:16 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:22:16 compute-1 systemd-machined[152783]: New machine qemu-19-instance-00000023.
Sep 30 21:22:16 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-00000023.
Sep 30 21:22:16 compute-1 nova_compute[192795]: 2025-09-30 21:22:16.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.664 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 857a84cb-03ec-4e88-a3e8-da80fda2c446 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.665 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267337.6641214, 857a84cb-03ec-4e88-a3e8-da80fda2c446 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.665 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] VM Resumed (Lifecycle Event)
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.667 2 DEBUG nova.compute.manager [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.675 2 INFO nova.virt.libvirt.driver [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance running successfully.
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.675 2 DEBUG nova.virt.libvirt.driver [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.698 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.700 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.738 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.738 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267337.6649394, 857a84cb-03ec-4e88-a3e8-da80fda2c446 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.738 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] VM Started (Lifecycle Event)
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.761 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.765 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.787 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:22:17 compute-1 nova_compute[192795]: 2025-09-30 21:22:17.794 2 INFO nova.compute.manager [None req-a0ab2da9-6007-4682-b805-4e5a58c921a0 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Updating instance to original state: 'active'
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:19 compute-1 podman[225642]: 2025-09-30 21:22:19.260413969 +0000 UTC m=+0.087551537 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.500 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.501 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.501 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.502 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.502 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.518 2 INFO nova.compute.manager [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Terminating instance
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.532 2 DEBUG nova.compute.manager [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:22:19 compute-1 kernel: tapa5a8cb9d-f9 (unregistering): left promiscuous mode
Sep 30 21:22:19 compute-1 NetworkManager[51724]: <info>  [1759267339.5555] device (tapa5a8cb9d-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:22:19 compute-1 ovn_controller[94902]: 2025-09-30T21:22:19Z|00144|binding|INFO|Releasing lport a5a8cb9d-f903-4595-a2d2-d2bacf341918 from this chassis (sb_readonly=0)
Sep 30 21:22:19 compute-1 ovn_controller[94902]: 2025-09-30T21:22:19Z|00145|binding|INFO|Setting lport a5a8cb9d-f903-4595-a2d2-d2bacf341918 down in Southbound
Sep 30 21:22:19 compute-1 ovn_controller[94902]: 2025-09-30T21:22:19Z|00146|binding|INFO|Removing iface tapa5a8cb9d-f9 ovn-installed in OVS
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:19 compute-1 ovn_controller[94902]: 2025-09-30T21:22:19Z|00147|binding|INFO|Releasing lport ad63e4cf-251e-40e7-aea0-9713eaa58a32 from this chassis (sb_readonly=0)
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.578 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:7d:eb 10.100.0.13'], port_security=['fa:16:3e:db:7d:eb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '04fadc55-8de2-49b0-a4db-9cc05bd5d036', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69ef8d59-f66f-4bdf-9235-591ecebd585e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=a5a8cb9d-f903-4595-a2d2-d2bacf341918) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.579 103861 INFO neutron.agent.ovn.metadata.agent [-] Port a5a8cb9d-f903-4595-a2d2-d2bacf341918 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 unbound from our chassis
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.580 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.581 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bd56368c-0f72-4802-a9a3-87bbd6d3435a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.582 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 namespace which is not needed anymore
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:19 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [NOTICE]   (225145) : haproxy version is 2.8.14-c23fe91
Sep 30 21:22:19 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [NOTICE]   (225145) : path to executable is /usr/sbin/haproxy
Sep 30 21:22:19 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [WARNING]  (225145) : Exiting Master process...
Sep 30 21:22:19 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [WARNING]  (225145) : Exiting Master process...
Sep 30 21:22:19 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [ALERT]    (225145) : Current worker (225147) exited with code 143 (Terminated)
Sep 30 21:22:19 compute-1 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[225141]: [WARNING]  (225145) : All workers exited. Exiting... (0)
Sep 30 21:22:19 compute-1 systemd[1]: libpod-1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37.scope: Deactivated successfully.
Sep 30 21:22:19 compute-1 podman[225687]: 2025-09-30 21:22:19.763177881 +0000 UTC m=+0.064598088 container died 1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:22:19 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000020.scope: Deactivated successfully.
Sep 30 21:22:19 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000020.scope: Consumed 14.843s CPU time.
Sep 30 21:22:19 compute-1 systemd-machined[152783]: Machine qemu-16-instance-00000020 terminated.
Sep 30 21:22:19 compute-1 ovn_controller[94902]: 2025-09-30T21:22:19Z|00148|binding|INFO|Releasing lport ad63e4cf-251e-40e7-aea0-9713eaa58a32 from this chassis (sb_readonly=0)
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37-userdata-shm.mount: Deactivated successfully.
Sep 30 21:22:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-d9577bea85ffe8826fa980364590004bff0c77d86803c9f4de8789086ec5d636-merged.mount: Deactivated successfully.
Sep 30 21:22:19 compute-1 podman[225687]: 2025-09-30 21:22:19.851662943 +0000 UTC m=+0.153083140 container cleanup 1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:22:19 compute-1 systemd[1]: libpod-conmon-1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37.scope: Deactivated successfully.
Sep 30 21:22:19 compute-1 podman[225717]: 2025-09-30 21:22:19.932954201 +0000 UTC m=+0.051710879 container remove 1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.939 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e7fd56f8-7f85-435b-ba26-7a0bb0a8189c]: (4, ('Tue Sep 30 09:22:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 (1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37)\n1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37\nTue Sep 30 09:22:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 (1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37)\n1eaa517a0fb3c5dbf6836df75ed7c10dbb331de725b2420643fafadf915b4d37\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.943 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[88531d33-bbeb-46ee-8138-8da0329cda1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.944 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:19 compute-1 kernel: tap29d3fdc6-d0: left promiscuous mode
Sep 30 21:22:19 compute-1 nova_compute[192795]: 2025-09-30 21:22:19.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:19 compute-1 NetworkManager[51724]: <info>  [1759267339.9650] manager: (tapa5a8cb9d-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Sep 30 21:22:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.966 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[55640dff-3b6f-4018-91ea-f16ca9f530a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.997 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cab08426-1c5a-4422-9865-85d9ee8f1ad4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:19.999 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cac075f6-303b-49b9-b548-2bf86e477b59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.015 2 INFO nova.virt.libvirt.driver [-] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Instance destroyed successfully.
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.015 2 DEBUG nova.objects.instance [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'resources' on Instance uuid 04fadc55-8de2-49b0-a4db-9cc05bd5d036 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:20.027 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca02301-a590-4171-892f-ac19351ff669]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398049, 'reachable_time': 18990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225748, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:20 compute-1 systemd[1]: run-netns-ovnmeta\x2d29d3fdc6\x2dd8e1\x2d4032\x2d8f0c\x2de91da2912153.mount: Deactivated successfully.
Sep 30 21:22:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:20.031 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:22:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:20.031 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[0eec3c29-4fde-40c2-8a5b-9165b02b88b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.042 2 DEBUG nova.virt.libvirt.vif [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-984303008',display_name='tempest-tempest.common.compute-instance-984303008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-984303008',id=32,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-zvzpoyu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=04fadc55-8de2-49b0-a4db-9cc05bd5d036,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.042 2 DEBUG nova.network.os_vif_util [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "address": "fa:16:3e:db:7d:eb", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5a8cb9d-f9", "ovs_interfaceid": "a5a8cb9d-f903-4595-a2d2-d2bacf341918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.043 2 DEBUG nova.network.os_vif_util [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:7d:eb,bridge_name='br-int',has_traffic_filtering=True,id=a5a8cb9d-f903-4595-a2d2-d2bacf341918,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5a8cb9d-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.043 2 DEBUG os_vif [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:7d:eb,bridge_name='br-int',has_traffic_filtering=True,id=a5a8cb9d-f903-4595-a2d2-d2bacf341918,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5a8cb9d-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.045 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5a8cb9d-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.052 2 INFO os_vif [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:7d:eb,bridge_name='br-int',has_traffic_filtering=True,id=a5a8cb9d-f903-4595-a2d2-d2bacf341918,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5a8cb9d-f9')
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.053 2 INFO nova.virt.libvirt.driver [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Deleting instance files /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036_del
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.054 2 INFO nova.virt.libvirt.driver [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Deletion of /var/lib/nova/instances/04fadc55-8de2-49b0-a4db-9cc05bd5d036_del complete
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.121 2 INFO nova.compute.manager [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Took 0.59 seconds to destroy the instance on the hypervisor.
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.122 2 DEBUG oslo.service.loopingcall [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.122 2 DEBUG nova.compute.manager [-] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.122 2 DEBUG nova.network.neutron [-] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.647 2 DEBUG nova.network.neutron [-] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.670 2 INFO nova.compute.manager [-] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Took 0.55 seconds to deallocate network for instance.
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.746 2 DEBUG nova.compute.manager [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-unplugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.747 2 DEBUG oslo_concurrency.lockutils [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.747 2 DEBUG oslo_concurrency.lockutils [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.747 2 DEBUG oslo_concurrency.lockutils [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.748 2 DEBUG nova.compute.manager [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] No waiting events found dispatching network-vif-unplugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.748 2 DEBUG nova.compute.manager [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-unplugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.749 2 DEBUG nova.compute.manager [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.749 2 DEBUG oslo_concurrency.lockutils [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.749 2 DEBUG oslo_concurrency.lockutils [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.750 2 DEBUG oslo_concurrency.lockutils [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.750 2 DEBUG nova.compute.manager [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] No waiting events found dispatching network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.751 2 WARNING nova.compute.manager [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received unexpected event network-vif-plugged-a5a8cb9d-f903-4595-a2d2-d2bacf341918 for instance with vm_state active and task_state deleting.
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.751 2 DEBUG nova.compute.manager [req-6f982c2e-14ce-403e-a144-1933097bee27 req-210906b5-2be0-4382-a858-3c0366019d82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Received event network-vif-deleted-a5a8cb9d-f903-4595-a2d2-d2bacf341918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.921 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:20 compute-1 nova_compute[192795]: 2025-09-30 21:22:20.922 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-1 nova_compute[192795]: 2025-09-30 21:22:21.075 2 DEBUG nova.compute.provider_tree [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:21 compute-1 nova_compute[192795]: 2025-09-30 21:22:21.092 2 DEBUG nova.scheduler.client.report [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:21 compute-1 nova_compute[192795]: 2025-09-30 21:22:21.113 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:21 compute-1 nova_compute[192795]: 2025-09-30 21:22:21.156 2 INFO nova.scheduler.client.report [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Deleted allocations for instance 04fadc55-8de2-49b0-a4db-9cc05bd5d036
Sep 30 21:22:21 compute-1 nova_compute[192795]: 2025-09-30 21:22:21.242 2 DEBUG oslo_concurrency.lockutils [None req-05b3e9eb-d56e-4048-91fc-ff69f9cac2e1 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "04fadc55-8de2-49b0-a4db-9cc05bd5d036" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:21 compute-1 nova_compute[192795]: 2025-09-30 21:22:21.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:25 compute-1 nova_compute[192795]: 2025-09-30 21:22:25.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-1 podman[225753]: 2025-09-30 21:22:26.237136845 +0000 UTC m=+0.071475393 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:22:26 compute-1 podman[225751]: 2025-09-30 21:22:26.260894617 +0000 UTC m=+0.103898559 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Sep 30 21:22:26 compute-1 podman[225752]: 2025-09-30 21:22:26.276994333 +0000 UTC m=+0.111340242 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 21:22:26 compute-1 nova_compute[192795]: 2025-09-30 21:22:26.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.423 2 DEBUG nova.compute.manager [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.545 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.545 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.564 2 DEBUG nova.objects.instance [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.577 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.578 2 INFO nova.compute.claims [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.579 2 DEBUG nova.objects.instance [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lazy-loading 'resources' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.590 2 DEBUG nova.objects.instance [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.601 2 DEBUG nova.objects.instance [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.636 2 INFO nova.compute.resource_tracker [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating resource usage from migration 5013b103-02ec-49ec-a8fd-5104ae6c9ef4
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.636 2 DEBUG nova.compute.resource_tracker [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Starting to track incoming migration 5013b103-02ec-49ec-a8fd-5104ae6c9ef4 with flavor afe5c12d-500a-499b-9438-9e9c37698acc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.750 2 DEBUG nova.compute.provider_tree [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.768 2 DEBUG nova.scheduler.client.report [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.800 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-1 nova_compute[192795]: 2025-09-30 21:22:28.801 2 INFO nova.compute.manager [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Migrating
Sep 30 21:22:30 compute-1 nova_compute[192795]: 2025-09-30 21:22:30.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:30 compute-1 podman[225830]: 2025-09-30 21:22:30.273704927 +0000 UTC m=+0.104261339 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute)
Sep 30 21:22:30 compute-1 sshd-session[225850]: Accepted publickey for nova from 192.168.122.100 port 56122 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:22:30 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:22:30 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:22:30 compute-1 systemd-logind[793]: New session 35 of user nova.
Sep 30 21:22:30 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:22:30 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:22:30 compute-1 systemd[225854]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:22:30 compute-1 systemd[225854]: Queued start job for default target Main User Target.
Sep 30 21:22:30 compute-1 systemd[225854]: Created slice User Application Slice.
Sep 30 21:22:30 compute-1 systemd[225854]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:22:30 compute-1 systemd[225854]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:22:30 compute-1 systemd[225854]: Reached target Paths.
Sep 30 21:22:30 compute-1 systemd[225854]: Reached target Timers.
Sep 30 21:22:30 compute-1 systemd[225854]: Starting D-Bus User Message Bus Socket...
Sep 30 21:22:30 compute-1 systemd[225854]: Starting Create User's Volatile Files and Directories...
Sep 30 21:22:30 compute-1 systemd[225854]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:22:30 compute-1 systemd[225854]: Reached target Sockets.
Sep 30 21:22:30 compute-1 systemd[225854]: Finished Create User's Volatile Files and Directories.
Sep 30 21:22:30 compute-1 systemd[225854]: Reached target Basic System.
Sep 30 21:22:30 compute-1 systemd[225854]: Reached target Main User Target.
Sep 30 21:22:30 compute-1 systemd[225854]: Startup finished in 154ms.
Sep 30 21:22:30 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:22:30 compute-1 systemd[1]: Started Session 35 of User nova.
Sep 30 21:22:30 compute-1 sshd-session[225850]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:22:30 compute-1 sshd-session[225869]: Received disconnect from 192.168.122.100 port 56122:11: disconnected by user
Sep 30 21:22:30 compute-1 sshd-session[225869]: Disconnected from user nova 192.168.122.100 port 56122
Sep 30 21:22:30 compute-1 sshd-session[225850]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:22:30 compute-1 systemd[1]: session-35.scope: Deactivated successfully.
Sep 30 21:22:30 compute-1 systemd-logind[793]: Session 35 logged out. Waiting for processes to exit.
Sep 30 21:22:30 compute-1 systemd-logind[793]: Removed session 35.
Sep 30 21:22:31 compute-1 sshd-session[225871]: Accepted publickey for nova from 192.168.122.100 port 56124 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:22:31 compute-1 systemd-logind[793]: New session 37 of user nova.
Sep 30 21:22:31 compute-1 systemd[1]: Started Session 37 of User nova.
Sep 30 21:22:31 compute-1 sshd-session[225871]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:22:31 compute-1 sshd-session[225874]: Received disconnect from 192.168.122.100 port 56124:11: disconnected by user
Sep 30 21:22:31 compute-1 sshd-session[225874]: Disconnected from user nova 192.168.122.100 port 56124
Sep 30 21:22:31 compute-1 sshd-session[225871]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:22:31 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Sep 30 21:22:31 compute-1 systemd-logind[793]: Session 37 logged out. Waiting for processes to exit.
Sep 30 21:22:31 compute-1 systemd-logind[793]: Removed session 37.
Sep 30 21:22:31 compute-1 nova_compute[192795]: 2025-09-30 21:22:31.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:33 compute-1 nova_compute[192795]: 2025-09-30 21:22:33.767 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:35 compute-1 nova_compute[192795]: 2025-09-30 21:22:35.010 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267340.0093303, 04fadc55-8de2-49b0-a4db-9cc05bd5d036 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:35 compute-1 nova_compute[192795]: 2025-09-30 21:22:35.010 2 INFO nova.compute.manager [-] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] VM Stopped (Lifecycle Event)
Sep 30 21:22:35 compute-1 nova_compute[192795]: 2025-09-30 21:22:35.030 2 DEBUG nova.compute.manager [None req-4127d6ed-e920-4510-b85a-813a5e032ef7 - - - - - -] [instance: 04fadc55-8de2-49b0-a4db-9cc05bd5d036] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:35 compute-1 nova_compute[192795]: 2025-09-30 21:22:35.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:36 compute-1 nova_compute[192795]: 2025-09-30 21:22:36.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:38 compute-1 podman[225876]: 2025-09-30 21:22:38.228258513 +0000 UTC m=+0.065798226 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:22:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:38.683 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:38.684 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:38.684 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:38 compute-1 nova_compute[192795]: 2025-09-30 21:22:38.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:39 compute-1 nova_compute[192795]: 2025-09-30 21:22:39.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:39 compute-1 nova_compute[192795]: 2025-09-30 21:22:39.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.722 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.797 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:40 compute-1 podman[225899]: 2025-09-30 21:22:40.826209812 +0000 UTC m=+0.052551326 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:22:40 compute-1 podman[225898]: 2025-09-30 21:22:40.83098122 +0000 UTC m=+0.062208205 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.863 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.864 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.921 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.927 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.983 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:40 compute-1 nova_compute[192795]: 2025-09-30 21:22:40.985 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.049 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.192 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.194 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5356MB free_disk=73.40685272216797GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.194 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.195 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.335 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Migration for instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.363 2 INFO nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating resource usage from migration 5013b103-02ec-49ec-a8fd-5104ae6c9ef4
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.364 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Starting to track incoming migration 5013b103-02ec-49ec-a8fd-5104ae6c9ef4 with flavor afe5c12d-500a-499b-9438-9e9c37698acc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:22:41 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:22:41 compute-1 systemd[225854]: Activating special unit Exit the Session...
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped target Main User Target.
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped target Basic System.
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped target Paths.
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped target Sockets.
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped target Timers.
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:22:41 compute-1 systemd[225854]: Closed D-Bus User Message Bus Socket.
Sep 30 21:22:41 compute-1 systemd[225854]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:22:41 compute-1 systemd[225854]: Removed slice User Application Slice.
Sep 30 21:22:41 compute-1 systemd[225854]: Reached target Shutdown.
Sep 30 21:22:41 compute-1 systemd[225854]: Finished Exit the Session.
Sep 30 21:22:41 compute-1 systemd[225854]: Reached target Exit the Session.
Sep 30 21:22:41 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:22:41 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:22:41 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.412 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 456a6c22-b801-4d95-aa63-be64cd8e4b53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.412 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 857a84cb-03ec-4e88-a3e8-da80fda2c446 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:22:41 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:22:41 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:22:41 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:22:41 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.435 2 WARNING nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.435 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.435 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.548 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.566 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.592 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.593 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:41 compute-1 nova_compute[192795]: 2025-09-30 21:22:41.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.366 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}14e5db1849b70689dbb8f5c07f9763ac1a74da54d89edda9de905bf920d22a92" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.468 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Tue, 30 Sep 2025 21:22:44 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a5d165be-0e0c-42f4-83e2-9eb9b494041b x-openstack-request-id: req-a5d165be-0e0c-42f4-83e2-9eb9b494041b _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.468 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "afe5c12d-500a-499b-9438-9e9c37698acc", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}]}, {"id": "c9779bca-1eb6-4567-a36c-b452abeafc70", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.468 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-a5d165be-0e0c-42f4-83e2-9eb9b494041b request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.470 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'name': 'tempest-MigrationsAdminTest-server-70232550', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000023', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7c15359849554c2382315de9f52125af', 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'hostId': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.472 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}14e5db1849b70689dbb8f5c07f9763ac1a74da54d89edda9de905bf920d22a92" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.554 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Tue, 30 Sep 2025 21:22:44 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d3133e08-9e01-4687-a4c0-f2c1e081291a x-openstack-request-id: req-d3133e08-9e01-4687-a4c0-f2c1e081291a _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.555 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "afe5c12d-500a-499b-9438-9e9c37698acc", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}]}, {"id": "c9779bca-1eb6-4567-a36c-b452abeafc70", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.555 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-d3133e08-9e01-4687-a4c0-f2c1e081291a request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.556 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}14e5db1849b70689dbb8f5c07f9763ac1a74da54d89edda9de905bf920d22a92" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Sep 30 21:22:44 compute-1 nova_compute[192795]: 2025-09-30 21:22:44.592 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:44 compute-1 sshd-session[225955]: Accepted publickey for nova from 192.168.122.100 port 47518 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:22:44 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.634 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Tue, 30 Sep 2025 21:22:44 GMT Keep-Alive: timeout=5, max=98 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-8b3c3939-2580-4a9d-a17a-9549fb0de65e x-openstack-request-id: req-8b3c3939-2580-4a9d-a17a-9549fb0de65e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.634 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "c9779bca-1eb6-4567-a36c-b452abeafc70", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.634 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70 used request id req-8b3c3939-2580-4a9d-a17a-9549fb0de65e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.635 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'name': 'tempest-MigrationsAdminTest-server-936283323', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7c15359849554c2382315de9f52125af', 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'hostId': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.635 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.638 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.639 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:22:44 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.654 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/memory.usage volume: 40.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 systemd-logind[793]: New session 38 of user nova.
Sep 30 21:22:44 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.675 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/memory.usage volume: 43.03515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c37085c-09d9-46ba-8c65-a2ce421705ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.60546875, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'timestamp': '2025-09-30T21:22:44.639147', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9adb5416-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.386691789, 'message_signature': 'a60aa09884a3655914204e66aa625a323ee075bad2cf3b2995cd03bd10eae8fd'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.03515625, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': 
None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'timestamp': '2025-09-30T21:22:44.639147', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9ade931a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.408403113, 'message_signature': '058fb6deae6efec24870c8c3770ed45c81173677815910baf7270e1d4f0130b8'}]}, 'timestamp': '2025-09-30 21:22:44.676271', '_unique_id': '88ad4665e44d46979a2a0408132f7dff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.681 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.683 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.683 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:22:44 compute-1 nova_compute[192795]: 2025-09-30 21:22:44.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:44 compute-1 nova_compute[192795]: 2025-09-30 21:22:44.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:22:44 compute-1 systemd[225959]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.704 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.write.latency volume: 39296110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.704 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 nova_compute[192795]: 2025-09-30 21:22:44.715 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.723 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.write.latency volume: 51050185 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.723 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14eaa876-3ba1-42b3-b07e-beeac60f2a77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39296110, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.683941', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9ae2e974-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': 'd91b2ace924ff43839b478401461eb411909991118ef23c83ea37922db1c52a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': 
'7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.683941', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9ae2f504-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': '96fd12be805e53195095fc4331c0ca8ef9e75b001146f2a174ad5b48a3b76479'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51050185, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.683941', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 
'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9ae5dcf6-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '811dd65e4135918ec2d911e98a426a9c74bde60020c0b74479d164ff9cc78350'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.683941', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9ae5e7f0-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '37d91ab3214244ff3ef57e73e720efc253171eca81c50cec7f9c2c3a50bd4a31'}]}, 'timestamp': '2025-09-30 21:22:44.724233', '_unique_id': '8204c4edf581469ab44fc86d6c804383'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.725 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.726 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.726 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.726 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>]
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.727 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.727 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.727 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.write.bytes volume: 249856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.727 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.728 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.write.bytes volume: 274432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.728 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78aca4ad-3bdc-4c2b-956a-153b00835515', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 249856, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.727664', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9ae6785a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': 'a9ffde8afacd51826846ac03834c9bdf77b29515df64b7de44c3f318ba5d77e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': 
'7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.727664', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9ae680f2-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': '5eb87b789603741f5201687bb859ca74020136c35e140cf81537cf5ece3cda1e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274432, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.727664', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 
1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9ae68944-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': 'cfc9accd88fc0661c02c9d9044a7535816be50d2ff38aa1af4e7abaa59014098'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.727664', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9ae6925e-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '18bdeaabb20e87bb556c4eb5ccd88d6bf8b7b51b44b8aea3a16a6e4c06921f45'}]}, 'timestamp': '2025-09-30 21:22:44.728678', '_unique_id': 'ede7160d4649409590367a726256d90d'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.729 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.730 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.730 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.730 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>]
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.730 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.744 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.745 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.754 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.755 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dd371ba-557b-4f58-9d99-952140fe115d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.730561', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9ae919f2-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.463490497, 'message_signature': '23bce187dd3135647238873662c220f70846ff6f037083c4f926e2083940aba9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': 
'7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.730561', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9ae929a6-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.463490497, 'message_signature': '9d6af612a6af1b70e3cc6e387a905812315e58f57e344c890b3251018ffa292f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.730561', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aeaa470-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.478342706, 'message_signature': '4582986b78a8423a31f8b329569ca47ad4befae88f3aa1717d0ce42728188dfe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.730561', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aeaaf88-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.478342706, 'message_signature': '2802fc5e0285c02a11d7e67de392be7886a407cbe34a2689a6d9e682ea248306'}]}, 'timestamp': '2025-09-30 21:22:44.755641', '_unique_id': 'f003abc63bb24b438489d26f1023d90d'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.756 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.757 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.read.requests volume: 1206 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.757 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.758 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.read.requests volume: 1206 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.758 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37471260-f41d-4b85-81b5-e1f5c17512c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1206, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.757617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aeb0bae-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': 'd941750ca71b23c8c8f7d1117cf54084d52253d963efffc5f1bf30ea2814a0ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 
'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.757617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aeb13ce-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': 'd98f002479439c1122a1b9f355c47e728f7a1aaafd25a94d6f1ba02fedf0cc43'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1206, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.757617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 
'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aeb1b44-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '213fb769f265d8a44bdc3c49bcd35f19ff8fcee11affc22b36fd46069260240f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.757617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aeb235a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '890cfa4b0256de8fb0f846f47ba8593c38e82bc30900cab665b34667dc03c7d1'}]}, 'timestamp': '2025-09-30 21:22:44.758500', '_unique_id': 
'0c3b683ad1324d3e8e27757b551d07db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.759 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.760 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.760 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.760 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13880eb1-731d-429b-a382-d038f56a52c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.759842', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aeb60ae-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.463490497, 'message_signature': '3b037a5c86195630f33b84323bd16a1930b26e2e00e1368503a9e422ed7728c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': 
'7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.759842', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aeb6b8a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.463490497, 'message_signature': '2d8d69294702b35ea58dbbe9e9af1074c44fd04cd20d355c7c7275776aec09cc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.759842', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aeb756c-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.478342706, 'message_signature': '713ed2d39b56c2d53550d01e337840eb7f1cde6e4d4a3733c4ca8627a8215b45'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.759842', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aeb81b0-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.478342706, 'message_signature': '35020713ec56cbba3491acd91c3598bc4392f7cdf28d126c5891b41ae4867de4'}]}, 'timestamp': '2025-09-30 21:22:44.760959', '_unique_id': '05d6b27ff20f4e3abf70c18dfa1e6f22'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.761 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.762 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.762 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.763 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.allocation volume: 27914240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.763 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a09e966-691f-43e6-9b99-75564c1d0f6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.762626', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aebd0de-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.463490497, 'message_signature': 'eef7c1d379b5c26fa9cd782b889c226ff2e759ad677fd091db724ad878a99071'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': 
'7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.762626', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aebda20-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.463490497, 'message_signature': 'd836bcd09569b4c992cef441e936936509be14a5ba0019a2eaae1b30d520748d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 27914240, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.762626', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aebe51a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.478342706, 'message_signature': '9b5a204dcd83cc0250530c5c54c71fd07811880ce5f572acdc80201e0ee9faad'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.762626', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aebef6a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.478342706, 'message_signature': 'c0a39a118ddd226f3fd5ba0e3fed2327f576896204c18a6149aac8b7f22457b1'}]}, 'timestamp': '2025-09-30 21:22:44.763820', '_unique_id': '4db8b9ab69d544188f09cb9c4ced3bee'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.765 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.read.bytes volume: 32020480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.765 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.765 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.read.bytes volume: 32020480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.765 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3064cb78-76fb-45eb-b6b0-6657e877be3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32020480, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.765217', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aec32c2-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': '210f52093ca914b10b9f8b2a7a75d7254f5266b5608c203654454a764832a83e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': 
'7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.765217', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aec3b46-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': 'f2c28266a7e1a27a0d399cd0ae7ee8fba607307ed212c4662358dc3c5a9e915e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32020480, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.765217', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 
1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aec4456-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': 'adec2b75327614c0e2c76c050a764c44dab1ce39ed2a96e87f51a7c336946c30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.765217', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aec4e4c-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '8002ca69b1624489309c140cdd270a06390385760f5ae15ede512b942200c9ae'}]}, 'timestamp': '2025-09-30 21:22:44.766170', '_unique_id': '68e69063b3d4441b8427899623cd24ac'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.766 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.767 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.767 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/cpu volume: 11360000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.767 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/cpu volume: 11220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '347300ad-4d70-4fde-a75e-ee9b6df0c0e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11360000000, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'timestamp': '2025-09-30T21:22:44.767569', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9aec8de4-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.386691789, 'message_signature': '20337a15cfc013f0bc886ade2cb4c00acf3c24f59ed8787b7e3b3e02b2f7b665'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11220000000, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 
'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'timestamp': '2025-09-30T21:22:44.767569', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9aec991a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.408403113, 'message_signature': '71461c3c60722efc286e5806713fe53cc3b1c97b6c62664fbd5eb3217da4917f'}]}, 'timestamp': '2025-09-30 21:22:44.768093', '_unique_id': '04e8d29c2d97487bbbef20e342b299eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.768 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.769 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.769 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.769 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>]
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.769 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.769 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-70232550>, <NovaLikeServer: tempest-MigrationsAdminTest-server-936283323>]
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.770 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.read.latency volume: 684321411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.770 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.read.latency volume: 44839755 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.770 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.read.latency volume: 701875529 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.770 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.read.latency volume: 50740420 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '420b702f-0cf4-4a22-abfa-767c6c52c3dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 684321411, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.770070', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aecefd2-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': '6647816925d8f6577e39a59f814509cecfd9418774147778b89873c2da2f7db2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44839755, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 
'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.770070', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aecf8ec-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': 'ffd712bc2958ae1cbbd65d3d7f37cd0836ce3e8e49389ca30bd4f9ea7a29bc2f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 701875529, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.770070', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 
'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aed0030-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '77ab0b2d3580152fbb5efd7a434d9964c2be47d5b7923ae46aee422879f3b781'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50740420, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.770070', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aed0756-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': 'e2c4fa84a59d328c1b602cd6af172f186745ed3f2fc5d767400a946900aa614a'}]}, 'timestamp': '2025-09-30 21:22:44.770962', '_unique_id': 
'1d4b21639f3e4eb28984127a23c9ecfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.771 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.772 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.772 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.772 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.write.requests volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.772 12 DEBUG ceilometer.compute.pollsters [-] 857a84cb-03ec-4e88-a3e8-da80fda2c446/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.772 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.write.requests volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.773 12 DEBUG ceilometer.compute.pollsters [-] 456a6c22-b801-4d95-aa63-be64cd8e4b53/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d8743b7-05b0-4897-9457-a556f64323e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 28, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-vda', 'timestamp': '2025-09-30T21:22:44.772376', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aed495a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': '77d158dcb74f870b2065d988caa8ca574cd150c4b6c04f762f9e1a4746ea084b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 
'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446-sda', 'timestamp': '2025-09-30T21:22:44.772376', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-70232550', 'name': 'instance-00000023', 'instance_id': '857a84cb-03ec-4e88-a3e8-da80fda2c446', 'instance_type': 'tempest-test_resize_flavor_-344349667', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-344349667', 'name': 'tempest-test_resize_flavor_-344349667', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aed5170-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.416796979, 'message_signature': '16250e46047cda7103cc3117f8319f51ee23d1ccf3a5cb4207397534623f5693'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 30, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-vda', 'timestamp': '2025-09-30T21:22:44.772376', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 
'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9aed5a8a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': '79315691c460b92686deecd64ccec27eb5f34cdb4484f019f6889daa4c28c361'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53-sda', 'timestamp': '2025-09-30T21:22:44.772376', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-936283323', 'name': 'instance-0000001e', 'instance_id': '456a6c22-b801-4d95-aa63-be64cd8e4b53', 'instance_type': 'm1.micro', 'host': '6a11b86166f4aafba4122910d4988e3ec9ab2fad65d750045ed7d0c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c9779bca-1eb6-4567-a36c-b452abeafc70', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9aed6746-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4060.437707362, 'message_signature': 'ed73b1b98f68544311667540e3ea9fbd12ae005def86b73337d1f4cc693ccc8f'}]}, 'timestamp': '2025-09-30 21:22:44.773401', '_unique_id': 
'00b43895219a4f36aee4abea104e9a55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:22:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:22:44.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:22:44 compute-1 systemd[225959]: Queued start job for default target Main User Target.
Sep 30 21:22:44 compute-1 systemd[225959]: Created slice User Application Slice.
Sep 30 21:22:44 compute-1 systemd[225959]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:22:44 compute-1 systemd[225959]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:22:44 compute-1 systemd[225959]: Reached target Paths.
Sep 30 21:22:44 compute-1 systemd[225959]: Reached target Timers.
Sep 30 21:22:44 compute-1 systemd[225959]: Starting D-Bus User Message Bus Socket...
Sep 30 21:22:44 compute-1 systemd[225959]: Starting Create User's Volatile Files and Directories...
Sep 30 21:22:44 compute-1 systemd[225959]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:22:44 compute-1 systemd[225959]: Reached target Sockets.
Sep 30 21:22:44 compute-1 systemd[225959]: Finished Create User's Volatile Files and Directories.
Sep 30 21:22:44 compute-1 systemd[225959]: Reached target Basic System.
Sep 30 21:22:44 compute-1 systemd[225959]: Reached target Main User Target.
Sep 30 21:22:44 compute-1 systemd[225959]: Startup finished in 121ms.
Sep 30 21:22:44 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:22:44 compute-1 systemd[1]: Started Session 38 of User nova.
Sep 30 21:22:44 compute-1 sshd-session[225955]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:22:45 compute-1 nova_compute[192795]: 2025-09-30 21:22:45.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:45 compute-1 sshd-session[225974]: Received disconnect from 192.168.122.100 port 47518:11: disconnected by user
Sep 30 21:22:45 compute-1 sshd-session[225974]: Disconnected from user nova 192.168.122.100 port 47518
Sep 30 21:22:45 compute-1 sshd-session[225955]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:22:45 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Sep 30 21:22:45 compute-1 systemd-logind[793]: Session 38 logged out. Waiting for processes to exit.
Sep 30 21:22:45 compute-1 systemd-logind[793]: Removed session 38.
Sep 30 21:22:45 compute-1 sshd-session[225976]: Accepted publickey for nova from 192.168.122.100 port 47534 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:22:45 compute-1 systemd-logind[793]: New session 40 of user nova.
Sep 30 21:22:45 compute-1 systemd[1]: Started Session 40 of User nova.
Sep 30 21:22:45 compute-1 sshd-session[225976]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:22:45 compute-1 sshd-session[225979]: Received disconnect from 192.168.122.100 port 47534:11: disconnected by user
Sep 30 21:22:45 compute-1 sshd-session[225979]: Disconnected from user nova 192.168.122.100 port 47534
Sep 30 21:22:45 compute-1 sshd-session[225976]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:22:45 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Sep 30 21:22:45 compute-1 systemd-logind[793]: Session 40 logged out. Waiting for processes to exit.
Sep 30 21:22:45 compute-1 systemd-logind[793]: Removed session 40.
Sep 30 21:22:45 compute-1 sshd-session[225981]: Accepted publickey for nova from 192.168.122.100 port 47544 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:22:45 compute-1 systemd-logind[793]: New session 41 of user nova.
Sep 30 21:22:45 compute-1 systemd[1]: Started Session 41 of User nova.
Sep 30 21:22:45 compute-1 sshd-session[225981]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:22:45 compute-1 sshd-session[225984]: Received disconnect from 192.168.122.100 port 47544:11: disconnected by user
Sep 30 21:22:45 compute-1 sshd-session[225984]: Disconnected from user nova 192.168.122.100 port 47544
Sep 30 21:22:45 compute-1 sshd-session[225981]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:22:45 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Sep 30 21:22:45 compute-1 systemd-logind[793]: Session 41 logged out. Waiting for processes to exit.
Sep 30 21:22:45 compute-1 systemd-logind[793]: Removed session 41.
Sep 30 21:22:46 compute-1 nova_compute[192795]: 2025-09-30 21:22:46.300 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:46 compute-1 nova_compute[192795]: 2025-09-30 21:22:46.300 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquired lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:46 compute-1 nova_compute[192795]: 2025-09-30 21:22:46.300 2 DEBUG nova.network.neutron [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:46 compute-1 nova_compute[192795]: 2025-09-30 21:22:46.540 2 DEBUG nova.network.neutron [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:46 compute-1 nova_compute[192795]: 2025-09-30 21:22:46.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:46 compute-1 nova_compute[192795]: 2025-09-30 21:22:46.911 2 DEBUG nova.network.neutron [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:46 compute-1 nova_compute[192795]: 2025-09-30 21:22:46.925 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Releasing lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.047 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.049 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.049 2 INFO nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Creating image(s)
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.050 2 DEBUG nova.objects.instance [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.067 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.132 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.133 2 DEBUG nova.virt.disk.api [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Checking if we can resize image /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.133 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.195 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.196 2 DEBUG nova.virt.disk.api [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Cannot resize image /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.214 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.215 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Ensure instance console log exists: /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.215 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.216 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.216 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.219 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.224 2 WARNING nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.230 2 DEBUG nova.virt.libvirt.host [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.231 2 DEBUG nova.virt.libvirt.host [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.234 2 DEBUG nova.virt.libvirt.host [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.235 2 DEBUG nova.virt.libvirt.host [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.237 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.237 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.238 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.238 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.238 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.239 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.239 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.239 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.239 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.240 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.240 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.240 2 DEBUG nova.virt.hardware [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.241 2 DEBUG nova.objects.instance [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.256 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.319 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.320 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.320 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.321 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.324 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <uuid>5c749a3a-92bd-47ce-a966-33f62c7e3019</uuid>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <name>instance-00000025</name>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <nova:name>tempest-MigrationsAdminTest-server-565661956</nova:name>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:22:47</nova:creationTime>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:22:47 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:22:47 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:22:47 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:22:47 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:47 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:22:47 compute-1 nova_compute[192795]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:22:47 compute-1 nova_compute[192795]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <system>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <entry name="serial">5c749a3a-92bd-47ce-a966-33f62c7e3019</entry>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <entry name="uuid">5c749a3a-92bd-47ce-a966-33f62c7e3019</entry>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </system>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <os>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   </os>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <features>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   </features>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/console.log" append="off"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <video>
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </video>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:22:47 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:22:47 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:22:47 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:22:47 compute-1 nova_compute[192795]: </domain>
Sep 30 21:22:47 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.371 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.372 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:47 compute-1 nova_compute[192795]: 2025-09-30 21:22:47.372 2 INFO nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Using config drive
Sep 30 21:22:47 compute-1 systemd-machined[152783]: New machine qemu-20-instance-00000025.
Sep 30 21:22:47 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-00000025.
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.491 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267368.4905128, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.491 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Resumed (Lifecycle Event)
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.493 2 DEBUG nova.compute.manager [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.495 2 INFO nova.virt.libvirt.driver [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance running successfully.
Sep 30 21:22:48 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.497 2 DEBUG nova.virt.libvirt.guest [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.497 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.544 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.550 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.586 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.586 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267368.4907136, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.586 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Started (Lifecycle Event)
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.606 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:48 compute-1 nova_compute[192795]: 2025-09-30 21:22:48.610 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:50 compute-1 nova_compute[192795]: 2025-09-30 21:22:50.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:50 compute-1 podman[226032]: 2025-09-30 21:22:50.240608537 +0000 UTC m=+0.076047369 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:22:50 compute-1 nova_compute[192795]: 2025-09-30 21:22:50.809 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:50 compute-1 nova_compute[192795]: 2025-09-30 21:22:50.810 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:50 compute-1 nova_compute[192795]: 2025-09-30 21:22:50.811 2 DEBUG nova.network.neutron [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.000 2 DEBUG nova.network.neutron [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.405 2 DEBUG nova.network.neutron [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.430 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.450 2 DEBUG nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Creating tmpfile /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/tmp1ppy_zrt to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Sep 30 21:22:51 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000025.scope: Deactivated successfully.
Sep 30 21:22:51 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000025.scope: Consumed 4.021s CPU time.
Sep 30 21:22:51 compute-1 systemd-machined[152783]: Machine qemu-20-instance-00000025 terminated.
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.703 2 INFO nova.virt.libvirt.driver [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance destroyed successfully.
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.703 2 DEBUG nova.objects.instance [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'resources' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.716 2 INFO nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Deleting instance files /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_del
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.724 2 INFO nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Deletion of /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_del complete
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.797 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.798 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.820 2 DEBUG nova.objects.instance [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'migration_context' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.920 2 DEBUG nova.compute.provider_tree [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.935 2 DEBUG nova.scheduler.client.report [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:51 compute-1 nova_compute[192795]: 2025-09-30 21:22:51.990 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:54.063 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:54 compute-1 nova_compute[192795]: 2025-09-30 21:22:54.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:22:54.066 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:22:55 compute-1 nova_compute[192795]: 2025-09-30 21:22:55.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:55 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:22:55 compute-1 systemd[225959]: Activating special unit Exit the Session...
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped target Main User Target.
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped target Basic System.
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped target Paths.
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped target Sockets.
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped target Timers.
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:22:55 compute-1 systemd[225959]: Closed D-Bus User Message Bus Socket.
Sep 30 21:22:55 compute-1 systemd[225959]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:22:55 compute-1 systemd[225959]: Removed slice User Application Slice.
Sep 30 21:22:55 compute-1 systemd[225959]: Reached target Shutdown.
Sep 30 21:22:55 compute-1 systemd[225959]: Finished Exit the Session.
Sep 30 21:22:55 compute-1 systemd[225959]: Reached target Exit the Session.
Sep 30 21:22:55 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:22:55 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:22:55 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:22:55 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:22:55 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:22:55 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:22:55 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:22:56 compute-1 nova_compute[192795]: 2025-09-30 21:22:56.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:57 compute-1 podman[226064]: 2025-09-30 21:22:57.262514441 +0000 UTC m=+0.078319769 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:22:57 compute-1 podman[226062]: 2025-09-30 21:22:57.286839575 +0000 UTC m=+0.109033225 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:22:57 compute-1 podman[226063]: 2025-09-30 21:22:57.319180466 +0000 UTC m=+0.138685654 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.807 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "857a84cb-03ec-4e88-a3e8-da80fda2c446" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.808 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.808 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "857a84cb-03ec-4e88-a3e8-da80fda2c446-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.809 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.809 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.825 2 INFO nova.compute.manager [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Terminating instance
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.839 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.839 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:58 compute-1 nova_compute[192795]: 2025-09-30 21:22:58.840 2 DEBUG nova.network.neutron [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.045 2 DEBUG nova.network.neutron [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.554 2 DEBUG nova.network.neutron [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.581 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-857a84cb-03ec-4e88-a3e8-da80fda2c446" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.582 2 DEBUG nova.compute.manager [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:22:59 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Deactivated successfully.
Sep 30 21:22:59 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Consumed 14.292s CPU time.
Sep 30 21:22:59 compute-1 systemd-machined[152783]: Machine qemu-19-instance-00000023 terminated.
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.845 2 INFO nova.virt.libvirt.driver [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance destroyed successfully.
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.845 2 DEBUG nova.objects.instance [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'resources' on Instance uuid 857a84cb-03ec-4e88-a3e8-da80fda2c446 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.862 2 INFO nova.virt.libvirt.driver [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Deleting instance files /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_del
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.873 2 INFO nova.virt.libvirt.driver [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Deletion of /var/lib/nova/instances/857a84cb-03ec-4e88-a3e8-da80fda2c446_del complete
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.963 2 INFO nova.compute.manager [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.964 2 DEBUG oslo.service.loopingcall [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.965 2 DEBUG nova.compute.manager [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:22:59 compute-1 nova_compute[192795]: 2025-09-30 21:22:59.965 2 DEBUG nova.network.neutron [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:00.068 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.305 2 DEBUG nova.network.neutron [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.323 2 DEBUG nova.network.neutron [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.338 2 INFO nova.compute.manager [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Took 0.37 seconds to deallocate network for instance.
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.418 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.419 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.574 2 DEBUG nova.compute.provider_tree [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.601 2 DEBUG nova.scheduler.client.report [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.630 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.680 2 INFO nova.scheduler.client.report [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Deleted allocations for instance 857a84cb-03ec-4e88-a3e8-da80fda2c446
Sep 30 21:23:00 compute-1 nova_compute[192795]: 2025-09-30 21:23:00.779 2 DEBUG oslo_concurrency.lockutils [None req-a67361ed-96e2-46bd-a42c-505ab1f65a11 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "857a84cb-03ec-4e88-a3e8-da80fda2c446" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.971s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:01 compute-1 podman[226139]: 2025-09-30 21:23:01.246451774 +0000 UTC m=+0.081330240 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:23:01 compute-1 nova_compute[192795]: 2025-09-30 21:23:01.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.902 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.903 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.903 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "456a6c22-b801-4d95-aa63-be64cd8e4b53-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.903 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.904 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.913 2 INFO nova.compute.manager [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Terminating instance
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.922 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.923 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:02 compute-1 nova_compute[192795]: 2025-09-30 21:23:02.923 2 DEBUG nova.network.neutron [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.146 2 DEBUG nova.network.neutron [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.464 2 DEBUG nova.network.neutron [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.482 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-456a6c22-b801-4d95-aa63-be64cd8e4b53" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.483 2 DEBUG nova.compute.manager [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:23:03 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Sep 30 21:23:03 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001e.scope: Consumed 15.398s CPU time.
Sep 30 21:23:03 compute-1 systemd-machined[152783]: Machine qemu-17-instance-0000001e terminated.
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.723 2 INFO nova.virt.libvirt.driver [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance destroyed successfully.
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.724 2 DEBUG nova.objects.instance [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'resources' on Instance uuid 456a6c22-b801-4d95-aa63-be64cd8e4b53 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.737 2 INFO nova.virt.libvirt.driver [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Deleting instance files /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_del
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.738 2 INFO nova.virt.libvirt.driver [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Deletion of /var/lib/nova/instances/456a6c22-b801-4d95-aa63-be64cd8e4b53_del complete
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.802 2 INFO nova.compute.manager [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Took 0.32 seconds to destroy the instance on the hypervisor.
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.803 2 DEBUG oslo.service.loopingcall [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.803 2 DEBUG nova.compute.manager [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.803 2 DEBUG nova.network.neutron [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:23:03 compute-1 nova_compute[192795]: 2025-09-30 21:23:03.959 2 DEBUG nova.network.neutron [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.021 2 DEBUG nova.network.neutron [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.153 2 INFO nova.compute.manager [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Took 0.35 seconds to deallocate network for instance.
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.258 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.259 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.316 2 DEBUG nova.compute.provider_tree [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.335 2 DEBUG nova.scheduler.client.report [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.359 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.387 2 INFO nova.scheduler.client.report [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Deleted allocations for instance 456a6c22-b801-4d95-aa63-be64cd8e4b53
Sep 30 21:23:04 compute-1 nova_compute[192795]: 2025-09-30 21:23:04.490 2 DEBUG oslo_concurrency.lockutils [None req-6537807b-5e48-462d-b2b7-e5d99c1d9740 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "456a6c22-b801-4d95-aa63-be64cd8e4b53" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:05 compute-1 nova_compute[192795]: 2025-09-30 21:23:05.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:06 compute-1 nova_compute[192795]: 2025-09-30 21:23:06.702 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267371.7014267, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:06 compute-1 nova_compute[192795]: 2025-09-30 21:23:06.702 2 INFO nova.compute.manager [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Stopped (Lifecycle Event)
Sep 30 21:23:06 compute-1 nova_compute[192795]: 2025-09-30 21:23:06.726 2 DEBUG nova.compute.manager [None req-78cd2b2f-b61a-4553-a021-64f84a79f1a5 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:06 compute-1 nova_compute[192795]: 2025-09-30 21:23:06.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.410 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.410 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.430 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.597 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.597 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.613 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.614 2 INFO nova.compute.claims [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.814 2 DEBUG nova.compute.provider_tree [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.835 2 DEBUG nova.scheduler.client.report [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.866 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.867 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.929 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.930 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.955 2 INFO nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:23:07 compute-1 nova_compute[192795]: 2025-09-30 21:23:07.974 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.069 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.071 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.072 2 INFO nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Creating image(s)
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.072 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "/var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.073 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "/var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.073 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "/var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.091 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.128 2 DEBUG nova.policy [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '80abdc8ce51444378234b07daa877ac7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.160 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.161 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.162 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.178 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.241 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.243 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.285 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.287 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.287 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.354 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.356 2 DEBUG nova.virt.disk.api [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Checking if we can resize image /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.357 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.415 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.416 2 DEBUG nova.virt.disk.api [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Cannot resize image /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.416 2 DEBUG nova.objects.instance [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lazy-loading 'migration_context' on Instance uuid bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.431 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.431 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Ensure instance console log exists: /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.431 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.432 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.432 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:08 compute-1 nova_compute[192795]: 2025-09-30 21:23:08.683 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Successfully created port: e9d92195-9ca5-43a8-8e02-f057dfdf8378 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:23:09 compute-1 podman[226183]: 2025-09-30 21:23:09.20351528 +0000 UTC m=+0.051814256 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:23:09 compute-1 nova_compute[192795]: 2025-09-30 21:23:09.294 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Successfully created port: 9b270a14-f424-4491-9f14-3084a3431112 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:23:10 compute-1 nova_compute[192795]: 2025-09-30 21:23:10.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:10 compute-1 nova_compute[192795]: 2025-09-30 21:23:10.101 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Successfully created port: 8c94fd31-501a-4da2-853a-b0a5df9f1c8e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:23:10 compute-1 nova_compute[192795]: 2025-09-30 21:23:10.937 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Successfully updated port: e9d92195-9ca5-43a8-8e02-f057dfdf8378 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.105 2 DEBUG nova.compute.manager [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-changed-e9d92195-9ca5-43a8-8e02-f057dfdf8378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.105 2 DEBUG nova.compute.manager [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Refreshing instance network info cache due to event network-changed-e9d92195-9ca5-43a8-8e02-f057dfdf8378. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.106 2 DEBUG oslo_concurrency.lockutils [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.106 2 DEBUG oslo_concurrency.lockutils [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.106 2 DEBUG nova.network.neutron [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Refreshing network info cache for port e9d92195-9ca5-43a8-8e02-f057dfdf8378 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:11 compute-1 podman[226204]: 2025-09-30 21:23:11.218147747 +0000 UTC m=+0.056364118 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:23:11 compute-1 podman[226203]: 2025-09-30 21:23:11.218147817 +0000 UTC m=+0.061084935 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.415 2 DEBUG nova.network.neutron [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:11 compute-1 nova_compute[192795]: 2025-09-30 21:23:11.954 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Successfully updated port: 9b270a14-f424-4491-9f14-3084a3431112 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:23:12 compute-1 nova_compute[192795]: 2025-09-30 21:23:12.004 2 DEBUG nova.network.neutron [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:12 compute-1 nova_compute[192795]: 2025-09-30 21:23:12.021 2 DEBUG oslo_concurrency.lockutils [req-18eab2b9-fc4a-4acd-ad58-32280cbc8491 req-bf88553d-b58e-47b4-8462-04d6880da437 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:12 compute-1 nova_compute[192795]: 2025-09-30 21:23:12.873 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Successfully updated port: 8c94fd31-501a-4da2-853a-b0a5df9f1c8e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:23:12 compute-1 nova_compute[192795]: 2025-09-30 21:23:12.890 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:12 compute-1 nova_compute[192795]: 2025-09-30 21:23:12.890 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquired lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:12 compute-1 nova_compute[192795]: 2025-09-30 21:23:12.891 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:23:13 compute-1 nova_compute[192795]: 2025-09-30 21:23:13.219 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:14 compute-1 nova_compute[192795]: 2025-09-30 21:23:14.138 2 DEBUG nova.compute.manager [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-changed-9b270a14-f424-4491-9f14-3084a3431112 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:14 compute-1 nova_compute[192795]: 2025-09-30 21:23:14.138 2 DEBUG nova.compute.manager [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Refreshing instance network info cache due to event network-changed-9b270a14-f424-4491-9f14-3084a3431112. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:14 compute-1 nova_compute[192795]: 2025-09-30 21:23:14.139 2 DEBUG oslo_concurrency.lockutils [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:14 compute-1 nova_compute[192795]: 2025-09-30 21:23:14.843 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267379.8414564, 857a84cb-03ec-4e88-a3e8-da80fda2c446 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:14 compute-1 nova_compute[192795]: 2025-09-30 21:23:14.843 2 INFO nova.compute.manager [-] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] VM Stopped (Lifecycle Event)
Sep 30 21:23:14 compute-1 nova_compute[192795]: 2025-09-30 21:23:14.863 2 DEBUG nova.compute.manager [None req-47759bfa-0042-4acd-9128-8b8f3c9efb4f - - - - - -] [instance: 857a84cb-03ec-4e88-a3e8-da80fda2c446] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:15 compute-1 nova_compute[192795]: 2025-09-30 21:23:15.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:16 compute-1 nova_compute[192795]: 2025-09-30 21:23:16.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.414 2 DEBUG nova.network.neutron [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updating instance_info_cache with network_info: [{"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.439 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Releasing lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.439 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Instance network_info: |[{"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.440 2 DEBUG oslo_concurrency.lockutils [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.440 2 DEBUG nova.network.neutron [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Refreshing network info cache for port 9b270a14-f424-4491-9f14-3084a3431112 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.445 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Start _get_guest_xml network_info=[{"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.449 2 WARNING nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.454 2 DEBUG nova.virt.libvirt.host [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.454 2 DEBUG nova.virt.libvirt.host [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.459 2 DEBUG nova.virt.libvirt.host [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.460 2 DEBUG nova.virt.libvirt.host [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.461 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.461 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.462 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.462 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.462 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.462 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.462 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.463 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.463 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.463 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.463 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.463 2 DEBUG nova.virt.hardware [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.466 2 DEBUG nova.virt.libvirt.vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-projec
t-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:08Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.467 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.467 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:63:7f,bridge_name='br-int',has_traffic_filtering=True,id=e9d92195-9ca5-43a8-8e02-f057dfdf8378,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9d92195-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.468 2 DEBUG nova.virt.libvirt.vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-projec
t-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:08Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.468 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.469 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:14:cc,bridge_name='br-int',has_traffic_filtering=True,id=9b270a14-f424-4491-9f14-3084a3431112,network=Network(acbf49ed-afb9-4cf1-a75d-f198ee3fb120),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b270a14-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.469 2 DEBUG nova.virt.libvirt.vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:08Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.470 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.470 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e7:bc,bridge_name='br-int',has_traffic_filtering=True,id=8c94fd31-501a-4da2-853a-b0a5df9f1c8e,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c94fd31-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.471 2 DEBUG nova.objects.instance [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lazy-loading 'pci_devices' on Instance uuid bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.484 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <uuid>bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa</uuid>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <name>instance-00000027</name>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersTestMultiNic-server-1854388411</nova:name>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:23:17</nova:creationTime>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:user uuid="80abdc8ce51444378234b07daa877ac7">tempest-ServersTestMultiNic-108666806-project-member</nova:user>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:project uuid="da6a860bf9ab4d91b946d9a35e448d94">tempest-ServersTestMultiNic-108666806</nova:project>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:port uuid="e9d92195-9ca5-43a8-8e02-f057dfdf8378">
Sep 30 21:23:17 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.142" ipVersion="4"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:port uuid="9b270a14-f424-4491-9f14-3084a3431112">
Sep 30 21:23:17 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.1.90" ipVersion="4"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         <nova:port uuid="8c94fd31-501a-4da2-853a-b0a5df9f1c8e">
Sep 30 21:23:17 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.39" ipVersion="4"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <system>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <entry name="serial">bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa</entry>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <entry name="uuid">bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa</entry>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </system>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <os>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   </os>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <features>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   </features>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk.config"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:ef:63:7f"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <target dev="tape9d92195-9c"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:19:14:cc"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <target dev="tap9b270a14-f4"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:ac:e7:bc"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <target dev="tap8c94fd31-50"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/console.log" append="off"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <video>
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </video>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:23:17 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:23:17 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:23:17 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:23:17 compute-1 nova_compute[192795]: </domain>
Sep 30 21:23:17 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.485 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Preparing to wait for external event network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.486 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.486 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.486 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.486 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Preparing to wait for external event network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.486 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.487 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.487 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.487 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Preparing to wait for external event network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.487 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.487 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.487 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.488 2 DEBUG nova.virt.libvirt.vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:08Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.488 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.489 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:63:7f,bridge_name='br-int',has_traffic_filtering=True,id=e9d92195-9ca5-43a8-8e02-f057dfdf8378,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9d92195-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.489 2 DEBUG os_vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:63:7f,bridge_name='br-int',has_traffic_filtering=True,id=e9d92195-9ca5-43a8-8e02-f057dfdf8378,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9d92195-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.490 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.490 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.495 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9d92195-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.497 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape9d92195-9c, col_values=(('external_ids', {'iface-id': 'e9d92195-9ca5-43a8-8e02-f057dfdf8378', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:63:7f', 'vm-uuid': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 NetworkManager[51724]: <info>  [1759267397.5008] manager: (tape9d92195-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.507 2 INFO os_vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:63:7f,bridge_name='br-int',has_traffic_filtering=True,id=e9d92195-9ca5-43a8-8e02-f057dfdf8378,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9d92195-9c')
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.509 2 DEBUG nova.virt.libvirt.vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:08Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.509 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.510 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:14:cc,bridge_name='br-int',has_traffic_filtering=True,id=9b270a14-f424-4491-9f14-3084a3431112,network=Network(acbf49ed-afb9-4cf1-a75d-f198ee3fb120),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b270a14-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.510 2 DEBUG os_vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:14:cc,bridge_name='br-int',has_traffic_filtering=True,id=9b270a14-f424-4491-9f14-3084a3431112,network=Network(acbf49ed-afb9-4cf1-a75d-f198ee3fb120),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b270a14-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.511 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.511 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b270a14-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b270a14-f4, col_values=(('external_ids', {'iface-id': '9b270a14-f424-4491-9f14-3084a3431112', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:14:cc', 'vm-uuid': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 NetworkManager[51724]: <info>  [1759267397.5183] manager: (tap9b270a14-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.523 2 INFO os_vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:14:cc,bridge_name='br-int',has_traffic_filtering=True,id=9b270a14-f424-4491-9f14-3084a3431112,network=Network(acbf49ed-afb9-4cf1-a75d-f198ee3fb120),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b270a14-f4')
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.524 2 DEBUG nova.virt.libvirt.vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:08Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.525 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.525 2 DEBUG nova.network.os_vif_util [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e7:bc,bridge_name='br-int',has_traffic_filtering=True,id=8c94fd31-501a-4da2-853a-b0a5df9f1c8e,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c94fd31-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.526 2 DEBUG os_vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e7:bc,bridge_name='br-int',has_traffic_filtering=True,id=8c94fd31-501a-4da2-853a-b0a5df9f1c8e,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c94fd31-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.527 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.527 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.530 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c94fd31-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.531 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c94fd31-50, col_values=(('external_ids', {'iface-id': '8c94fd31-501a-4da2-853a-b0a5df9f1c8e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:e7:bc', 'vm-uuid': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 NetworkManager[51724]: <info>  [1759267397.5338] manager: (tap8c94fd31-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.541 2 INFO os_vif [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e7:bc,bridge_name='br-int',has_traffic_filtering=True,id=8c94fd31-501a-4da2-853a-b0a5df9f1c8e,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c94fd31-50')
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.593 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.594 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.594 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No VIF found with MAC fa:16:3e:ef:63:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.595 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No VIF found with MAC fa:16:3e:19:14:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.595 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No VIF found with MAC fa:16:3e:ac:e7:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:23:17 compute-1 nova_compute[192795]: 2025-09-30 21:23:17.595 2 INFO nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Using config drive
Sep 30 21:23:17 compute-1 ovn_controller[94902]: 2025-09-30T21:23:17Z|00149|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.355 2 INFO nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Creating config drive at /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk.config
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.360 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2z4yp87x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.487 2 DEBUG oslo_concurrency.processutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2z4yp87x" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:18 compute-1 kernel: tape9d92195-9c: entered promiscuous mode
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.5649] manager: (tape9d92195-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00150|binding|INFO|Claiming lport e9d92195-9ca5-43a8-8e02-f057dfdf8378 for this chassis.
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00151|binding|INFO|e9d92195-9ca5-43a8-8e02-f057dfdf8378: Claiming fa:16:3e:ef:63:7f 10.100.0.142
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6147] manager: (tap9b270a14-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Sep 30 21:23:18 compute-1 systemd-udevd[226275]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:18 compute-1 systemd-udevd[226274]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.628 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:63:7f 10.100.0.142'], port_security=['fa:16:3e:ef:63:7f 10.100.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.142/24', 'neutron:device_id': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c23309b9-6834-440c-b70d-83c63e6f455b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f3de690-43a5-4d81-baca-cfa7cf84724b, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=e9d92195-9ca5-43a8-8e02-f057dfdf8378) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.630 103861 INFO neutron.agent.ovn.metadata.agent [-] Port e9d92195-9ca5-43a8-8e02-f057dfdf8378 in datapath c23309b9-6834-440c-b70d-83c63e6f455b bound to our chassis
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6327] device (tape9d92195-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6334] device (tape9d92195-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.634 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c23309b9-6834-440c-b70d-83c63e6f455b
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6456] manager: (tap8c94fd31-50): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.652 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5c5c05-c8bb-46ad-9b91-472a697552d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.653 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc23309b9-61 in ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.663 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc23309b9-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.663 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ff785137-d6c5-485c-af5e-84e421d5761a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.664 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[65ca8fd8-be26-4a7f-bad1-896380d3dd2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 kernel: tap9b270a14-f4: entered promiscuous mode
Sep 30 21:23:18 compute-1 kernel: tap8c94fd31-50: entered promiscuous mode
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6690] device (tap9b270a14-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6699] device (tap8c94fd31-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6706] device (tap9b270a14-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.6709] device (tap8c94fd31-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00152|binding|INFO|Claiming lport 9b270a14-f424-4491-9f14-3084a3431112 for this chassis.
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00153|binding|INFO|9b270a14-f424-4491-9f14-3084a3431112: Claiming fa:16:3e:19:14:cc 10.100.1.90
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00154|binding|INFO|Claiming lport 8c94fd31-501a-4da2-853a-b0a5df9f1c8e for this chassis.
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00155|binding|INFO|8c94fd31-501a-4da2-853a-b0a5df9f1c8e: Claiming fa:16:3e:ac:e7:bc 10.100.0.39
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00156|binding|INFO|Setting lport e9d92195-9ca5-43a8-8e02-f057dfdf8378 ovn-installed in OVS
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00157|binding|INFO|Setting lport e9d92195-9ca5-43a8-8e02-f057dfdf8378 up in Southbound
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.684 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[600638f6-f7b6-4243-b187-669d5e91f774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.687 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:e7:bc 10.100.0.39'], port_security=['fa:16:3e:ac:e7:bc 10.100.0.39'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.39/24', 'neutron:device_id': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c23309b9-6834-440c-b70d-83c63e6f455b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f3de690-43a5-4d81-baca-cfa7cf84724b, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8c94fd31-501a-4da2-853a-b0a5df9f1c8e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.690 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:14:cc 10.100.1.90'], port_security=['fa:16:3e:19:14:cc 10.100.1.90'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.90/24', 'neutron:device_id': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ad8b189-10f5-4834-b0fc-ffc4a44401ab, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=9b270a14-f424-4491-9f14-3084a3431112) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:18 compute-1 systemd-machined[152783]: New machine qemu-21-instance-00000027.
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00158|binding|INFO|Setting lport 9b270a14-f424-4491-9f14-3084a3431112 ovn-installed in OVS
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00159|binding|INFO|Setting lport 9b270a14-f424-4491-9f14-3084a3431112 up in Southbound
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00160|binding|INFO|Setting lport 8c94fd31-501a-4da2-853a-b0a5df9f1c8e ovn-installed in OVS
Sep 30 21:23:18 compute-1 ovn_controller[94902]: 2025-09-30T21:23:18Z|00161|binding|INFO|Setting lport 8c94fd31-501a-4da2-853a-b0a5df9f1c8e up in Southbound
Sep 30 21:23:18 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-00000027.
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.722 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267383.7217708, 456a6c22-b801-4d95-aa63-be64cd8e4b53 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.722 2 INFO nova.compute.manager [-] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] VM Stopped (Lifecycle Event)
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.728 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb70ae3-7c58-4aa6-8504-9f79a1059f9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.771 2 DEBUG nova.compute.manager [None req-87d487db-e406-4bcc-b425-78c1d5042243 - - - - - -] [instance: 456a6c22-b801-4d95-aa63-be64cd8e4b53] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.779 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[290e0ed8-eacb-48dd-b779-53b7a1e91e7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.7870] manager: (tapc23309b9-60): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.786 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c48e6ac6-e8b8-4dd2-87e7-75f3d58067c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.832 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbc70af-fabb-4f9a-839c-2bf67f4726d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.835 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a13aa14c-9c35-4dfd-b5d3-b5bc5d4ed1bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 NetworkManager[51724]: <info>  [1759267398.8643] device (tapc23309b9-60): carrier: link connected
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.879 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[721a08a2-3480-45ac-8665-b2168ac02ecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.895 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[71669bb7-9e9c-4ce3-b6cf-a1ffb3fa783d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc23309b9-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:11:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409453, 'reachable_time': 28654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226316, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.920 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c915d01-77f7-4bdb-8870-4f9c542703be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:11e0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409453, 'tstamp': 409453}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226317, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.945 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e498850d-09e0-4a4c-a908-7e535f2e61fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc23309b9-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:11:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409453, 'reachable_time': 28654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226318, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.976 2 DEBUG nova.compute.manager [req-a99d5a28-f1ae-48d8-9fc9-a79f53e052f9 req-2edf0243-baa6-4e22-abd8-4e9b5855a66e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.976 2 DEBUG oslo_concurrency.lockutils [req-a99d5a28-f1ae-48d8-9fc9-a79f53e052f9 req-2edf0243-baa6-4e22-abd8-4e9b5855a66e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.977 2 DEBUG oslo_concurrency.lockutils [req-a99d5a28-f1ae-48d8-9fc9-a79f53e052f9 req-2edf0243-baa6-4e22-abd8-4e9b5855a66e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.977 2 DEBUG oslo_concurrency.lockutils [req-a99d5a28-f1ae-48d8-9fc9-a79f53e052f9 req-2edf0243-baa6-4e22-abd8-4e9b5855a66e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:18 compute-1 nova_compute[192795]: 2025-09-30 21:23:18.978 2 DEBUG nova.compute.manager [req-a99d5a28-f1ae-48d8-9fc9-a79f53e052f9 req-2edf0243-baa6-4e22-abd8-4e9b5855a66e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Processing event network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:23:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:18.987 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[135fc28a-535d-4f9c-a703-f98c25e41fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.071 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2472ae6e-cdd1-4d77-97b2-5114cdd250fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.073 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc23309b9-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.073 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.074 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc23309b9-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:19 compute-1 kernel: tapc23309b9-60: entered promiscuous mode
Sep 30 21:23:19 compute-1 NetworkManager[51724]: <info>  [1759267399.0766] manager: (tapc23309b9-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.086 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc23309b9-60, col_values=(('external_ids', {'iface-id': '47235677-cd26-4edc-b4dc-078e745e8c3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:19 compute-1 ovn_controller[94902]: 2025-09-30T21:23:19Z|00162|binding|INFO|Releasing lport 47235677-cd26-4edc-b4dc-078e745e8c3b from this chassis (sb_readonly=0)
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.092 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c23309b9-6834-440c-b70d-83c63e6f455b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c23309b9-6834-440c-b70d-83c63e6f455b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.093 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfecac3-9de3-461c-a723-12e0e0caf0dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.094 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-c23309b9-6834-440c-b70d-83c63e6f455b
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/c23309b9-6834-440c-b70d-83c63e6f455b.pid.haproxy
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID c23309b9-6834-440c-b70d-83c63e6f455b
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.097 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'env', 'PROCESS_TAG=haproxy-c23309b9-6834-440c-b70d-83c63e6f455b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c23309b9-6834-440c-b70d-83c63e6f455b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.367 2 DEBUG nova.network.neutron [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updated VIF entry in instance network info cache for port 9b270a14-f424-4491-9f14-3084a3431112. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.368 2 DEBUG nova.network.neutron [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updating instance_info_cache with network_info: [{"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.438 2 DEBUG oslo_concurrency.lockutils [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.439 2 DEBUG nova.compute.manager [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-changed-8c94fd31-501a-4da2-853a-b0a5df9f1c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.439 2 DEBUG nova.compute.manager [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Refreshing instance network info cache due to event network-changed-8c94fd31-501a-4da2-853a-b0a5df9f1c8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.439 2 DEBUG oslo_concurrency.lockutils [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.440 2 DEBUG oslo_concurrency.lockutils [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.440 2 DEBUG nova.network.neutron [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Refreshing network info cache for port 8c94fd31-501a-4da2-853a-b0a5df9f1c8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:19 compute-1 podman[226360]: 2025-09-30 21:23:19.551949174 +0000 UTC m=+0.073276143 container create a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.571 2 DEBUG nova.compute.manager [req-9b1344de-8149-4409-b92e-8168a7734e49 req-af60b236-45a9-4110-b030-d6b1ae44300f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.573 2 DEBUG oslo_concurrency.lockutils [req-9b1344de-8149-4409-b92e-8168a7734e49 req-af60b236-45a9-4110-b030-d6b1ae44300f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.573 2 DEBUG oslo_concurrency.lockutils [req-9b1344de-8149-4409-b92e-8168a7734e49 req-af60b236-45a9-4110-b030-d6b1ae44300f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.574 2 DEBUG oslo_concurrency.lockutils [req-9b1344de-8149-4409-b92e-8168a7734e49 req-af60b236-45a9-4110-b030-d6b1ae44300f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.574 2 DEBUG nova.compute.manager [req-9b1344de-8149-4409-b92e-8168a7734e49 req-af60b236-45a9-4110-b030-d6b1ae44300f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Processing event network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:23:19 compute-1 systemd[1]: Started libpod-conmon-a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658.scope.
Sep 30 21:23:19 compute-1 podman[226360]: 2025-09-30 21:23:19.508734421 +0000 UTC m=+0.030061430 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:23:19 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:23:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/824d8d5d31ac2d8ba45b191192b59d721bb7455221f14596dbd3f64701d184e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:23:19 compute-1 podman[226360]: 2025-09-30 21:23:19.649694145 +0000 UTC m=+0.171021124 container init a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:23:19 compute-1 podman[226360]: 2025-09-30 21:23:19.659354645 +0000 UTC m=+0.180681634 container start a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:23:19 compute-1 neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b[226375]: [NOTICE]   (226379) : New worker (226381) forked
Sep 30 21:23:19 compute-1 neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b[226375]: [NOTICE]   (226379) : Loading success.
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.721 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8c94fd31-501a-4da2-853a-b0a5df9f1c8e in datapath c23309b9-6834-440c-b70d-83c63e6f455b unbound from our chassis
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.724 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c23309b9-6834-440c-b70d-83c63e6f455b
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.743 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a9acfffc-42c0-43f4-abeb-9304b3717e13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.778 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5364bcf7-24ac-4263-8b14-31c1bce70a47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.783 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[108e8def-7e8a-41cb-bd0e-55fcffc184e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.829 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5aca0097-4fba-4ff1-ba66-7ae38a3c00fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.858 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ae6d56f4-65fa-43e4-b9cb-21d5033e3d78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc23309b9-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:11:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 306, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 306, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409453, 'reachable_time': 28654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226395, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.887 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[82350ce5-1761-4fd2-815a-1dfe26769455]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapc23309b9-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409469, 'tstamp': 409469}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226396, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc23309b9-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409473, 'tstamp': 409473}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226396, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.890 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc23309b9-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.936 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc23309b9-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.937 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.941 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc23309b9-60, col_values=(('external_ids', {'iface-id': '47235677-cd26-4edc-b4dc-078e745e8c3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.942 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.946 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 9b270a14-f424-4491-9f14-3084a3431112 in datapath acbf49ed-afb9-4cf1-a75d-f198ee3fb120 unbound from our chassis
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.948 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network acbf49ed-afb9-4cf1-a75d-f198ee3fb120
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.965 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[55d4635d-578c-491a-90c6-1edd330f1dd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.966 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapacbf49ed-a1 in ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.969 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapacbf49ed-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.969 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee36aec-0d58-44c8-a612-e958b8fce5ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.971 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a162ddc2-0dfd-4689-93fa-2d4df49485c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:19.987 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[34f8904d-b032-4d82-8d7f-ae810770861a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.996 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267399.9957345, bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:19 compute-1 nova_compute[192795]: 2025-09-30 21:23:19.997 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] VM Started (Lifecycle Event)
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.021 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.022 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7ea927-0091-40cb-ad54-d64c6feb38ff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.026 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267399.9960299, bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.027 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] VM Paused (Lifecycle Event)
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.045 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.049 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.063 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[75976d18-71ab-4094-8c16-f60c7e3cb931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.069 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:20 compute-1 NetworkManager[51724]: <info>  [1759267400.0712] manager: (tapacbf49ed-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.070 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dbd1f385-0b03-48bc-a8a9-51200f7161a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 systemd-udevd[226310]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.122 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a9148d8c-5d72-487f-b916-6fa52cab4760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.127 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2c02cb5e-9ebe-454a-8b44-8867f22015e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 NetworkManager[51724]: <info>  [1759267400.1654] device (tapacbf49ed-a0): carrier: link connected
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.174 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[54c1544e-6583-44c6-a15e-5d4859bbb7ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.197 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[133d345b-7905-4aea-a583-edac112cc583]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapacbf49ed-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:6f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409583, 'reachable_time': 23597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226407, 'error': None, 'target': 'ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.217 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9b02ee-7fb9-4fdd-a6f8-dec6441d7af8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9e:6fb5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409583, 'tstamp': 409583}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226408, 'error': None, 'target': 'ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.239 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[95bd6cbb-2bfb-48e0-b03e-0b0a2ccfc63f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapacbf49ed-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:6f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409583, 'reachable_time': 23597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226409, 'error': None, 'target': 'ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.287 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[803f1016-7114-45ea-943c-bb733e20ab0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.375 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[72417064-a142-47e3-a116-9d2366114cd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.377 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacbf49ed-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.377 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.378 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapacbf49ed-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:20 compute-1 kernel: tapacbf49ed-a0: entered promiscuous mode
Sep 30 21:23:20 compute-1 NetworkManager[51724]: <info>  [1759267400.3803] manager: (tapacbf49ed-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.383 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapacbf49ed-a0, col_values=(('external_ids', {'iface-id': '2b3e9bd9-52df-4f96-8ccb-b6303c636c64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:20 compute-1 ovn_controller[94902]: 2025-09-30T21:23:20Z|00163|binding|INFO|Releasing lport 2b3e9bd9-52df-4f96-8ccb-b6303c636c64 from this chassis (sb_readonly=0)
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.395 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/acbf49ed-afb9-4cf1-a75d-f198ee3fb120.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/acbf49ed-afb9-4cf1-a75d-f198ee3fb120.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:23:20 compute-1 nova_compute[192795]: 2025-09-30 21:23:20.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.396 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[481ffcbb-ddeb-43f1-957f-005f7a176aaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.397 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-acbf49ed-afb9-4cf1-a75d-f198ee3fb120
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/acbf49ed-afb9-4cf1-a75d-f198ee3fb120.pid.haproxy
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID acbf49ed-afb9-4cf1-a75d-f198ee3fb120
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:23:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:20.397 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'env', 'PROCESS_TAG=haproxy-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/acbf49ed-afb9-4cf1-a75d-f198ee3fb120.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:23:20 compute-1 podman[226442]: 2025-09-30 21:23:20.830381885 +0000 UTC m=+0.070100918 container create cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:23:20 compute-1 systemd[1]: Started libpod-conmon-cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c.scope.
Sep 30 21:23:20 compute-1 podman[226442]: 2025-09-30 21:23:20.790823841 +0000 UTC m=+0.030542974 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:23:20 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:23:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9712d8139c6764d82e2e15e25c7d1cce66f80eeae43d036ef447bcb197add4e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:23:20 compute-1 podman[226442]: 2025-09-30 21:23:20.937476938 +0000 UTC m=+0.177195981 container init cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:23:20 compute-1 podman[226442]: 2025-09-30 21:23:20.945013721 +0000 UTC m=+0.184732744 container start cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:23:20 compute-1 podman[226457]: 2025-09-30 21:23:20.990178346 +0000 UTC m=+0.100548936 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:23:20 compute-1 neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120[226458]: [NOTICE]   (226479) : New worker (226482) forked
Sep 30 21:23:20 compute-1 neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120[226458]: [NOTICE]   (226479) : Loading success.
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.014 2 DEBUG nova.network.neutron [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updated VIF entry in instance network info cache for port 8c94fd31-501a-4da2-853a-b0a5df9f1c8e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.015 2 DEBUG nova.network.neutron [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updating instance_info_cache with network_info: [{"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.052 2 DEBUG oslo_concurrency.lockutils [req-b10dc1be-2f59-46ec-b4ab-c90115ec9587 req-7a1d2819-aec6-4d81-b14b-fc18e1598c6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.092 2 DEBUG nova.compute.manager [req-ff75bf81-ccb5-4c7d-9b2d-1ec836a0903b req-50ded9ab-1d40-4103-8b58-c92737cedb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.095 2 DEBUG oslo_concurrency.lockutils [req-ff75bf81-ccb5-4c7d-9b2d-1ec836a0903b req-50ded9ab-1d40-4103-8b58-c92737cedb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.096 2 DEBUG oslo_concurrency.lockutils [req-ff75bf81-ccb5-4c7d-9b2d-1ec836a0903b req-50ded9ab-1d40-4103-8b58-c92737cedb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.096 2 DEBUG oslo_concurrency.lockutils [req-ff75bf81-ccb5-4c7d-9b2d-1ec836a0903b req-50ded9ab-1d40-4103-8b58-c92737cedb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.096 2 DEBUG nova.compute.manager [req-ff75bf81-ccb5-4c7d-9b2d-1ec836a0903b req-50ded9ab-1d40-4103-8b58-c92737cedb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No event matching network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 in dict_keys([('network-vif-plugged', '8c94fd31-501a-4da2-853a-b0a5df9f1c8e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.097 2 WARNING nova.compute.manager [req-ff75bf81-ccb5-4c7d-9b2d-1ec836a0903b req-50ded9ab-1d40-4103-8b58-c92737cedb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 for instance with vm_state building and task_state spawning.
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.802 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.803 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.804 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.804 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.804 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No event matching network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 in dict_keys([('network-vif-plugged', '8c94fd31-501a-4da2-853a-b0a5df9f1c8e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.805 2 WARNING nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 for instance with vm_state building and task_state spawning.
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.805 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.806 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.806 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.807 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.807 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Processing event network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.807 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.808 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.808 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.809 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.809 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No waiting events found dispatching network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.809 2 WARNING nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e for instance with vm_state building and task_state spawning.
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.811 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.816 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267401.8160093, bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.816 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] VM Resumed (Lifecycle Event)
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.827 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.837 2 INFO nova.virt.libvirt.driver [-] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Instance spawned successfully.
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.837 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.867 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.876 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.883 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.885 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.886 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.887 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.888 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.889 2 DEBUG nova.virt.libvirt.driver [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.897 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.968 2 INFO nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Took 13.90 seconds to spawn the instance on the hypervisor.
Sep 30 21:23:21 compute-1 nova_compute[192795]: 2025-09-30 21:23:21.968 2 DEBUG nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:22 compute-1 nova_compute[192795]: 2025-09-30 21:23:22.043 2 INFO nova.compute.manager [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Took 14.50 seconds to build instance.
Sep 30 21:23:22 compute-1 nova_compute[192795]: 2025-09-30 21:23:22.068 2 DEBUG oslo_concurrency.lockutils [None req-cf671705-4a31-4288-b797-abfe61b5c464 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:22 compute-1 nova_compute[192795]: 2025-09-30 21:23:22.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.861 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.862 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.863 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.863 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.863 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.880 2 INFO nova.compute.manager [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Terminating instance
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.894 2 DEBUG nova.compute.manager [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:23:23 compute-1 kernel: tape9d92195-9c (unregistering): left promiscuous mode
Sep 30 21:23:23 compute-1 NetworkManager[51724]: <info>  [1759267403.9171] device (tape9d92195-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:23:23 compute-1 ovn_controller[94902]: 2025-09-30T21:23:23Z|00164|binding|INFO|Releasing lport e9d92195-9ca5-43a8-8e02-f057dfdf8378 from this chassis (sb_readonly=0)
Sep 30 21:23:23 compute-1 ovn_controller[94902]: 2025-09-30T21:23:23Z|00165|binding|INFO|Setting lport e9d92195-9ca5-43a8-8e02-f057dfdf8378 down in Southbound
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:23 compute-1 ovn_controller[94902]: 2025-09-30T21:23:23Z|00166|binding|INFO|Removing iface tape9d92195-9c ovn-installed in OVS
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:23.940 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:63:7f 10.100.0.142'], port_security=['fa:16:3e:ef:63:7f 10.100.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.142/24', 'neutron:device_id': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c23309b9-6834-440c-b70d-83c63e6f455b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f3de690-43a5-4d81-baca-cfa7cf84724b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=e9d92195-9ca5-43a8-8e02-f057dfdf8378) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:23.942 103861 INFO neutron.agent.ovn.metadata.agent [-] Port e9d92195-9ca5-43a8-8e02-f057dfdf8378 in datapath c23309b9-6834-440c-b70d-83c63e6f455b unbound from our chassis
Sep 30 21:23:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:23.944 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c23309b9-6834-440c-b70d-83c63e6f455b
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:23 compute-1 kernel: tap9b270a14-f4 (unregistering): left promiscuous mode
Sep 30 21:23:23 compute-1 NetworkManager[51724]: <info>  [1759267403.9678] device (tap9b270a14-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:23:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:23.976 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a7717522-098f-4622-a65f-726735042edf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:23 compute-1 ovn_controller[94902]: 2025-09-30T21:23:23Z|00167|binding|INFO|Releasing lport 9b270a14-f424-4491-9f14-3084a3431112 from this chassis (sb_readonly=0)
Sep 30 21:23:23 compute-1 ovn_controller[94902]: 2025-09-30T21:23:23Z|00168|binding|INFO|Setting lport 9b270a14-f424-4491-9f14-3084a3431112 down in Southbound
Sep 30 21:23:23 compute-1 ovn_controller[94902]: 2025-09-30T21:23:23Z|00169|binding|INFO|Removing iface tap9b270a14-f4 ovn-installed in OVS
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:23.990 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:14:cc 10.100.1.90'], port_security=['fa:16:3e:19:14:cc 10.100.1.90'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.90/24', 'neutron:device_id': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ad8b189-10f5-4834-b0fc-ffc4a44401ab, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=9b270a14-f424-4491-9f14-3084a3431112) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:23 compute-1 nova_compute[192795]: 2025-09-30 21:23:23.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 kernel: tap8c94fd31-50 (unregistering): left promiscuous mode
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.012 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[11250721-cbcb-4e0b-96f0-7d3922422580]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 NetworkManager[51724]: <info>  [1759267404.0152] device (tap8c94fd31-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.017 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a75bba-9b09-433e-b465-debb9a3107c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 ovn_controller[94902]: 2025-09-30T21:23:24Z|00170|binding|INFO|Releasing lport 8c94fd31-501a-4da2-853a-b0a5df9f1c8e from this chassis (sb_readonly=0)
Sep 30 21:23:24 compute-1 ovn_controller[94902]: 2025-09-30T21:23:24Z|00171|binding|INFO|Setting lport 8c94fd31-501a-4da2-853a-b0a5df9f1c8e down in Southbound
Sep 30 21:23:24 compute-1 ovn_controller[94902]: 2025-09-30T21:23:24Z|00172|binding|INFO|Removing iface tap8c94fd31-50 ovn-installed in OVS
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.044 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:e7:bc 10.100.0.39'], port_security=['fa:16:3e:ac:e7:bc 10.100.0.39'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.39/24', 'neutron:device_id': 'bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c23309b9-6834-440c-b70d-83c63e6f455b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f3de690-43a5-4d81-baca-cfa7cf84724b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8c94fd31-501a-4da2-853a-b0a5df9f1c8e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.068 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7ecd139f-a507-4669-b021-5f06d680df2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000027.scope: Deactivated successfully.
Sep 30 21:23:24 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000027.scope: Consumed 3.341s CPU time.
Sep 30 21:23:24 compute-1 systemd-machined[152783]: Machine qemu-21-instance-00000027 terminated.
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.097 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[230a5362-5e88-433b-b513-b6860e34b285]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc23309b9-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:11:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 832, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 832, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409453, 'reachable_time': 28654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226514, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 NetworkManager[51724]: <info>  [1759267404.1214] manager: (tape9d92195-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.128 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[36ac93b2-bad0-43d6-8387-db743669c5d2]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapc23309b9-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409469, 'tstamp': 409469}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226515, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc23309b9-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409473, 'tstamp': 409473}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226515, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.131 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc23309b9-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 NetworkManager[51724]: <info>  [1759267404.1383] manager: (tap9b270a14-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.149 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc23309b9-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.150 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:24 compute-1 NetworkManager[51724]: <info>  [1759267404.1516] manager: (tap8c94fd31-50): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.151 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc23309b9-60, col_values=(('external_ids', {'iface-id': '47235677-cd26-4edc-b4dc-078e745e8c3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.151 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.153 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 9b270a14-f424-4491-9f14-3084a3431112 in datapath acbf49ed-afb9-4cf1-a75d-f198ee3fb120 unbound from our chassis
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.156 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network acbf49ed-afb9-4cf1-a75d-f198ee3fb120, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.157 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[50f47d6c-f8ca-405e-b9a6-be03e726f308]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.158 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120 namespace which is not needed anymore
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.223 2 INFO nova.virt.libvirt.driver [-] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Instance destroyed successfully.
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.224 2 DEBUG nova.objects.instance [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lazy-loading 'resources' on Instance uuid bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.250 2 DEBUG nova.virt.libvirt.vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:23:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:23:22Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.251 2 DEBUG nova.network.os_vif_util [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "address": "fa:16:3e:ef:63:7f", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9d92195-9c", "ovs_interfaceid": "e9d92195-9ca5-43a8-8e02-f057dfdf8378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.251 2 DEBUG nova.network.os_vif_util [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:63:7f,bridge_name='br-int',has_traffic_filtering=True,id=e9d92195-9ca5-43a8-8e02-f057dfdf8378,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9d92195-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.252 2 DEBUG os_vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:63:7f,bridge_name='br-int',has_traffic_filtering=True,id=e9d92195-9ca5-43a8-8e02-f057dfdf8378,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9d92195-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.254 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9d92195-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.267 2 INFO os_vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:63:7f,bridge_name='br-int',has_traffic_filtering=True,id=e9d92195-9ca5-43a8-8e02-f057dfdf8378,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9d92195-9c')
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.268 2 DEBUG nova.virt.libvirt.vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:23:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:23:22Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.268 2 DEBUG nova.network.os_vif_util [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.269 2 DEBUG nova.network.os_vif_util [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:14:cc,bridge_name='br-int',has_traffic_filtering=True,id=9b270a14-f424-4491-9f14-3084a3431112,network=Network(acbf49ed-afb9-4cf1-a75d-f198ee3fb120),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b270a14-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.269 2 DEBUG os_vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:14:cc,bridge_name='br-int',has_traffic_filtering=True,id=9b270a14-f424-4491-9f14-3084a3431112,network=Network(acbf49ed-afb9-4cf1-a75d-f198ee3fb120),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b270a14-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.270 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b270a14-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.278 2 INFO os_vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:14:cc,bridge_name='br-int',has_traffic_filtering=True,id=9b270a14-f424-4491-9f14-3084a3431112,network=Network(acbf49ed-afb9-4cf1-a75d-f198ee3fb120),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b270a14-f4')
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.279 2 DEBUG nova.virt.libvirt.vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1854388411',display_name='tempest-ServersTestMultiNic-server-1854388411',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1854388411',id=39,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:23:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-py8p7vm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:23:22Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.279 2 DEBUG nova.network.os_vif_util [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.280 2 DEBUG nova.network.os_vif_util [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e7:bc,bridge_name='br-int',has_traffic_filtering=True,id=8c94fd31-501a-4da2-853a-b0a5df9f1c8e,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c94fd31-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.280 2 DEBUG os_vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e7:bc,bridge_name='br-int',has_traffic_filtering=True,id=8c94fd31-501a-4da2-853a-b0a5df9f1c8e,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c94fd31-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c94fd31-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.286 2 INFO os_vif [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e7:bc,bridge_name='br-int',has_traffic_filtering=True,id=8c94fd31-501a-4da2-853a-b0a5df9f1c8e,network=Network(c23309b9-6834-440c-b70d-83c63e6f455b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c94fd31-50')
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.286 2 INFO nova.virt.libvirt.driver [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Deleting instance files /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa_del
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.287 2 INFO nova.virt.libvirt.driver [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Deletion of /var/lib/nova/instances/bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa_del complete
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120[226458]: [NOTICE]   (226479) : haproxy version is 2.8.14-c23fe91
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120[226458]: [NOTICE]   (226479) : path to executable is /usr/sbin/haproxy
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120[226458]: [WARNING]  (226479) : Exiting Master process...
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120[226458]: [ALERT]    (226479) : Current worker (226482) exited with code 143 (Terminated)
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120[226458]: [WARNING]  (226479) : All workers exited. Exiting... (0)
Sep 30 21:23:24 compute-1 systemd[1]: libpod-cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c.scope: Deactivated successfully.
Sep 30 21:23:24 compute-1 podman[226573]: 2025-09-30 21:23:24.313864689 +0000 UTC m=+0.049847222 container died cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:23:24 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:23:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-9712d8139c6764d82e2e15e25c7d1cce66f80eeae43d036ef447bcb197add4e3-merged.mount: Deactivated successfully.
Sep 30 21:23:24 compute-1 podman[226573]: 2025-09-30 21:23:24.352574101 +0000 UTC m=+0.088556604 container cleanup cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:23:24 compute-1 systemd[1]: libpod-conmon-cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c.scope: Deactivated successfully.
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.380 2 INFO nova.compute.manager [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Took 0.49 seconds to destroy the instance on the hypervisor.
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.381 2 DEBUG oslo.service.loopingcall [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.382 2 DEBUG nova.compute.manager [-] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.382 2 DEBUG nova.network.neutron [-] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.386 2 DEBUG nova.compute.manager [req-21a3199a-4443-45c0-9e8f-0364da400653 req-b0263d00-f677-43b8-b45b-bfd61df6bb56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-unplugged-9b270a14-f424-4491-9f14-3084a3431112 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.387 2 DEBUG oslo_concurrency.lockutils [req-21a3199a-4443-45c0-9e8f-0364da400653 req-b0263d00-f677-43b8-b45b-bfd61df6bb56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.387 2 DEBUG oslo_concurrency.lockutils [req-21a3199a-4443-45c0-9e8f-0364da400653 req-b0263d00-f677-43b8-b45b-bfd61df6bb56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.387 2 DEBUG oslo_concurrency.lockutils [req-21a3199a-4443-45c0-9e8f-0364da400653 req-b0263d00-f677-43b8-b45b-bfd61df6bb56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.387 2 DEBUG nova.compute.manager [req-21a3199a-4443-45c0-9e8f-0364da400653 req-b0263d00-f677-43b8-b45b-bfd61df6bb56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No waiting events found dispatching network-vif-unplugged-9b270a14-f424-4491-9f14-3084a3431112 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.387 2 DEBUG nova.compute.manager [req-21a3199a-4443-45c0-9e8f-0364da400653 req-b0263d00-f677-43b8-b45b-bfd61df6bb56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-unplugged-9b270a14-f424-4491-9f14-3084a3431112 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:23:24 compute-1 podman[226605]: 2025-09-30 21:23:24.428851384 +0000 UTC m=+0.052998637 container remove cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.436 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c5aa35f5-f35f-4d10-a605-e8d115fb0f04]: (4, ('Tue Sep 30 09:23:24 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120 (cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c)\ncd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c\nTue Sep 30 09:23:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120 (cd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c)\ncd16ad2ee56cd03bc21ab097db03030d8a69a1a1db8c91fd6a5f673a7c1d609c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.438 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[85de6521-092d-4b4c-927c-13f3c6afff8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.439 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacbf49ed-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 kernel: tapacbf49ed-a0: left promiscuous mode
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.456 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5e898d06-f897-4636-b1a3-bc3d4d252576]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.491 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[655ee90d-e360-4930-8106-86cd7913d2e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.492 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5626a9-4dfc-4091-aa8d-b4c177d279c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.519 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[19cf4f21-b5c2-4c93-9879-f11693e7342d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409572, 'reachable_time': 23253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226620, 'error': None, 'target': 'ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 systemd[1]: run-netns-ovnmeta\x2dacbf49ed\x2dafb9\x2d4cf1\x2da75d\x2df198ee3fb120.mount: Deactivated successfully.
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.522 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-acbf49ed-afb9-4cf1-a75d-f198ee3fb120 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.522 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[94514eda-12f8-4c00-af46-06d25033d8c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.524 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8c94fd31-501a-4da2-853a-b0a5df9f1c8e in datapath c23309b9-6834-440c-b70d-83c63e6f455b unbound from our chassis
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.526 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c23309b9-6834-440c-b70d-83c63e6f455b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.526 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0e76c6-0620-4337-b5f1-1d9cb998909a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.527 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b namespace which is not needed anymore
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b[226375]: [NOTICE]   (226379) : haproxy version is 2.8.14-c23fe91
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b[226375]: [NOTICE]   (226379) : path to executable is /usr/sbin/haproxy
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b[226375]: [WARNING]  (226379) : Exiting Master process...
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b[226375]: [ALERT]    (226379) : Current worker (226381) exited with code 143 (Terminated)
Sep 30 21:23:24 compute-1 neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b[226375]: [WARNING]  (226379) : All workers exited. Exiting... (0)
Sep 30 21:23:24 compute-1 systemd[1]: libpod-a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658.scope: Deactivated successfully.
Sep 30 21:23:24 compute-1 podman[226638]: 2025-09-30 21:23:24.685947095 +0000 UTC m=+0.048388224 container died a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 21:23:24 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658-userdata-shm.mount: Deactivated successfully.
Sep 30 21:23:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-824d8d5d31ac2d8ba45b191192b59d721bb7455221f14596dbd3f64701d184e0-merged.mount: Deactivated successfully.
Sep 30 21:23:24 compute-1 podman[226638]: 2025-09-30 21:23:24.717795802 +0000 UTC m=+0.080236941 container cleanup a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:23:24 compute-1 systemd[1]: libpod-conmon-a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658.scope: Deactivated successfully.
Sep 30 21:23:24 compute-1 podman[226669]: 2025-09-30 21:23:24.800626141 +0000 UTC m=+0.055004501 container remove a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.810 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e5e6c2-3d45-451c-a057-696012c5d83c]: (4, ('Tue Sep 30 09:23:24 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b (a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658)\na686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658\nTue Sep 30 09:23:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b (a686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658)\na686a8e173432901058b50504355da4583a574a6bc3145a23d5c668ee9187658\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.812 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7d99a962-c476-4e7c-8af3-b65139ef7663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.813 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc23309b9-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 kernel: tapc23309b9-60: left promiscuous mode
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 nova_compute[192795]: 2025-09-30 21:23:24.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.836 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cf74db81-7671-4430-ad94-827ec50f0fbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.859 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e32c0aee-a0b0-4db0-93b1-108f6ca1aff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.860 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee93c92-1f3b-499a-b727-7e9a8118350c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.889 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[49bd28a4-4e38-40c9-a154-1df799dc91e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409444, 'reachable_time': 44234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226684, 'error': None, 'target': 'ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.892 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c23309b9-6834-440c-b70d-83c63e6f455b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:23:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:24.893 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[880adc77-ea02-43a9-8fb6-3459c8d4f756]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:25 compute-1 nova_compute[192795]: 2025-09-30 21:23:25.147 2 DEBUG nova.compute.manager [req-74fe2347-2c78-4bb4-9746-fa2087c1a3bb req-6158dbe3-d218-4796-98f3-94bb408ff9f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-deleted-e9d92195-9ca5-43a8-8e02-f057dfdf8378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:25 compute-1 nova_compute[192795]: 2025-09-30 21:23:25.147 2 INFO nova.compute.manager [req-74fe2347-2c78-4bb4-9746-fa2087c1a3bb req-6158dbe3-d218-4796-98f3-94bb408ff9f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Neutron deleted interface e9d92195-9ca5-43a8-8e02-f057dfdf8378; detaching it from the instance and deleting it from the info cache
Sep 30 21:23:25 compute-1 nova_compute[192795]: 2025-09-30 21:23:25.148 2 DEBUG nova.network.neutron [req-74fe2347-2c78-4bb4-9746-fa2087c1a3bb req-6158dbe3-d218-4796-98f3-94bb408ff9f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updating instance_info_cache with network_info: [{"id": "9b270a14-f424-4491-9f14-3084a3431112", "address": "fa:16:3e:19:14:cc", "network": {"id": "acbf49ed-afb9-4cf1-a75d-f198ee3fb120", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-125824459", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.90", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b270a14-f4", "ovs_interfaceid": "9b270a14-f424-4491-9f14-3084a3431112", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "address": "fa:16:3e:ac:e7:bc", "network": {"id": "c23309b9-6834-440c-b70d-83c63e6f455b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1453074316", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c94fd31-50", "ovs_interfaceid": "8c94fd31-501a-4da2-853a-b0a5df9f1c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:25 compute-1 nova_compute[192795]: 2025-09-30 21:23:25.203 2 DEBUG nova.compute.manager [req-74fe2347-2c78-4bb4-9746-fa2087c1a3bb req-6158dbe3-d218-4796-98f3-94bb408ff9f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Detach interface failed, port_id=e9d92195-9ca5-43a8-8e02-f057dfdf8378, reason: Instance bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:23:25 compute-1 systemd[1]: run-netns-ovnmeta\x2dc23309b9\x2d6834\x2d440c\x2db70d\x2d83c63e6f455b.mount: Deactivated successfully.
Sep 30 21:23:25 compute-1 nova_compute[192795]: 2025-09-30 21:23:25.892 2 DEBUG nova.network.neutron [-] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:25 compute-1 nova_compute[192795]: 2025-09-30 21:23:25.917 2 INFO nova.compute.manager [-] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Took 1.54 seconds to deallocate network for instance.
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.005 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.006 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.115 2 DEBUG nova.compute.provider_tree [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.131 2 DEBUG nova.scheduler.client.report [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.152 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.200 2 INFO nova.scheduler.client.report [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Deleted allocations for instance bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.254 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-unplugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.255 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.256 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.257 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.257 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No waiting events found dispatching network-vif-unplugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.258 2 WARNING nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-unplugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 for instance with vm_state deleted and task_state None.
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.258 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.259 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.260 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.260 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.261 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No waiting events found dispatching network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.261 2 WARNING nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-plugged-e9d92195-9ca5-43a8-8e02-f057dfdf8378 for instance with vm_state deleted and task_state None.
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.262 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-unplugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.263 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.263 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.264 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.264 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No waiting events found dispatching network-vif-unplugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.265 2 WARNING nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-unplugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e for instance with vm_state deleted and task_state None.
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.265 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.266 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.267 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.267 2 DEBUG oslo_concurrency.lockutils [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.268 2 DEBUG nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No waiting events found dispatching network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.268 2 WARNING nova.compute.manager [req-39f278bd-966e-4c69-a2a3-267200b965f8 req-a00c2207-0a45-4189-9a56-f7a30d2cd42d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-plugged-8c94fd31-501a-4da2-853a-b0a5df9f1c8e for instance with vm_state deleted and task_state None.
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.285 2 DEBUG oslo_concurrency.lockutils [None req-b13e6dd2-e3b0-4667-9cf3-9e3d7de21612 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.493 2 DEBUG nova.compute.manager [req-c37951d2-8178-4200-be84-eaf98669a1ca req-e293e499-6b72-481e-ad7c-2684f3cf98c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.494 2 DEBUG oslo_concurrency.lockutils [req-c37951d2-8178-4200-be84-eaf98669a1ca req-e293e499-6b72-481e-ad7c-2684f3cf98c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.494 2 DEBUG oslo_concurrency.lockutils [req-c37951d2-8178-4200-be84-eaf98669a1ca req-e293e499-6b72-481e-ad7c-2684f3cf98c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.495 2 DEBUG oslo_concurrency.lockutils [req-c37951d2-8178-4200-be84-eaf98669a1ca req-e293e499-6b72-481e-ad7c-2684f3cf98c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.495 2 DEBUG nova.compute.manager [req-c37951d2-8178-4200-be84-eaf98669a1ca req-e293e499-6b72-481e-ad7c-2684f3cf98c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] No waiting events found dispatching network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.496 2 WARNING nova.compute.manager [req-c37951d2-8178-4200-be84-eaf98669a1ca req-e293e499-6b72-481e-ad7c-2684f3cf98c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received unexpected event network-vif-plugged-9b270a14-f424-4491-9f14-3084a3431112 for instance with vm_state deleted and task_state None.
Sep 30 21:23:26 compute-1 nova_compute[192795]: 2025-09-30 21:23:26.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:27 compute-1 nova_compute[192795]: 2025-09-30 21:23:27.226 2 DEBUG nova.compute.manager [req-95103e30-112e-4d95-8a58-2ed6282ee1f7 req-b831523b-7f30-4a54-a4af-8e30d9cf9cb9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-deleted-9b270a14-f424-4491-9f14-3084a3431112 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:27 compute-1 nova_compute[192795]: 2025-09-30 21:23:27.227 2 DEBUG nova.compute.manager [req-95103e30-112e-4d95-8a58-2ed6282ee1f7 req-b831523b-7f30-4a54-a4af-8e30d9cf9cb9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Received event network-vif-deleted-8c94fd31-501a-4da2-853a-b0a5df9f1c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:28 compute-1 nova_compute[192795]: 2025-09-30 21:23:28.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:28 compute-1 podman[226685]: 2025-09-30 21:23:28.270362085 +0000 UTC m=+0.095618594 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:23:28 compute-1 podman[226687]: 2025-09-30 21:23:28.282489052 +0000 UTC m=+0.087883467 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:23:28 compute-1 podman[226686]: 2025-09-30 21:23:28.326931917 +0000 UTC m=+0.138065756 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:23:29 compute-1 nova_compute[192795]: 2025-09-30 21:23:29.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:31 compute-1 nova_compute[192795]: 2025-09-30 21:23:31.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:32 compute-1 podman[226754]: 2025-09-30 21:23:32.231598648 +0000 UTC m=+0.076388456 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:23:34 compute-1 nova_compute[192795]: 2025-09-30 21:23:34.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:35.468 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:35 compute-1 nova_compute[192795]: 2025-09-30 21:23:35.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:35.471 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:23:35 compute-1 nova_compute[192795]: 2025-09-30 21:23:35.714 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:36.475 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:36 compute-1 nova_compute[192795]: 2025-09-30 21:23:36.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:38.684 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:38.685 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:38.685 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.221 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267404.2202137, bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.222 2 INFO nova.compute.manager [-] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] VM Stopped (Lifecycle Event)
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.245 2 DEBUG nova.compute.manager [None req-6ba8f92d-d752-4e48-9116-acb3eb41e2b3 - - - - - -] [instance: bfd4e9ed-c49b-48dc-9e74-8f916fae2cfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.460 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.460 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.487 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.597 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.598 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.605 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.605 2 INFO nova.compute.claims [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.768 2 DEBUG nova.compute.provider_tree [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.785 2 DEBUG nova.scheduler.client.report [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.811 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.812 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.865 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.866 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.886 2 INFO nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:23:39 compute-1 nova_compute[192795]: 2025-09-30 21:23:39.902 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.021 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.024 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.025 2 INFO nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Creating image(s)
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.026 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "/var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.027 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "/var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.028 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "/var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.056 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.160 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.162 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.163 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.180 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:40 compute-1 podman[226776]: 2025-09-30 21:23:40.241845296 +0000 UTC m=+0.072514553 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.254 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.255 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.301 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.303 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.303 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.378 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.381 2 DEBUG nova.virt.disk.api [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Checking if we can resize image /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.382 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.415 2 DEBUG nova.policy [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '80abdc8ce51444378234b07daa877ac7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.449 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.451 2 DEBUG nova.virt.disk.api [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Cannot resize image /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.451 2 DEBUG nova.objects.instance [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lazy-loading 'migration_context' on Instance uuid 9eeaaff3-c30b-4b21-8955-570ddc4444b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.474 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.474 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Ensure instance console log exists: /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.475 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.476 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.477 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.718 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.719 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.719 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.720 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.959 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.961 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5703MB free_disk=73.46217346191406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.961 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:40 compute-1 nova_compute[192795]: 2025-09-30 21:23:40.961 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.035 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Successfully created port: 7c19ac04-f090-41a4-8712-2a1661fbf4db _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.060 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 9eeaaff3-c30b-4b21-8955-570ddc4444b6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.060 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.061 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.118 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.137 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.169 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.169 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.668 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Successfully created port: d0acc879-5ca2-4e3f-94c9-c24f4b88de6a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:23:41 compute-1 nova_compute[192795]: 2025-09-30 21:23:41.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.169 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:42 compute-1 podman[226810]: 2025-09-30 21:23:42.221705517 +0000 UTC m=+0.056558333 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:23:42 compute-1 podman[226809]: 2025-09-30 21:23:42.236112405 +0000 UTC m=+0.074632060 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9)
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.433 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Successfully updated port: 7c19ac04-f090-41a4-8712-2a1661fbf4db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.549 2 DEBUG nova.compute.manager [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-changed-7c19ac04-f090-41a4-8712-2a1661fbf4db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.549 2 DEBUG nova.compute.manager [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Refreshing instance network info cache due to event network-changed-7c19ac04-f090-41a4-8712-2a1661fbf4db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.550 2 DEBUG oslo_concurrency.lockutils [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.550 2 DEBUG oslo_concurrency.lockutils [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.550 2 DEBUG nova.network.neutron [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Refreshing network info cache for port 7c19ac04-f090-41a4-8712-2a1661fbf4db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:23:42 compute-1 nova_compute[192795]: 2025-09-30 21:23:42.768 2 DEBUG nova.network.neutron [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:43 compute-1 nova_compute[192795]: 2025-09-30 21:23:43.264 2 DEBUG nova.network.neutron [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:43 compute-1 nova_compute[192795]: 2025-09-30 21:23:43.282 2 DEBUG oslo_concurrency.lockutils [req-d10bf76e-364d-4469-bd63-5698f7610fa7 req-a3239398-48ae-45e8-a67a-68eedbb44e8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:43 compute-1 nova_compute[192795]: 2025-09-30 21:23:43.449 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Successfully updated port: d0acc879-5ca2-4e3f-94c9-c24f4b88de6a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:23:43 compute-1 nova_compute[192795]: 2025-09-30 21:23:43.464 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:43 compute-1 nova_compute[192795]: 2025-09-30 21:23:43.464 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquired lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:43 compute-1 nova_compute[192795]: 2025-09-30 21:23:43.464 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:23:43 compute-1 nova_compute[192795]: 2025-09-30 21:23:43.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:44 compute-1 nova_compute[192795]: 2025-09-30 21:23:44.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:44 compute-1 nova_compute[192795]: 2025-09-30 21:23:44.399 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:44 compute-1 nova_compute[192795]: 2025-09-30 21:23:44.776 2 DEBUG nova.compute.manager [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-changed-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:44 compute-1 nova_compute[192795]: 2025-09-30 21:23:44.776 2 DEBUG nova.compute.manager [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Refreshing instance network info cache due to event network-changed-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:44 compute-1 nova_compute[192795]: 2025-09-30 21:23:44.776 2 DEBUG oslo_concurrency.lockutils [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:46 compute-1 nova_compute[192795]: 2025-09-30 21:23:46.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:46 compute-1 nova_compute[192795]: 2025-09-30 21:23:46.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:23:46 compute-1 nova_compute[192795]: 2025-09-30 21:23:46.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:23:46 compute-1 nova_compute[192795]: 2025-09-30 21:23:46.727 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:23:46 compute-1 nova_compute[192795]: 2025-09-30 21:23:46.728 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:23:46 compute-1 nova_compute[192795]: 2025-09-30 21:23:46.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.647 2 DEBUG nova.network.neutron [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Updating instance_info_cache with network_info: [{"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.669 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Releasing lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.670 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Instance network_info: |[{"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.670 2 DEBUG oslo_concurrency.lockutils [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.671 2 DEBUG nova.network.neutron [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Refreshing network info cache for port d0acc879-5ca2-4e3f-94c9-c24f4b88de6a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.674 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Start _get_guest_xml network_info=[{"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.679 2 WARNING nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.686 2 DEBUG nova.virt.libvirt.host [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.687 2 DEBUG nova.virt.libvirt.host [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.696 2 DEBUG nova.virt.libvirt.host [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.697 2 DEBUG nova.virt.libvirt.host [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.699 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.699 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.699 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.700 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.700 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.700 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.701 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.701 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.701 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.701 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.702 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.702 2 DEBUG nova.virt.hardware [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.707 2 DEBUG nova.virt.libvirt.vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1249593289',display_name='tempest-ServersTestMultiNic-server-1249593289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1249593289',id=43,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-wz2oapbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-projec
t-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:39Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=9eeaaff3-c30b-4b21-8955-570ddc4444b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.708 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.709 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=7c19ac04-f090-41a4-8712-2a1661fbf4db,network=Network(2f8397c5-7d5c-493b-98d1-beedbfee4a07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c19ac04-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.710 2 DEBUG nova.virt.libvirt.vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1249593289',display_name='tempest-ServersTestMultiNic-server-1249593289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1249593289',id=43,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-wz2oapbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-projec
t-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:39Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=9eeaaff3-c30b-4b21-8955-570ddc4444b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.710 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.711 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:51:60,bridge_name='br-int',has_traffic_filtering=True,id=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a,network=Network(df283875-3d9d-495f-9575-5f8e249bc3a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0acc879-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.712 2 DEBUG nova.objects.instance [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9eeaaff3-c30b-4b21-8955-570ddc4444b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.731 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <uuid>9eeaaff3-c30b-4b21-8955-570ddc4444b6</uuid>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <name>instance-0000002b</name>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersTestMultiNic-server-1249593289</nova:name>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:23:47</nova:creationTime>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:user uuid="80abdc8ce51444378234b07daa877ac7">tempest-ServersTestMultiNic-108666806-project-member</nova:user>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:project uuid="da6a860bf9ab4d91b946d9a35e448d94">tempest-ServersTestMultiNic-108666806</nova:project>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:port uuid="7c19ac04-f090-41a4-8712-2a1661fbf4db">
Sep 30 21:23:47 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.158" ipVersion="4"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         <nova:port uuid="d0acc879-5ca2-4e3f-94c9-c24f4b88de6a">
Sep 30 21:23:47 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.1.194" ipVersion="4"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <system>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <entry name="serial">9eeaaff3-c30b-4b21-8955-570ddc4444b6</entry>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <entry name="uuid">9eeaaff3-c30b-4b21-8955-570ddc4444b6</entry>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </system>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <os>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   </os>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <features>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   </features>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk.config"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:26:b2:36"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <target dev="tap7c19ac04-f0"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:eb:51:60"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <target dev="tapd0acc879-5c"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/console.log" append="off"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <video>
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </video>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:23:47 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:23:47 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:23:47 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:23:47 compute-1 nova_compute[192795]: </domain>
Sep 30 21:23:47 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.733 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Preparing to wait for external event network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.733 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.734 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.734 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.734 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Preparing to wait for external event network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.734 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.735 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.735 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.736 2 DEBUG nova.virt.libvirt.vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1249593289',display_name='tempest-ServersTestMultiNic-server-1249593289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1249593289',id=43,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-wz2oapbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:39Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=9eeaaff3-c30b-4b21-8955-570ddc4444b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.736 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.737 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=7c19ac04-f090-41a4-8712-2a1661fbf4db,network=Network(2f8397c5-7d5c-493b-98d1-beedbfee4a07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c19ac04-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.737 2 DEBUG os_vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=7c19ac04-f090-41a4-8712-2a1661fbf4db,network=Network(2f8397c5-7d5c-493b-98d1-beedbfee4a07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c19ac04-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c19ac04-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c19ac04-f0, col_values=(('external_ids', {'iface-id': '7c19ac04-f090-41a4-8712-2a1661fbf4db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:b2:36', 'vm-uuid': '9eeaaff3-c30b-4b21-8955-570ddc4444b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:47 compute-1 NetworkManager[51724]: <info>  [1759267427.7477] manager: (tap7c19ac04-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.754 2 INFO os_vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=7c19ac04-f090-41a4-8712-2a1661fbf4db,network=Network(2f8397c5-7d5c-493b-98d1-beedbfee4a07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c19ac04-f0')
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.755 2 DEBUG nova.virt.libvirt.vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1249593289',display_name='tempest-ServersTestMultiNic-server-1249593289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1249593289',id=43,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-wz2oapbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:39Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=9eeaaff3-c30b-4b21-8955-570ddc4444b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.756 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.757 2 DEBUG nova.network.os_vif_util [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:51:60,bridge_name='br-int',has_traffic_filtering=True,id=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a,network=Network(df283875-3d9d-495f-9575-5f8e249bc3a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0acc879-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.757 2 DEBUG os_vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:51:60,bridge_name='br-int',has_traffic_filtering=True,id=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a,network=Network(df283875-3d9d-495f-9575-5f8e249bc3a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0acc879-5c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.758 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.758 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.761 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0acc879-5c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.762 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd0acc879-5c, col_values=(('external_ids', {'iface-id': 'd0acc879-5ca2-4e3f-94c9-c24f4b88de6a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:51:60', 'vm-uuid': '9eeaaff3-c30b-4b21-8955-570ddc4444b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 NetworkManager[51724]: <info>  [1759267427.7649] manager: (tapd0acc879-5c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.773 2 INFO os_vif [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:51:60,bridge_name='br-int',has_traffic_filtering=True,id=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a,network=Network(df283875-3d9d-495f-9575-5f8e249bc3a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0acc879-5c')
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.834 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.835 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.835 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No VIF found with MAC fa:16:3e:26:b2:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.836 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] No VIF found with MAC fa:16:3e:eb:51:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:23:47 compute-1 nova_compute[192795]: 2025-09-30 21:23:47.836 2 INFO nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Using config drive
Sep 30 21:23:48 compute-1 nova_compute[192795]: 2025-09-30 21:23:48.723 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:48 compute-1 nova_compute[192795]: 2025-09-30 21:23:48.804 2 INFO nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Creating config drive at /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk.config
Sep 30 21:23:48 compute-1 nova_compute[192795]: 2025-09-30 21:23:48.809 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7f2xpx2l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:48 compute-1 nova_compute[192795]: 2025-09-30 21:23:48.937 2 DEBUG oslo_concurrency.processutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7f2xpx2l" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.0103] manager: (tap7c19ac04-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Sep 30 21:23:49 compute-1 kernel: tap7c19ac04-f0: entered promiscuous mode
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00173|binding|INFO|Claiming lport 7c19ac04-f090-41a4-8712-2a1661fbf4db for this chassis.
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00174|binding|INFO|7c19ac04-f090-41a4-8712-2a1661fbf4db: Claiming fa:16:3e:26:b2:36 10.100.0.158
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.0332] manager: (tapd0acc879-5c): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.036 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b2:36 10.100.0.158'], port_security=['fa:16:3e:26:b2:36 10.100.0.158'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.158/24', 'neutron:device_id': '9eeaaff3-c30b-4b21-8955-570ddc4444b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d069bc96-e0bf-415d-b198-5dbfe173420d, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=7c19ac04-f090-41a4-8712-2a1661fbf4db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.037 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 7c19ac04-f090-41a4-8712-2a1661fbf4db in datapath 2f8397c5-7d5c-493b-98d1-beedbfee4a07 bound to our chassis
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.039 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2f8397c5-7d5c-493b-98d1-beedbfee4a07
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.055 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c00b91bb-8fff-494e-8284-6c49669b3286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.057 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2f8397c5-71 in ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.058 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2f8397c5-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.059 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4da77b62-18d1-461c-9c26-e8aa856d67e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.061 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[78387832-cc2b-4a12-ba0a-61cb7a4b5550]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 systemd-udevd[226880]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:49 compute-1 systemd-udevd[226879]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.073 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a3494432-51af-4296-924f-a5e14af33900]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.0792] device (tap7c19ac04-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.0807] device (tap7c19ac04-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:23:49 compute-1 kernel: tapd0acc879-5c: entered promiscuous mode
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00175|binding|INFO|Claiming lport d0acc879-5ca2-4e3f-94c9-c24f4b88de6a for this chassis.
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00176|binding|INFO|d0acc879-5ca2-4e3f-94c9-c24f4b88de6a: Claiming fa:16:3e:eb:51:60 10.100.1.194
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.0948] device (tapd0acc879-5c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00177|binding|INFO|Setting lport 7c19ac04-f090-41a4-8712-2a1661fbf4db ovn-installed in OVS
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.098 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:51:60 10.100.1.194'], port_security=['fa:16:3e:eb:51:60 10.100.1.194'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.194/24', 'neutron:device_id': '9eeaaff3-c30b-4b21-8955-570ddc4444b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df283875-3d9d-495f-9575-5f8e249bc3a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d38a0621-ad3f-4193-b927-6131cb09407c, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00178|binding|INFO|Setting lport 7c19ac04-f090-41a4-8712-2a1661fbf4db up in Southbound
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.099 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f88f08a8-9994-416f-a71d-413f8716acba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.1018] device (tapd0acc879-5c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:23:49 compute-1 systemd-machined[152783]: New machine qemu-22-instance-0000002b.
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 systemd[1]: Started Virtual Machine qemu-22-instance-0000002b.
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00179|binding|INFO|Setting lport d0acc879-5ca2-4e3f-94c9-c24f4b88de6a ovn-installed in OVS
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00180|binding|INFO|Setting lport d0acc879-5ca2-4e3f-94c9-c24f4b88de6a up in Southbound
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.142 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[dd742515-8d48-4208-9902-3ba733c2b661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.149 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef54f4b-9dd2-4080-b6c2-b538d61d545f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 systemd-udevd[226887]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.1504] manager: (tap2f8397c5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.184 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ac508de8-4a59-49d6-b87f-c8acbde247c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.190 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a1521aea-8cc0-439c-98e8-ad26a62bd79f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.2154] device (tap2f8397c5-70): carrier: link connected
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.221 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcc7542-c1f2-45b0-933c-dd9c73207529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.241 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1556a091-450b-4b43-99a0-9e2fad71bab6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2f8397c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:d0:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412488, 'reachable_time': 24163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226915, 'error': None, 'target': 'ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.264 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2e638b57-a4fc-4ace-8f86-100101db7d5b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:d0dc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412488, 'tstamp': 412488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226916, 'error': None, 'target': 'ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.285 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3882d8b9-a70e-4c0f-89cb-2f74c46ce33c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2f8397c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:d0:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412488, 'reachable_time': 24163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226917, 'error': None, 'target': 'ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.325 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9fcb45bd-9bd7-4f52-8216-57934c3df801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.397 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8e1ca1-b03a-45a2-b542-104826fb9128]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.398 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f8397c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.399 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.399 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f8397c5-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:49 compute-1 kernel: tap2f8397c5-70: entered promiscuous mode
Sep 30 21:23:49 compute-1 NetworkManager[51724]: <info>  [1759267429.4019] manager: (tap2f8397c5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.404 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2f8397c5-70, col_values=(('external_ids', {'iface-id': '93800c15-188d-458d-a332-4b4f0b15344c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:49 compute-1 ovn_controller[94902]: 2025-09-30T21:23:49Z|00181|binding|INFO|Releasing lport 93800c15-188d-458d-a332-4b4f0b15344c from this chassis (sb_readonly=0)
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.406 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2f8397c5-7d5c-493b-98d1-beedbfee4a07.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2f8397c5-7d5c-493b-98d1-beedbfee4a07.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.407 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c642e29-7299-4c13-a8f2-963a41672f9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.408 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-2f8397c5-7d5c-493b-98d1-beedbfee4a07
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/2f8397c5-7d5c-493b-98d1-beedbfee4a07.pid.haproxy
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 2f8397c5-7d5c-493b-98d1-beedbfee4a07
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:23:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:49.410 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'env', 'PROCESS_TAG=haproxy-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2f8397c5-7d5c-493b-98d1-beedbfee4a07.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.531 2 DEBUG nova.compute.manager [req-0174a2e2-edca-4836-9ec1-3061dd13c8f9 req-bbbc7712-6d2c-4fd5-9018-fd6e8fda7720 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.532 2 DEBUG oslo_concurrency.lockutils [req-0174a2e2-edca-4836-9ec1-3061dd13c8f9 req-bbbc7712-6d2c-4fd5-9018-fd6e8fda7720 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.532 2 DEBUG oslo_concurrency.lockutils [req-0174a2e2-edca-4836-9ec1-3061dd13c8f9 req-bbbc7712-6d2c-4fd5-9018-fd6e8fda7720 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.532 2 DEBUG oslo_concurrency.lockutils [req-0174a2e2-edca-4836-9ec1-3061dd13c8f9 req-bbbc7712-6d2c-4fd5-9018-fd6e8fda7720 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.533 2 DEBUG nova.compute.manager [req-0174a2e2-edca-4836-9ec1-3061dd13c8f9 req-bbbc7712-6d2c-4fd5-9018-fd6e8fda7720 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Processing event network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.737 2 DEBUG nova.compute.manager [req-0b2c0f58-3cac-4609-8a2f-90d66c6b2760 req-963e5d0c-dc77-4ea9-ad6d-d620fc6be833 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.738 2 DEBUG oslo_concurrency.lockutils [req-0b2c0f58-3cac-4609-8a2f-90d66c6b2760 req-963e5d0c-dc77-4ea9-ad6d-d620fc6be833 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.738 2 DEBUG oslo_concurrency.lockutils [req-0b2c0f58-3cac-4609-8a2f-90d66c6b2760 req-963e5d0c-dc77-4ea9-ad6d-d620fc6be833 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.738 2 DEBUG oslo_concurrency.lockutils [req-0b2c0f58-3cac-4609-8a2f-90d66c6b2760 req-963e5d0c-dc77-4ea9-ad6d-d620fc6be833 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:49 compute-1 nova_compute[192795]: 2025-09-30 21:23:49.739 2 DEBUG nova.compute.manager [req-0b2c0f58-3cac-4609-8a2f-90d66c6b2760 req-963e5d0c-dc77-4ea9-ad6d-d620fc6be833 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Processing event network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:23:49 compute-1 podman[226957]: 2025-09-30 21:23:49.82869462 +0000 UTC m=+0.065455022 container create 717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:23:49 compute-1 systemd[1]: Started libpod-conmon-717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56.scope.
Sep 30 21:23:49 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:23:49 compute-1 podman[226957]: 2025-09-30 21:23:49.801609822 +0000 UTC m=+0.038370244 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:23:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cc5895ebff095b28827dfdc07e44325b3552004b8d41f15405ab3a9dce57ab6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:23:49 compute-1 podman[226957]: 2025-09-30 21:23:49.921943961 +0000 UTC m=+0.158704393 container init 717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:23:49 compute-1 podman[226957]: 2025-09-30 21:23:49.928996161 +0000 UTC m=+0.165756563 container start 717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:23:49 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [NOTICE]   (226978) : New worker (226980) forked
Sep 30 21:23:49 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [NOTICE]   (226978) : Loading success.
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.014 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d0acc879-5ca2-4e3f-94c9-c24f4b88de6a in datapath df283875-3d9d-495f-9575-5f8e249bc3a8 unbound from our chassis
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.015 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df283875-3d9d-495f-9575-5f8e249bc3a8
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.021 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.022 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267430.0205219, 9eeaaff3-c30b-4b21-8955-570ddc4444b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.022 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] VM Started (Lifecycle Event)
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.025 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.029 2 INFO nova.virt.libvirt.driver [-] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Instance spawned successfully.
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.030 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.035 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1d990a7d-fd5e-4fc5-9103-61be12ddbf1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.036 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf283875-31 in ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.039 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf283875-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.039 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2046bf2f-0be7-4eef-ac9e-1bb1aaaabe36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.040 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[854e0a01-50c4-4546-a763-e6e2042611ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.045 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.051 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.054 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.054 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.055 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.055 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.056 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.056 2 DEBUG nova.virt.libvirt.driver [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.061 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[ff313caf-c3ea-42a3-a396-552334c5a3e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.079 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b70e29-d17e-4907-968f-14964d57d151]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.087 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.087 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267430.0218997, 9eeaaff3-c30b-4b21-8955-570ddc4444b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.087 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] VM Paused (Lifecycle Event)
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.115 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4bcd5a77-ed59-4e18-9750-8ba44ac4f13d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 systemd-udevd[226904]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:50 compute-1 NetworkManager[51724]: <info>  [1759267430.1233] manager: (tapdf283875-30): new Veth device (/org/freedesktop/NetworkManager/Devices/90)
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.121 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[91f8df8c-4057-4584-826c-c42a8e0e3f71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.132 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.142 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267430.0256608, 9eeaaff3-c30b-4b21-8955-570ddc4444b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.143 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] VM Resumed (Lifecycle Event)
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.146 2 INFO nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Took 10.12 seconds to spawn the instance on the hypervisor.
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.147 2 DEBUG nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.165 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.170 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.171 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[af98b94f-4b9e-4d99-9f32-4fd4edce4188]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.174 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[90b74c64-50b2-47d0-9462-feb3846c6483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.199 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:50 compute-1 NetworkManager[51724]: <info>  [1759267430.2021] device (tapdf283875-30): carrier: link connected
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.212 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[eab6897f-d138-43c8-b53d-ad5ac2e5d564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.238 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1f83ac-23ab-4f74-bc89-95ab329343fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf283875-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:c6:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412587, 'reachable_time': 30960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226999, 'error': None, 'target': 'ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.258 2 INFO nova.compute.manager [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Took 10.70 seconds to build instance.
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.262 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[29c5b8ed-cf9d-42c1-ab26-3cc36fb93339]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:c648'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412587, 'tstamp': 412587}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227000, 'error': None, 'target': 'ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.282 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d91588-51c8-49c2-8554-8d8eac856e2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf283875-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:c6:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412587, 'reachable_time': 30960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227001, 'error': None, 'target': 'ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.292 2 DEBUG oslo_concurrency.lockutils [None req-85a05c06-1548-4baa-85c7-dbca91e8a220 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.322 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bda2b470-15ef-48e5-80b9-1c7325229504]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.391 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[71c5ec50-6d85-4e45-8cae-437443c8b328]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.393 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf283875-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.393 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.393 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf283875-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:50 compute-1 NetworkManager[51724]: <info>  [1759267430.3972] manager: (tapdf283875-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:50 compute-1 kernel: tapdf283875-30: entered promiscuous mode
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.401 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf283875-30, col_values=(('external_ids', {'iface-id': '63f8dc5e-5cd1-429c-b0b3-54f70530a3d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:50 compute-1 ovn_controller[94902]: 2025-09-30T21:23:50Z|00182|binding|INFO|Releasing lport 63f8dc5e-5cd1-429c-b0b3-54f70530a3d6 from this chassis (sb_readonly=0)
Sep 30 21:23:50 compute-1 nova_compute[192795]: 2025-09-30 21:23:50.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.431 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df283875-3d9d-495f-9575-5f8e249bc3a8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df283875-3d9d-495f-9575-5f8e249bc3a8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.432 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[99bd293b-393a-4ea8-8349-55d28e4df9e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.433 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-df283875-3d9d-495f-9575-5f8e249bc3a8
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/df283875-3d9d-495f-9575-5f8e249bc3a8.pid.haproxy
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID df283875-3d9d-495f-9575-5f8e249bc3a8
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:23:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:50.434 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8', 'env', 'PROCESS_TAG=haproxy-df283875-3d9d-495f-9575-5f8e249bc3a8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df283875-3d9d-495f-9575-5f8e249bc3a8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:23:50 compute-1 podman[227033]: 2025-09-30 21:23:50.828585864 +0000 UTC m=+0.045678800 container create c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:23:50 compute-1 systemd[1]: Started libpod-conmon-c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce.scope.
Sep 30 21:23:50 compute-1 podman[227033]: 2025-09-30 21:23:50.806353886 +0000 UTC m=+0.023446822 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:23:50 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:23:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de6f3ee221b8b1eb960a448470c1326f5f6e901a0c7448172880cc98df9722ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:23:50 compute-1 podman[227033]: 2025-09-30 21:23:50.919722738 +0000 UTC m=+0.136815674 container init c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:23:50 compute-1 podman[227033]: 2025-09-30 21:23:50.926662514 +0000 UTC m=+0.143755450 container start c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:23:50 compute-1 neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8[227049]: [NOTICE]   (227053) : New worker (227055) forked
Sep 30 21:23:50 compute-1 neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8[227049]: [NOTICE]   (227053) : Loading success.
Sep 30 21:23:51 compute-1 podman[227064]: 2025-09-30 21:23:51.239535406 +0000 UTC m=+0.069593215 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.433 2 DEBUG nova.network.neutron [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Updated VIF entry in instance network info cache for port d0acc879-5ca2-4e3f-94c9-c24f4b88de6a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.433 2 DEBUG nova.network.neutron [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Updating instance_info_cache with network_info: [{"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.452 2 DEBUG oslo_concurrency.lockutils [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-9eeaaff3-c30b-4b21-8955-570ddc4444b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.677 2 DEBUG nova.compute.manager [req-d3595c8f-1265-442e-a5d1-ab5fc8a45546 req-8a9a7a34-b5f5-4055-a5fa-1f89bb1bb601 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.677 2 DEBUG oslo_concurrency.lockutils [req-d3595c8f-1265-442e-a5d1-ab5fc8a45546 req-8a9a7a34-b5f5-4055-a5fa-1f89bb1bb601 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.677 2 DEBUG oslo_concurrency.lockutils [req-d3595c8f-1265-442e-a5d1-ab5fc8a45546 req-8a9a7a34-b5f5-4055-a5fa-1f89bb1bb601 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.677 2 DEBUG oslo_concurrency.lockutils [req-d3595c8f-1265-442e-a5d1-ab5fc8a45546 req-8a9a7a34-b5f5-4055-a5fa-1f89bb1bb601 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.678 2 DEBUG nova.compute.manager [req-d3595c8f-1265-442e-a5d1-ab5fc8a45546 req-8a9a7a34-b5f5-4055-a5fa-1f89bb1bb601 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] No waiting events found dispatching network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.678 2 WARNING nova.compute.manager [req-d3595c8f-1265-442e-a5d1-ab5fc8a45546 req-8a9a7a34-b5f5-4055-a5fa-1f89bb1bb601 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received unexpected event network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db for instance with vm_state active and task_state None.
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.885 2 DEBUG nova.compute.manager [req-b9b1865e-55f9-44fe-9be1-2aa8d6342d7d req-62580749-38d7-4591-b57f-6653f86ebb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.885 2 DEBUG oslo_concurrency.lockutils [req-b9b1865e-55f9-44fe-9be1-2aa8d6342d7d req-62580749-38d7-4591-b57f-6653f86ebb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.886 2 DEBUG oslo_concurrency.lockutils [req-b9b1865e-55f9-44fe-9be1-2aa8d6342d7d req-62580749-38d7-4591-b57f-6653f86ebb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.886 2 DEBUG oslo_concurrency.lockutils [req-b9b1865e-55f9-44fe-9be1-2aa8d6342d7d req-62580749-38d7-4591-b57f-6653f86ebb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.886 2 DEBUG nova.compute.manager [req-b9b1865e-55f9-44fe-9be1-2aa8d6342d7d req-62580749-38d7-4591-b57f-6653f86ebb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] No waiting events found dispatching network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.886 2 WARNING nova.compute.manager [req-b9b1865e-55f9-44fe-9be1-2aa8d6342d7d req-62580749-38d7-4591-b57f-6653f86ebb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received unexpected event network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a for instance with vm_state active and task_state None.
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.929 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.930 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.930 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.930 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.931 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.943 2 INFO nova.compute.manager [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Terminating instance
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.955 2 DEBUG nova.compute.manager [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:23:51 compute-1 kernel: tap7c19ac04-f0 (unregistering): left promiscuous mode
Sep 30 21:23:51 compute-1 NetworkManager[51724]: <info>  [1759267431.9864] device (tap7c19ac04-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:23:51 compute-1 nova_compute[192795]: 2025-09-30 21:23:51.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:51 compute-1 ovn_controller[94902]: 2025-09-30T21:23:51Z|00183|binding|INFO|Releasing lport 7c19ac04-f090-41a4-8712-2a1661fbf4db from this chassis (sb_readonly=0)
Sep 30 21:23:51 compute-1 ovn_controller[94902]: 2025-09-30T21:23:51Z|00184|binding|INFO|Setting lport 7c19ac04-f090-41a4-8712-2a1661fbf4db down in Southbound
Sep 30 21:23:51 compute-1 ovn_controller[94902]: 2025-09-30T21:23:51Z|00185|binding|INFO|Removing iface tap7c19ac04-f0 ovn-installed in OVS
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.017 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:b2:36 10.100.0.158'], port_security=['fa:16:3e:26:b2:36 10.100.0.158'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.158/24', 'neutron:device_id': '9eeaaff3-c30b-4b21-8955-570ddc4444b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d069bc96-e0bf-415d-b198-5dbfe173420d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=7c19ac04-f090-41a4-8712-2a1661fbf4db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.019 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 7c19ac04-f090-41a4-8712-2a1661fbf4db in datapath 2f8397c5-7d5c-493b-98d1-beedbfee4a07 unbound from our chassis
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.021 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f8397c5-7d5c-493b-98d1-beedbfee4a07, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.022 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[05d9a673-6722-42be-9014-62d9c38dfadf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.023 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07 namespace which is not needed anymore
Sep 30 21:23:52 compute-1 kernel: tapd0acc879-5c (unregistering): left promiscuous mode
Sep 30 21:23:52 compute-1 NetworkManager[51724]: <info>  [1759267432.0288] device (tapd0acc879-5c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 ovn_controller[94902]: 2025-09-30T21:23:52Z|00186|binding|INFO|Releasing lport d0acc879-5ca2-4e3f-94c9-c24f4b88de6a from this chassis (sb_readonly=0)
Sep 30 21:23:52 compute-1 ovn_controller[94902]: 2025-09-30T21:23:52Z|00187|binding|INFO|Setting lport d0acc879-5ca2-4e3f-94c9-c24f4b88de6a down in Southbound
Sep 30 21:23:52 compute-1 ovn_controller[94902]: 2025-09-30T21:23:52Z|00188|binding|INFO|Removing iface tapd0acc879-5c ovn-installed in OVS
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.052 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:51:60 10.100.1.194'], port_security=['fa:16:3e:eb:51:60 10.100.1.194'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.194/24', 'neutron:device_id': '9eeaaff3-c30b-4b21-8955-570ddc4444b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df283875-3d9d-495f-9575-5f8e249bc3a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6a860bf9ab4d91b946d9a35e448d94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9135bf1b-3ec0-445f-a433-ff442745d259', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d38a0621-ad3f-4193-b927-6131cb09407c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Sep 30 21:23:52 compute-1 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002b.scope: Consumed 2.734s CPU time.
Sep 30 21:23:52 compute-1 systemd-machined[152783]: Machine qemu-22-instance-0000002b terminated.
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [NOTICE]   (226978) : haproxy version is 2.8.14-c23fe91
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [NOTICE]   (226978) : path to executable is /usr/sbin/haproxy
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [WARNING]  (226978) : Exiting Master process...
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [WARNING]  (226978) : Exiting Master process...
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [ALERT]    (226978) : Current worker (226980) exited with code 143 (Terminated)
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07[226974]: [WARNING]  (226978) : All workers exited. Exiting... (0)
Sep 30 21:23:52 compute-1 NetworkManager[51724]: <info>  [1759267432.1822] manager: (tap7c19ac04-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Sep 30 21:23:52 compute-1 systemd-udevd[227097]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:52 compute-1 systemd[1]: libpod-717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56.scope: Deactivated successfully.
Sep 30 21:23:52 compute-1 podman[227111]: 2025-09-30 21:23:52.189926167 +0000 UTC m=+0.059743179 container died 717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:23:52 compute-1 NetworkManager[51724]: <info>  [1759267432.1972] manager: (tapd0acc879-5c): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Sep 30 21:23:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56-userdata-shm.mount: Deactivated successfully.
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-7cc5895ebff095b28827dfdc07e44325b3552004b8d41f15405ab3a9dce57ab6-merged.mount: Deactivated successfully.
Sep 30 21:23:52 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.258 2 INFO nova.virt.libvirt.driver [-] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Instance destroyed successfully.
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.259 2 DEBUG nova.objects.instance [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lazy-loading 'resources' on Instance uuid 9eeaaff3-c30b-4b21-8955-570ddc4444b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:52 compute-1 podman[227111]: 2025-09-30 21:23:52.272092798 +0000 UTC m=+0.141909810 container cleanup 717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.274 2 DEBUG nova.virt.libvirt.vif [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:23:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1249593289',display_name='tempest-ServersTestMultiNic-server-1249593289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1249593289',id=43,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:23:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-wz2oapbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:23:50Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=9eeaaff3-c30b-4b21-8955-570ddc4444b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.275 2 DEBUG nova.network.os_vif_util [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "address": "fa:16:3e:26:b2:36", "network": {"id": "2f8397c5-7d5c-493b-98d1-beedbfee4a07", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1437419290", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c19ac04-f0", "ovs_interfaceid": "7c19ac04-f090-41a4-8712-2a1661fbf4db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.275 2 DEBUG nova.network.os_vif_util [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=7c19ac04-f090-41a4-8712-2a1661fbf4db,network=Network(2f8397c5-7d5c-493b-98d1-beedbfee4a07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c19ac04-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.276 2 DEBUG os_vif [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=7c19ac04-f090-41a4-8712-2a1661fbf4db,network=Network(2f8397c5-7d5c-493b-98d1-beedbfee4a07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c19ac04-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c19ac04-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 systemd[1]: libpod-conmon-717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56.scope: Deactivated successfully.
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.289 2 INFO os_vif [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:b2:36,bridge_name='br-int',has_traffic_filtering=True,id=7c19ac04-f090-41a4-8712-2a1661fbf4db,network=Network(2f8397c5-7d5c-493b-98d1-beedbfee4a07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c19ac04-f0')
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.290 2 DEBUG nova.virt.libvirt.vif [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:23:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1249593289',display_name='tempest-ServersTestMultiNic-server-1249593289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1249593289',id=43,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:23:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6a860bf9ab4d91b946d9a35e448d94',ramdisk_id='',reservation_id='r-wz2oapbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-108666806',owner_user_name='tempest-ServersTestMultiNic-108666806-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:23:50Z,user_data=None,user_id='80abdc8ce51444378234b07daa877ac7',uuid=9eeaaff3-c30b-4b21-8955-570ddc4444b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.290 2 DEBUG nova.network.os_vif_util [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converting VIF {"id": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "address": "fa:16:3e:eb:51:60", "network": {"id": "df283875-3d9d-495f-9575-5f8e249bc3a8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1775930378", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6a860bf9ab4d91b946d9a35e448d94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0acc879-5c", "ovs_interfaceid": "d0acc879-5ca2-4e3f-94c9-c24f4b88de6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.290 2 DEBUG nova.network.os_vif_util [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:51:60,bridge_name='br-int',has_traffic_filtering=True,id=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a,network=Network(df283875-3d9d-495f-9575-5f8e249bc3a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0acc879-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.291 2 DEBUG os_vif [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:51:60,bridge_name='br-int',has_traffic_filtering=True,id=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a,network=Network(df283875-3d9d-495f-9575-5f8e249bc3a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0acc879-5c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.292 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0acc879-5c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.297 2 INFO os_vif [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:51:60,bridge_name='br-int',has_traffic_filtering=True,id=d0acc879-5ca2-4e3f-94c9-c24f4b88de6a,network=Network(df283875-3d9d-495f-9575-5f8e249bc3a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0acc879-5c')
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.298 2 INFO nova.virt.libvirt.driver [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Deleting instance files /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6_del
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.299 2 INFO nova.virt.libvirt.driver [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Deletion of /var/lib/nova/instances/9eeaaff3-c30b-4b21-8955-570ddc4444b6_del complete
Sep 30 21:23:52 compute-1 podman[227160]: 2025-09-30 21:23:52.368980147 +0000 UTC m=+0.072819861 container remove 717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.378 2 INFO nova.compute.manager [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.379 2 DEBUG oslo.service.loopingcall [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.379 2 DEBUG nova.compute.manager [-] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.380 2 DEBUG nova.network.neutron [-] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.384 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bada315e-ebd5-40fe-a390-5bc75abb5f31]: (4, ('Tue Sep 30 09:23:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07 (717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56)\n717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56\nTue Sep 30 09:23:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07 (717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56)\n717dceb3f8de61326054e57d768295d4bc8603bae50e3d90cd638f47e0997e56\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.386 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[af186a14-7edf-41af-8233-7a255a1b6128]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.388 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f8397c5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 kernel: tap2f8397c5-70: left promiscuous mode
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.406 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[af88284e-87d9-46c9-b378-fb361bea33cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.441 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[656994ec-2043-4788-bb9f-d7458dac2ec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.443 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bcbb739b-b860-4861-9d1d-c0a4dedee6a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.457 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0f80bdee-003d-4d22-aa28-0c9fa16194b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412480, 'reachable_time': 17527, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227176, 'error': None, 'target': 'ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.459 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2f8397c5-7d5c-493b-98d1-beedbfee4a07 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.460 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa7e303-bfad-45c9-913e-da8d4b2b2cb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 systemd[1]: run-netns-ovnmeta\x2d2f8397c5\x2d7d5c\x2d493b\x2d98d1\x2dbeedbfee4a07.mount: Deactivated successfully.
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.461 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d0acc879-5ca2-4e3f-94c9-c24f4b88de6a in datapath df283875-3d9d-495f-9575-5f8e249bc3a8 unbound from our chassis
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.462 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df283875-3d9d-495f-9575-5f8e249bc3a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.463 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e475f5b4-9294-4041-9f01-7f6ab5a41ded]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.464 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8 namespace which is not needed anymore
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8[227049]: [NOTICE]   (227053) : haproxy version is 2.8.14-c23fe91
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8[227049]: [NOTICE]   (227053) : path to executable is /usr/sbin/haproxy
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8[227049]: [WARNING]  (227053) : Exiting Master process...
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8[227049]: [ALERT]    (227053) : Current worker (227055) exited with code 143 (Terminated)
Sep 30 21:23:52 compute-1 neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8[227049]: [WARNING]  (227053) : All workers exited. Exiting... (0)
Sep 30 21:23:52 compute-1 systemd[1]: libpod-c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce.scope: Deactivated successfully.
Sep 30 21:23:52 compute-1 podman[227194]: 2025-09-30 21:23:52.653557096 +0000 UTC m=+0.093895838 container died c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:23:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce-userdata-shm.mount: Deactivated successfully.
Sep 30 21:23:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-de6f3ee221b8b1eb960a448470c1326f5f6e901a0c7448172880cc98df9722ac-merged.mount: Deactivated successfully.
Sep 30 21:23:52 compute-1 podman[227194]: 2025-09-30 21:23:52.695091774 +0000 UTC m=+0.135430546 container cleanup c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:23:52 compute-1 systemd[1]: libpod-conmon-c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce.scope: Deactivated successfully.
Sep 30 21:23:52 compute-1 podman[227224]: 2025-09-30 21:23:52.776102184 +0000 UTC m=+0.048435154 container remove c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.782 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7fde0e21-af93-4256-99a3-6c96e5e0f7a8]: (4, ('Tue Sep 30 09:23:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8 (c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce)\nc8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce\nTue Sep 30 09:23:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8 (c8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce)\nc8a4daa8da51cc022d64dd42aad41ce8cf2816f466127a70770ed29c726098ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.786 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d71fada9-e703-4251-a253-e5837913930c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.787 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf283875-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 kernel: tapdf283875-30: left promiscuous mode
Sep 30 21:23:52 compute-1 nova_compute[192795]: 2025-09-30 21:23:52.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.808 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb3101a-ed42-4d78-ad8b-48a88e013789]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.837 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4b330aa3-6377-4971-922a-fcafde781c99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.839 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[308f76f5-7367-4691-8bd9-ebc287186cce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.858 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1805754a-dfa6-404b-8c52-f23b14aa5c1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412578, 'reachable_time': 37439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227240, 'error': None, 'target': 'ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.861 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df283875-3d9d-495f-9575-5f8e249bc3a8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:23:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:23:52.861 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e6ec19-a559-43f5-b427-77c301cab56f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:53 compute-1 systemd[1]: run-netns-ovnmeta\x2ddf283875\x2d3d9d\x2d495f\x2d9575\x2d5f8e249bc3a8.mount: Deactivated successfully.
Sep 30 21:23:53 compute-1 sshd-session[227230]: Invalid user a from 185.156.73.233 port 24946
Sep 30 21:23:53 compute-1 sshd-session[227230]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:23:53 compute-1 sshd-session[227230]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.991 2 DEBUG nova.compute.manager [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-unplugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.992 2 DEBUG oslo_concurrency.lockutils [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.992 2 DEBUG oslo_concurrency.lockutils [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.993 2 DEBUG oslo_concurrency.lockutils [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.993 2 DEBUG nova.compute.manager [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] No waiting events found dispatching network-vif-unplugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.994 2 DEBUG nova.compute.manager [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-unplugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.994 2 DEBUG nova.compute.manager [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.995 2 DEBUG oslo_concurrency.lockutils [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.996 2 DEBUG oslo_concurrency.lockutils [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.996 2 DEBUG oslo_concurrency.lockutils [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.996 2 DEBUG nova.compute.manager [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] No waiting events found dispatching network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:53 compute-1 nova_compute[192795]: 2025-09-30 21:23:53.997 2 WARNING nova.compute.manager [req-3fef0e7d-e248-4ea4-9e13-703707a5b43c req-5a30e11a-8a2b-4813-bb74-5f656d32f1a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received unexpected event network-vif-plugged-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a for instance with vm_state active and task_state deleting.
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.668 2 DEBUG nova.compute.manager [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-unplugged-7c19ac04-f090-41a4-8712-2a1661fbf4db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.669 2 DEBUG oslo_concurrency.lockutils [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.670 2 DEBUG oslo_concurrency.lockutils [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.671 2 DEBUG oslo_concurrency.lockutils [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.671 2 DEBUG nova.compute.manager [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] No waiting events found dispatching network-vif-unplugged-7c19ac04-f090-41a4-8712-2a1661fbf4db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.672 2 DEBUG nova.compute.manager [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-unplugged-7c19ac04-f090-41a4-8712-2a1661fbf4db for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.672 2 DEBUG nova.compute.manager [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.673 2 DEBUG oslo_concurrency.lockutils [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.673 2 DEBUG oslo_concurrency.lockutils [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.674 2 DEBUG oslo_concurrency.lockutils [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.674 2 DEBUG nova.compute.manager [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] No waiting events found dispatching network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.675 2 WARNING nova.compute.manager [req-f988dcc7-a669-431f-b9d9-a6c8d1fc196d req-7ef0a94f-4ef6-4f05-84b6-b464799f1111 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received unexpected event network-vif-plugged-7c19ac04-f090-41a4-8712-2a1661fbf4db for instance with vm_state active and task_state deleting.
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.717 2 DEBUG nova.network.neutron [-] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.735 2 INFO nova.compute.manager [-] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Took 2.36 seconds to deallocate network for instance.
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.823 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.824 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.855 2 DEBUG nova.compute.manager [req-e48ffd19-decb-4d2b-b09d-00dde2d7b386 req-9fa3d578-1289-4d19-bfc9-f22ec08516fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-deleted-7c19ac04-f090-41a4-8712-2a1661fbf4db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.855 2 DEBUG nova.compute.manager [req-e48ffd19-decb-4d2b-b09d-00dde2d7b386 req-9fa3d578-1289-4d19-bfc9-f22ec08516fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Received event network-vif-deleted-d0acc879-5ca2-4e3f-94c9-c24f4b88de6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.888 2 DEBUG nova.compute.provider_tree [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.908 2 DEBUG nova.scheduler.client.report [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.930 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:54 compute-1 nova_compute[192795]: 2025-09-30 21:23:54.964 2 INFO nova.scheduler.client.report [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Deleted allocations for instance 9eeaaff3-c30b-4b21-8955-570ddc4444b6
Sep 30 21:23:55 compute-1 nova_compute[192795]: 2025-09-30 21:23:55.043 2 DEBUG oslo_concurrency.lockutils [None req-14df2912-2362-4c03-b214-8a7dcae30268 80abdc8ce51444378234b07daa877ac7 da6a860bf9ab4d91b946d9a35e448d94 - - default default] Lock "9eeaaff3-c30b-4b21-8955-570ddc4444b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:55 compute-1 sshd-session[227230]: Failed password for invalid user a from 185.156.73.233 port 24946 ssh2
Sep 30 21:23:56 compute-1 sshd-session[227230]: Connection closed by invalid user a 185.156.73.233 port 24946 [preauth]
Sep 30 21:23:56 compute-1 nova_compute[192795]: 2025-09-30 21:23:56.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:56 compute-1 nova_compute[192795]: 2025-09-30 21:23:56.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-1 nova_compute[192795]: 2025-09-30 21:23:57.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:59 compute-1 podman[227245]: 2025-09-30 21:23:59.250268007 +0000 UTC m=+0.067670492 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:23:59 compute-1 podman[227243]: 2025-09-30 21:23:59.288759363 +0000 UTC m=+0.118807929 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:23:59 compute-1 podman[227244]: 2025-09-30 21:23:59.30052636 +0000 UTC m=+0.127418601 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:24:01 compute-1 nova_compute[192795]: 2025-09-30 21:24:01.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:02 compute-1 nova_compute[192795]: 2025-09-30 21:24:02.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:03 compute-1 podman[227314]: 2025-09-30 21:24:03.260598142 +0000 UTC m=+0.090617940 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:24:06 compute-1 nova_compute[192795]: 2025-09-30 21:24:06.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:07 compute-1 nova_compute[192795]: 2025-09-30 21:24:07.256 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267432.2549682, 9eeaaff3-c30b-4b21-8955-570ddc4444b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:07 compute-1 nova_compute[192795]: 2025-09-30 21:24:07.257 2 INFO nova.compute.manager [-] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] VM Stopped (Lifecycle Event)
Sep 30 21:24:07 compute-1 nova_compute[192795]: 2025-09-30 21:24:07.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:07 compute-1 nova_compute[192795]: 2025-09-30 21:24:07.303 2 DEBUG nova.compute.manager [None req-0cf5ac29-2d47-4126-a90d-bc638be644a1 - - - - - -] [instance: 9eeaaff3-c30b-4b21-8955-570ddc4444b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:11 compute-1 podman[227334]: 2025-09-30 21:24:11.279106951 +0000 UTC m=+0.119757204 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:24:11 compute-1 nova_compute[192795]: 2025-09-30 21:24:11.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:11 compute-1 nova_compute[192795]: 2025-09-30 21:24:11.908 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "e704a3db-d970-44f4-8b39-b304bbac4a69" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:11 compute-1 nova_compute[192795]: 2025-09-30 21:24:11.908 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e704a3db-d970-44f4-8b39-b304bbac4a69" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.220 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.366 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.367 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.375 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.375 2 INFO nova.compute.claims [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.583 2 DEBUG nova.compute.provider_tree [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.626 2 DEBUG nova.scheduler.client.report [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.656 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.657 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.725 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.726 2 DEBUG nova.network.neutron [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.846 2 INFO nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:24:12 compute-1 nova_compute[192795]: 2025-09-30 21:24:12.879 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.179 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.180 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.180 2 INFO nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Creating image(s)
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.181 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "/var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.181 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "/var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.182 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "/var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.199 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:13 compute-1 podman[227353]: 2025-09-30 21:24:13.268828738 +0000 UTC m=+0.101452422 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Sep 30 21:24:13 compute-1 podman[227354]: 2025-09-30 21:24:13.270717789 +0000 UTC m=+0.092715657 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.273 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.274 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.275 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.294 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.359 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.360 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.415 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.417 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.418 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.465 2 DEBUG nova.network.neutron [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.466 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.484 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.485 2 DEBUG nova.virt.disk.api [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Checking if we can resize image /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.486 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.554 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.556 2 DEBUG nova.virt.disk.api [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Cannot resize image /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.557 2 DEBUG nova.objects.instance [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lazy-loading 'migration_context' on Instance uuid e704a3db-d970-44f4-8b39-b304bbac4a69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.578 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.579 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Ensure instance console log exists: /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.580 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.581 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.581 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.584 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.592 2 WARNING nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.600 2 DEBUG nova.virt.libvirt.host [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.601 2 DEBUG nova.virt.libvirt.host [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.605 2 DEBUG nova.virt.libvirt.host [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.606 2 DEBUG nova.virt.libvirt.host [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.608 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.608 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.609 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.610 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.610 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.610 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.611 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.611 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.612 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.613 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.613 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.614 2 DEBUG nova.virt.hardware [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.622 2 DEBUG nova.objects.instance [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lazy-loading 'pci_devices' on Instance uuid e704a3db-d970-44f4-8b39-b304bbac4a69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.642 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <uuid>e704a3db-d970-44f4-8b39-b304bbac4a69</uuid>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <name>instance-0000002d</name>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersAdminNegativeTestJSON-server-1059222995</nova:name>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:24:13</nova:creationTime>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:24:13 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:24:13 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:24:13 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:24:13 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:24:13 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:24:13 compute-1 nova_compute[192795]:         <nova:user uuid="6df73a9d520f4a6c93aa97f2f78d87d4">tempest-ServersAdminNegativeTestJSON-81409629-project-member</nova:user>
Sep 30 21:24:13 compute-1 nova_compute[192795]:         <nova:project uuid="b134615435f24a99b7e13e04b933e849">tempest-ServersAdminNegativeTestJSON-81409629</nova:project>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <system>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <entry name="serial">e704a3db-d970-44f4-8b39-b304bbac4a69</entry>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <entry name="uuid">e704a3db-d970-44f4-8b39-b304bbac4a69</entry>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </system>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <os>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   </os>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <features>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   </features>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk.config"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/console.log" append="off"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <video>
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </video>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:24:13 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:24:13 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:24:13 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:24:13 compute-1 nova_compute[192795]: </domain>
Sep 30 21:24:13 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.705 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.706 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.706 2 INFO nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Using config drive
Sep 30 21:24:13 compute-1 nova_compute[192795]: 2025-09-30 21:24:13.998 2 INFO nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Creating config drive at /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk.config
Sep 30 21:24:14 compute-1 nova_compute[192795]: 2025-09-30 21:24:14.003 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkyrtbdq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:14 compute-1 nova_compute[192795]: 2025-09-30 21:24:14.135 2 DEBUG oslo_concurrency.processutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkyrtbdq" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:14 compute-1 systemd-machined[152783]: New machine qemu-23-instance-0000002d.
Sep 30 21:24:14 compute-1 systemd[1]: Started Virtual Machine qemu-23-instance-0000002d.
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.062 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267455.0621092, e704a3db-d970-44f4-8b39-b304bbac4a69 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.064 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] VM Resumed (Lifecycle Event)
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.066 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.067 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.072 2 INFO nova.virt.libvirt.driver [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Instance spawned successfully.
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.073 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.101 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.108 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.109 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.110 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.111 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.111 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.112 2 DEBUG nova.virt.libvirt.driver [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.122 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.159 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.160 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267455.0622861, e704a3db-d970-44f4-8b39-b304bbac4a69 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.160 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] VM Started (Lifecycle Event)
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.184 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.189 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.225 2 INFO nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Took 2.05 seconds to spawn the instance on the hypervisor.
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.226 2 DEBUG nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.227 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.309 2 INFO nova.compute.manager [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Took 3.00 seconds to build instance.
Sep 30 21:24:15 compute-1 nova_compute[192795]: 2025-09-30 21:24:15.323 2 DEBUG oslo_concurrency.lockutils [None req-bb61173f-477a-4326-a1fe-1090e9be9741 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e704a3db-d970-44f4-8b39-b304bbac4a69" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:16 compute-1 nova_compute[192795]: 2025-09-30 21:24:16.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:17 compute-1 nova_compute[192795]: 2025-09-30 21:24:17.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:21 compute-1 nova_compute[192795]: 2025-09-30 21:24:21.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:22 compute-1 podman[227440]: 2025-09-30 21:24:22.271918851 +0000 UTC m=+0.095024809 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250923)
Sep 30 21:24:22 compute-1 nova_compute[192795]: 2025-09-30 21:24:22.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:23.828 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:24:23 compute-1 nova_compute[192795]: 2025-09-30 21:24:23.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:23.831 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:24:26 compute-1 nova_compute[192795]: 2025-09-30 21:24:26.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:27 compute-1 nova_compute[192795]: 2025-09-30 21:24:27.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:30 compute-1 podman[227478]: 2025-09-30 21:24:30.269972921 +0000 UTC m=+0.091494634 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:24:30 compute-1 podman[227476]: 2025-09-30 21:24:30.272521449 +0000 UTC m=+0.103664381 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:24:30 compute-1 podman[227477]: 2025-09-30 21:24:30.314680884 +0000 UTC m=+0.131960613 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:24:31 compute-1 ovn_controller[94902]: 2025-09-30T21:24:31Z|00189|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 21:24:31 compute-1 nova_compute[192795]: 2025-09-30 21:24:31.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:32 compute-1 nova_compute[192795]: 2025-09-30 21:24:32.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:32.835 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:34 compute-1 podman[227544]: 2025-09-30 21:24:34.250406171 +0000 UTC m=+0.091127924 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.125 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.126 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.157 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.250 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "e704a3db-d970-44f4-8b39-b304bbac4a69" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.251 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e704a3db-d970-44f4-8b39-b304bbac4a69" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.252 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "e704a3db-d970-44f4-8b39-b304bbac4a69-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.253 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e704a3db-d970-44f4-8b39-b304bbac4a69-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.253 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e704a3db-d970-44f4-8b39-b304bbac4a69-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.326 2 INFO nova.compute.manager [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Terminating instance
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.336 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "refresh_cache-e704a3db-d970-44f4-8b39-b304bbac4a69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.337 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquired lock "refresh_cache-e704a3db-d970-44f4-8b39-b304bbac4a69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.337 2 DEBUG nova.network.neutron [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.518 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.518 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.529 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.529 2 INFO nova.compute.claims [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.769 2 DEBUG nova.compute.provider_tree [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.784 2 DEBUG nova.scheduler.client.report [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.815 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.816 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.890 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.891 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.922 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:24:35 compute-1 nova_compute[192795]: 2025-09-30 21:24:35.943 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.117 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.119 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.120 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Creating image(s)
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.121 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "/var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.121 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "/var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.123 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "/var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.149 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.218 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.221 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.222 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.246 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.313 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.315 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.369 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.371 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.371 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.425 2 DEBUG nova.network.neutron [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.433 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.434 2 DEBUG nova.virt.disk.api [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Checking if we can resize image /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.434 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.509 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.510 2 DEBUG nova.virt.disk.api [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Cannot resize image /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.511 2 DEBUG nova.objects.instance [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'migration_context' on Instance uuid dc03304c-f076-4627-9b00-265c7d559784 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.517 2 DEBUG nova.policy [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.538 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.539 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Ensure instance console log exists: /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.539 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.540 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.540 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:36 compute-1 nova_compute[192795]: 2025-09-30 21:24:36.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.073 2 DEBUG nova.network.neutron [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.096 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Releasing lock "refresh_cache-e704a3db-d970-44f4-8b39-b304bbac4a69" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.098 2 DEBUG nova.compute.manager [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:24:37 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Sep 30 21:24:37 compute-1 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002d.scope: Consumed 13.663s CPU time.
Sep 30 21:24:37 compute-1 systemd-machined[152783]: Machine qemu-23-instance-0000002d terminated.
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.373 2 INFO nova.virt.libvirt.driver [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Instance destroyed successfully.
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.374 2 DEBUG nova.objects.instance [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lazy-loading 'resources' on Instance uuid e704a3db-d970-44f4-8b39-b304bbac4a69 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.394 2 INFO nova.virt.libvirt.driver [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Deleting instance files /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69_del
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.395 2 INFO nova.virt.libvirt.driver [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Deletion of /var/lib/nova/instances/e704a3db-d970-44f4-8b39-b304bbac4a69_del complete
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.499 2 INFO nova.compute.manager [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.500 2 DEBUG oslo.service.loopingcall [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.501 2 DEBUG nova.compute.manager [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.501 2 DEBUG nova.network.neutron [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.727 2 DEBUG nova.network.neutron [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.752 2 DEBUG nova.network.neutron [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.772 2 INFO nova.compute.manager [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Took 0.27 seconds to deallocate network for instance.
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.884 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.885 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.926 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Successfully created port: 12aa3fda-2986-4025-bda8-bcc4b6872810 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:24:37 compute-1 nova_compute[192795]: 2025-09-30 21:24:37.997 2 DEBUG nova.compute.provider_tree [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:38 compute-1 nova_compute[192795]: 2025-09-30 21:24:38.013 2 DEBUG nova.scheduler.client.report [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:38 compute-1 nova_compute[192795]: 2025-09-30 21:24:38.038 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:38 compute-1 nova_compute[192795]: 2025-09-30 21:24:38.118 2 INFO nova.scheduler.client.report [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Deleted allocations for instance e704a3db-d970-44f4-8b39-b304bbac4a69
Sep 30 21:24:38 compute-1 nova_compute[192795]: 2025-09-30 21:24:38.456 2 DEBUG oslo_concurrency.lockutils [None req-bec32005-9c60-40f5-905f-14ced9b62648 6df73a9d520f4a6c93aa97f2f78d87d4 b134615435f24a99b7e13e04b933e849 - - default default] Lock "e704a3db-d970-44f4-8b39-b304bbac4a69" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:38.685 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:38.686 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:38.686 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:39 compute-1 nova_compute[192795]: 2025-09-30 21:24:39.775 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Successfully updated port: 12aa3fda-2986-4025-bda8-bcc4b6872810 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:24:39 compute-1 nova_compute[192795]: 2025-09-30 21:24:39.799 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "refresh_cache-dc03304c-f076-4627-9b00-265c7d559784" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:24:39 compute-1 nova_compute[192795]: 2025-09-30 21:24:39.800 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquired lock "refresh_cache-dc03304c-f076-4627-9b00-265c7d559784" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:24:39 compute-1 nova_compute[192795]: 2025-09-30 21:24:39.800 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.090 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.715 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.716 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.716 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.717 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.932 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.934 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5718MB free_disk=73.46218872070312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.934 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.934 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.944 2 DEBUG nova.compute.manager [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received event network-changed-12aa3fda-2986-4025-bda8-bcc4b6872810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.944 2 DEBUG nova.compute.manager [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Refreshing instance network info cache due to event network-changed-12aa3fda-2986-4025-bda8-bcc4b6872810. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:24:40 compute-1 nova_compute[192795]: 2025-09-30 21:24:40.946 2 DEBUG oslo_concurrency.lockutils [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-dc03304c-f076-4627-9b00-265c7d559784" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.066 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance dc03304c-f076-4627-9b00-265c7d559784 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.067 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.068 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.136 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.187 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.216 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.217 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.218 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.218 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.623 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Updating instance_info_cache with network_info: [{"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.657 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Releasing lock "refresh_cache-dc03304c-f076-4627-9b00-265c7d559784" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.657 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Instance network_info: |[{"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.658 2 DEBUG oslo_concurrency.lockutils [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-dc03304c-f076-4627-9b00-265c7d559784" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.658 2 DEBUG nova.network.neutron [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Refreshing network info cache for port 12aa3fda-2986-4025-bda8-bcc4b6872810 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.660 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Start _get_guest_xml network_info=[{"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.666 2 WARNING nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.670 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.670 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.677 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.677 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.678 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.678 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.679 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.679 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.679 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.679 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.680 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.680 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.680 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.680 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.680 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.681 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.684 2 DEBUG nova.virt.libvirt.vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-966346934',display_name='tempest-tempest.common.compute-instance-966346934-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-966346934-2',id=49,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-zs1xvy19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:24:35Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=dc03304c-f076-4627-9b00-265c7d559784,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.684 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.685 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:40:d4,bridge_name='br-int',has_traffic_filtering=True,id=12aa3fda-2986-4025-bda8-bcc4b6872810,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12aa3fda-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.686 2 DEBUG nova.objects.instance [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'pci_devices' on Instance uuid dc03304c-f076-4627-9b00-265c7d559784 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.707 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <uuid>dc03304c-f076-4627-9b00-265c7d559784</uuid>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <name>instance-00000031</name>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <nova:name>tempest-tempest.common.compute-instance-966346934-2</nova:name>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:24:41</nova:creationTime>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:user uuid="1a8900f8597741ad930d414e1db02d76">tempest-MultipleCreateTestJSON-413721927-project-member</nova:user>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:project uuid="34c754fa0f364622a4433b9ba5718857">tempest-MultipleCreateTestJSON-413721927</nova:project>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         <nova:port uuid="12aa3fda-2986-4025-bda8-bcc4b6872810">
Sep 30 21:24:41 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <system>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <entry name="serial">dc03304c-f076-4627-9b00-265c7d559784</entry>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <entry name="uuid">dc03304c-f076-4627-9b00-265c7d559784</entry>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </system>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <os>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   </os>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <features>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   </features>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk.config"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:39:40:d4"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <target dev="tap12aa3fda-29"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/console.log" append="off"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <video>
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </video>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:24:41 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:24:41 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:24:41 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:24:41 compute-1 nova_compute[192795]: </domain>
Sep 30 21:24:41 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.709 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Preparing to wait for external event network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.709 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.710 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.710 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.712 2 DEBUG nova.virt.libvirt.vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-966346934',display_name='tempest-tempest.common.compute-instance-966346934-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-966346934-2',id=49,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-zs1xvy19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:24:35Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=dc03304c-f076-4627-9b00-265c7d559784,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.712 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.713 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:40:d4,bridge_name='br-int',has_traffic_filtering=True,id=12aa3fda-2986-4025-bda8-bcc4b6872810,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12aa3fda-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.714 2 DEBUG os_vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:40:d4,bridge_name='br-int',has_traffic_filtering=True,id=12aa3fda-2986-4025-bda8-bcc4b6872810,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12aa3fda-29') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.716 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.716 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.724 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12aa3fda-29, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.725 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap12aa3fda-29, col_values=(('external_ids', {'iface-id': '12aa3fda-2986-4025-bda8-bcc4b6872810', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:40:d4', 'vm-uuid': 'dc03304c-f076-4627-9b00-265c7d559784'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-1 NetworkManager[51724]: <info>  [1759267481.7296] manager: (tap12aa3fda-29): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.741 2 INFO os_vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:40:d4,bridge_name='br-int',has_traffic_filtering=True,id=12aa3fda-2986-4025-bda8-bcc4b6872810,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12aa3fda-29')
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.830 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.831 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.831 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No VIF found with MAC fa:16:3e:39:40:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.832 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Using config drive
Sep 30 21:24:41 compute-1 nova_compute[192795]: 2025-09-30 21:24:41.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.241 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:42 compute-1 podman[227592]: 2025-09-30 21:24:42.25174613 +0000 UTC m=+0.080087266 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.658 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Creating config drive at /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk.config
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.665 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpexk5zdt0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.796 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpexk5zdt0" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:42 compute-1 kernel: tap12aa3fda-29: entered promiscuous mode
Sep 30 21:24:42 compute-1 ovn_controller[94902]: 2025-09-30T21:24:42Z|00190|binding|INFO|Claiming lport 12aa3fda-2986-4025-bda8-bcc4b6872810 for this chassis.
Sep 30 21:24:42 compute-1 ovn_controller[94902]: 2025-09-30T21:24:42Z|00191|binding|INFO|12aa3fda-2986-4025-bda8-bcc4b6872810: Claiming fa:16:3e:39:40:d4 10.100.0.11
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-1 NetworkManager[51724]: <info>  [1759267482.8810] manager: (tap12aa3fda-29): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-1 systemd-udevd[227627]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-1 ovn_controller[94902]: 2025-09-30T21:24:42Z|00192|binding|INFO|Setting lport 12aa3fda-2986-4025-bda8-bcc4b6872810 ovn-installed in OVS
Sep 30 21:24:42 compute-1 nova_compute[192795]: 2025-09-30 21:24:42.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-1 NetworkManager[51724]: <info>  [1759267482.9428] device (tap12aa3fda-29): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:24:42 compute-1 NetworkManager[51724]: <info>  [1759267482.9451] device (tap12aa3fda-29): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:24:43 compute-1 ovn_controller[94902]: 2025-09-30T21:24:43Z|00193|binding|INFO|Setting lport 12aa3fda-2986-4025-bda8-bcc4b6872810 up in Southbound
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.022 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:40:d4 10.100.0.11'], port_security=['fa:16:3e:39:40:d4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34c754fa0f364622a4433b9ba5718857', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1c375ca4-4abf-4405-95d5-43748e715058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f21a560c-012c-4374-b9ef-0dc124a433df, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=12aa3fda-2986-4025-bda8-bcc4b6872810) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.024 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 12aa3fda-2986-4025-bda8-bcc4b6872810 in datapath f4180897-fb47-4ee3-b86e-380da38f2ec5 bound to our chassis
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.027 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:24:43 compute-1 systemd-machined[152783]: New machine qemu-24-instance-00000031.
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.044 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c4755ad3-c80f-4e6e-a6f8-0a97f387bb7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.045 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4180897-f1 in ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.048 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4180897-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.048 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a0abe0-7720-47e0-b006-2a09361870f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.049 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf19f08-361d-4d7e-a9c0-ee202838aef2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 systemd[1]: Started Virtual Machine qemu-24-instance-00000031.
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.070 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[54ce486a-a739-4737-9a3c-a3570cc4a32f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.106 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cd797a-a973-4551-b06e-43c4eb41cc83]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.155 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[cca7d51c-3e40-447c-8ef8-bf3cc48aa435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.164 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1dbd52e1-1d92-4886-bae4-3f314e51bcc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 NetworkManager[51724]: <info>  [1759267483.1667] manager: (tapf4180897-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.217 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[bd7e7812-9760-48e8-9099-583e7517f129]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.221 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[efcbce77-aae3-4d05-a3de-75c920861668]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 NetworkManager[51724]: <info>  [1759267483.2535] device (tapf4180897-f0): carrier: link connected
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.262 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0fd5fc-a585-4397-877d-7ee85bd7b5ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.281 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4a0fbe-0ec0-4890-8853-d0ef8287a690]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4180897-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:a9:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417892, 'reachable_time': 16605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227663, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.297 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8dec020d-7b31-44ca-8705-033d6213972e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:a9c4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417892, 'tstamp': 417892}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227664, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.326 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bef6e86a-1959-4996-a4dc-1d43c36de02a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4180897-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:a9:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417892, 'reachable_time': 16605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227665, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.375 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9badb6d2-0adf-4cda-95c9-b8d90d499f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.462 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5e15e2-d36f-436d-a9a5-539265244f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.463 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4180897-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.463 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.464 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4180897-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:43 compute-1 NetworkManager[51724]: <info>  [1759267483.4665] manager: (tapf4180897-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Sep 30 21:24:43 compute-1 kernel: tapf4180897-f0: entered promiscuous mode
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.470 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4180897-f0, col_values=(('external_ids', {'iface-id': 'f5229ae3-c5a0-4511-b62d-05f7c88709d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:43 compute-1 ovn_controller[94902]: 2025-09-30T21:24:43Z|00194|binding|INFO|Releasing lport f5229ae3-c5a0-4511-b62d-05f7c88709d9 from this chassis (sb_readonly=0)
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.474 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.475 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1732ff-9bcf-4d77-b281-0f29a0c4f6d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.476 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:24:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:43.476 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'env', 'PROCESS_TAG=haproxy-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4180897-fb47-4ee3-b86e-380da38f2ec5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.753 2 DEBUG nova.compute.manager [req-050454db-78c9-4118-8759-fb6d58572b77 req-a15de02a-679f-4e4b-bbc7-917c97dd04e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received event network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.754 2 DEBUG oslo_concurrency.lockutils [req-050454db-78c9-4118-8759-fb6d58572b77 req-a15de02a-679f-4e4b-bbc7-917c97dd04e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.754 2 DEBUG oslo_concurrency.lockutils [req-050454db-78c9-4118-8759-fb6d58572b77 req-a15de02a-679f-4e4b-bbc7-917c97dd04e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.754 2 DEBUG oslo_concurrency.lockutils [req-050454db-78c9-4118-8759-fb6d58572b77 req-a15de02a-679f-4e4b-bbc7-917c97dd04e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.755 2 DEBUG nova.compute.manager [req-050454db-78c9-4118-8759-fb6d58572b77 req-a15de02a-679f-4e4b-bbc7-917c97dd04e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Processing event network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:24:43 compute-1 podman[227704]: 2025-09-30 21:24:43.920830497 +0000 UTC m=+0.059152084 container create 621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:24:43 compute-1 systemd[1]: Started libpod-conmon-621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63.scope.
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.966 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267483.9655855, dc03304c-f076-4627-9b00-265c7d559784 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.967 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] VM Started (Lifecycle Event)
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.969 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.982 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:24:43 compute-1 podman[227704]: 2025-09-30 21:24:43.892324959 +0000 UTC m=+0.030646546 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.991 2 INFO nova.virt.libvirt.driver [-] [instance: dc03304c-f076-4627-9b00-265c7d559784] Instance spawned successfully.
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.992 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:24:43 compute-1 nova_compute[192795]: 2025-09-30 21:24:43.996 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.002 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:44 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:24:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68cedf5716b1e0efb5bcbb7b579d5bc85681d91d6f528455ba7e3b2372f24dae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.015 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}14e5db1849b70689dbb8f5c07f9763ac1a74da54d89edda9de905bf920d22a92" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.023 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.024 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267483.9657211, dc03304c-f076-4627-9b00-265c7d559784 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.024 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] VM Paused (Lifecycle Event)
Sep 30 21:24:44 compute-1 podman[227704]: 2025-09-30 21:24:44.027854507 +0000 UTC m=+0.166176094 container init 621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.030 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.031 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.031 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.031 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.032 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.032 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:44 compute-1 podman[227704]: 2025-09-30 21:24:44.03576193 +0000 UTC m=+0.174083507 container start 621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.055 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.062 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267483.9754713, dc03304c-f076-4627-9b00-265c7d559784 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.062 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] VM Resumed (Lifecycle Event)
Sep 30 21:24:44 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [NOTICE]   (227755) : New worker (227770) forked
Sep 30 21:24:44 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [NOTICE]   (227755) : Loading success.
Sep 30 21:24:44 compute-1 podman[227721]: 2025-09-30 21:24:44.079333793 +0000 UTC m=+0.098405211 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:24:44 compute-1 podman[227720]: 2025-09-30 21:24:44.093313649 +0000 UTC m=+0.119864508 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public)
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.095 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.101 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.139 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.140 2 INFO nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Took 8.02 seconds to spawn the instance on the hypervisor.
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.140 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.268 2 INFO nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Took 9.03 seconds to build instance.
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.270 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Tue, 30 Sep 2025 21:24:44 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-112bf29d-8f56-4ca7-a261-bd538837b6b1 x-openstack-request-id: req-112bf29d-8f56-4ca7-a261-bd538837b6b1 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.270 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "afe5c12d-500a-499b-9438-9e9c37698acc", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}]}, {"id": "c9779bca-1eb6-4567-a36c-b452abeafc70", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.270 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-112bf29d-8f56-4ca7-a261-bd538837b6b1 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.272 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}14e5db1849b70689dbb8f5c07f9763ac1a74da54d89edda9de905bf920d22a92" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.305 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.345 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Tue, 30 Sep 2025 21:24:44 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2c190f79-2b5c-45d6-98d2-456dd2b9d4a6 x-openstack-request-id: req-2c190f79-2b5c-45d6-98d2-456dd2b9d4a6 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.346 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "afe5c12d-500a-499b-9438-9e9c37698acc", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.346 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc used request id req-2c190f79-2b5c-45d6-98d2-456dd2b9d4a6 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.347 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'dc03304c-f076-4627-9b00-265c7d559784', 'name': 'tempest-tempest.common.compute-instance-966346934-2', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000031', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '34c754fa0f364622a4433b9ba5718857', 'user_id': '1a8900f8597741ad930d414e1db02d76', 'hostId': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.367 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.368 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00d238fa-0376-4863-b2fd-78165c100d53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.347865', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2362d5e-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': 'bbbe8c7027d6464e006f0547b0ec8c938f26a65e961f12ab20faabac2aebcc82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': 
None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.347865', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e23647f8-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': 'b0b3546672425d0ef8cb21cae17f7ee1bceaf57b98d34a3fa9c2418edbca71c8'}]}, 'timestamp': '2025-09-30 21:24:44.369401', '_unique_id': '11e96bbf79304b7997230b6bdd506157'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.371 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.373 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.378 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for dc03304c-f076-4627-9b00-265c7d559784 / tap12aa3fda-29 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.378 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20bb74b5-776a-4407-804d-baed21e2b0fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.373691', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e237d366-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': 'dc2c68c308e6ee19c6d36a27aa279943cdfa77bff7a2160fdefd2991cc0f5d62'}]}, 'timestamp': '2025-09-30 21:24:44.379563', '_unique_id': '187a0464a59c4d8d9cd0fa708aa2445b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.380 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.382 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24fcd090-348f-4a6a-b304-58eb9b1f18cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.382573', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e23861a0-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': 'd6d5ad9f85f11d967ee9ee392bf7f4353c8be2d2349a76d85432be8a5b90570e'}]}, 'timestamp': '2025-09-30 21:24:44.383123', '_unique_id': 'bb5d6b087a1549a9ae7c6290692ad756'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.384 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.385 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.385 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67a2baaa-7486-430b-80d3-39e6ea62ed2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.385780', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e238dedc-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': '8bcd3aa51c7f26fe9a62b6aeb486f8baa557b10dcfc6c4994702a05e8a9532ef'}]}, 'timestamp': '2025-09-30 21:24:44.386366', '_unique_id': 'ac8999a2ee024059a719dd9f592d584d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.387 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.388 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.400 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.400 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6df6f901-2b99-49bf-bca3-19f5c36366fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.388857', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e23b142c-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.121661022, 'message_signature': '5e5cff8dc0e40d7e0a28e9a84c17c17ca61745678400f6169c78b3e75f44761b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.388857', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e23b2958-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.121661022, 'message_signature': 'd3a5a5dcdf814bc3f724b5e195a0dc831d676e64121ad1d54503d91a2135dbee'}]}, 'timestamp': '2025-09-30 21:24:44.401361', '_unique_id': 'f72ff088d08d451cbb6e7c2a58e98cb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.403 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.404 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.404 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03ab521e-e659-430f-a30f-b4e3fc7ba0d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.404719', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e23bc322-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': '7e94b35dd091172d4db8fb98b4812a258f445eb5d5d5d9f4738277dbc21fc89d'}]}, 'timestamp': '2025-09-30 21:24:44.405270', '_unique_id': '424363d1198a45cbb6db768cd8374d5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.406 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.407 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.408 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.408 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ebb06e3-efee-4184-9c22-aa24b6d25e49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.408036', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e23c47de-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '70daf446f44859a3e2f6303ca7ce0e27bd5b02d63816731a810fbe601f3dd8d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.408036', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e23c5fa8-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '1de99ba7701df045c37d7fa24e70782b11e5815e8da3c75e4f4224981e5c8707'}]}, 'timestamp': '2025-09-30 21:24:44.409361', '_unique_id': '44d64f111f0c414f83a39a5709ce8de1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.410 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.412 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.412 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>]
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.413 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.413 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0b75430-a37c-43ff-9b45-8747d78071a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.413663', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e23d24ec-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': '841f57ae48c5137ad011027ebff0f6efbe7a821e784f19a5a2b93d0ff8fcdafa'}]}, 'timestamp': '2025-09-30 21:24:44.414489', '_unique_id': 'c0aac4e02d6e4b8bb1496aaaaaa4201f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.415 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.418 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.439 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/cpu volume: 380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8aa67ec-5c1c-4452-893d-ea669ace5abf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 380000000, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'timestamp': '2025-09-30T21:24:44.418221', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e2411d54-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.172186722, 'message_signature': '2b32a45161b66c81c02064c76d6b2b18d985565775a349b8b8c842fe4d75289e'}]}, 'timestamp': '2025-09-30 21:24:44.440484', '_unique_id': '5d51c111d8c74ef483dbcaa5a2c258d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.442 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.443 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.443 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.444 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cd74183-9342-41e0-a074-5f1e9978a72f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.443950', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e241bea8-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '39c16e9a09eb978f6175084cd8cb1e145e99bd2c92b44415594226628ccd53bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.443950', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e241d456-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '5bf91051c7cb3c506c5918088ec2c1ec8216d756fc377126453d585fc47591ee'}]}, 'timestamp': '2025-09-30 21:24:44.445008', '_unique_id': '0baa2ddae9564d47835ebc30a0a833a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.446 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.447 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.447 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16f825e7-8ea6-45ae-9232-2a3da77df945', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.447657', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e2424f76-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': '6f3e974af6469746e2429ce26b56c44ee2db8e62bece4386776b9c0548ecd411'}]}, 'timestamp': '2025-09-30 21:24:44.448171', '_unique_id': 'be39252b1c6e43f890d496aa56e019a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.449 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.450 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.451 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02a2d3ed-8b46-4016-8788-fa6fba2bea91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.450617', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e242c262-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': 'b1717ab960331ed6f6e1b25dc2b10361755ca2052142f2c4940ab7963e8dc522'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.450617', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e242d4e6-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '709414b9b5f35ca618bef0a814f3a403bae35b2a618cd0dce45bb4f40ddca235'}]}, 'timestamp': '2025-09-30 21:24:44.451560', '_unique_id': '5af98315efb74feeb694238e903f7566'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.452 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.454 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.454 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>]
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.454 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.454 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.454 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance dc03304c-f076-4627-9b00-265c7d559784: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.455 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3a5ea17-5aa7-4198-a16f-4780ff37c303', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.455282', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e2437b8a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': '264d5b1c4270e80b25c09ac0087f02cf8be94f59ddb2def586ea1ae8246647e9'}]}, 'timestamp': '2025-09-30 21:24:44.455858', '_unique_id': '0e71fba5590747588b71fdf5cb3872a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.456 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.458 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.458 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06561e8a-c758-4910-88e9-6e6a2ead0a64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.458210', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e243ee58-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': '749454f9c8867c1b11c28f41f9b6987f3fbfc6fa4c538f9f791fabad6f912f2a'}]}, 'timestamp': '2025-09-30 21:24:44.458798', '_unique_id': '925eb945afd74b078db67b8d8d366bb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.459 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.461 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.461 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.461 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>]
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.461 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.462 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.462 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0525902-333f-426c-98d5-b25157c59b01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.461974', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2447e72-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '3730847326ffef13493f3646dbac2bedd8185ea6487f9eedfdf41916de69ded7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.461974', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e244960a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '67c1c2ea8c2997382a361149a6500a6232f37d5fb1cd8e7ab9f6b88be2a1c80d'}]}, 'timestamp': '2025-09-30 21:24:44.463097', '_unique_id': '9a841b04d71c4669ad641e524710e409'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.464 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.465 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.465 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.466 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4d94fbb-84eb-435b-b49e-c5eb50cd1e96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.465851', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e245162a-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.121661022, 'message_signature': '5b1968e0d4bf17095b9dfb699b2770de1b543c6c28073d9fd0de7d18039f318f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.465851', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e24529f8-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.121661022, 'message_signature': '365a8dc7a9e8defa253c976b8c31abddd2b6e54e3834e8a537f211341a7da2cf'}]}, 'timestamp': '2025-09-30 21:24:44.466922', '_unique_id': '4c22213f6fc043e2a3604d41e51f17b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.468 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.469 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.470 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e57f31b3-b73a-4589-b253-c3f74db8d371', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.469971', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e245bb02-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': '76301308977b3bd41e57000d4839aea06e3415c0b33829167e26448d826d0edf'}]}, 'timestamp': '2025-09-30 21:24:44.470582', '_unique_id': '44849d3db6e74b0d944898ed39942325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.471 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.472 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.472 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.472 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-2>]
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.472 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.473 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.473 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6c30968-44e1-4713-ba4d-ce011e9562e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.473011', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2462bd2-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': '65c2f92a975ebb1f9bbab192702d30a96d4faf0a3e26e629073eed29358831aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.473011', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2463aaa-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.08071452, 'message_signature': 'db8ca832b83f026f9862e7e6860a9308abcd849874916d95ac9b20da38a42fc6'}]}, 'timestamp': '2025-09-30 21:24:44.473769', '_unique_id': 'e149caaa745c4650b16d4744758655b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.474 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.475 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.475 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '529ffb52-36f1-49fc-8715-92d9d66ab6d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000031-dc03304c-f076-4627-9b00-265c7d559784-tap12aa3fda-29', 'timestamp': '2025-09-30T21:24:44.475864', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'tap12aa3fda-29', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:40:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap12aa3fda-29'}, 'message_id': 'e2469e28-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.106572046, 'message_signature': 'd7e92c71169dfc5228a3d1d51664f23e4b94fcfd0bbec8011ac8f9f9a01d2645'}]}, 'timestamp': '2025-09-30 21:24:44.476427', '_unique_id': 'aa5993ba9c07470db394a830da10feb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.477 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.478 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.478 12 DEBUG ceilometer.compute.pollsters [-] dc03304c-f076-4627-9b00-265c7d559784/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '308cb00e-4169-4e04-b87f-4535be0c3c25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-vda', 'timestamp': '2025-09-30T21:24:44.478042', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e246ef22-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.121661022, 'message_signature': '810d91793260c33fb90964ed2591bb415a4ed637272ac1168aeb0f74bd2b3ab0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'dc03304c-f076-4627-9b00-265c7d559784-sda', 'timestamp': '2025-09-30T21:24:44.478042', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-2', 'name': 'instance-00000031', 'instance_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'instance_type': 'm1.nano', 'host': 'cae69157580c46184853f2cbffc086f7e95186b95efbc770daeae78a', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e246fc10-9e43-11f0-984a-fa163e8033fc', 'monotonic_time': 4180.121661022, 'message_signature': '040ebf240c53229ea9854629264cd217f82f6323c184742adaeb78b2b06afde9'}]}, 'timestamp': '2025-09-30 21:24:44.478700', '_unique_id': '23e12a8397044b7e9300940a795847ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:24:44.479 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.564 2 DEBUG nova.network.neutron [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Updated VIF entry in instance network info cache for port 12aa3fda-2986-4025-bda8-bcc4b6872810. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.565 2 DEBUG nova.network.neutron [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Updating instance_info_cache with network_info: [{"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.585 2 DEBUG oslo_concurrency.lockutils [req-2d6f21e7-ebb7-44f9-a156-b7bd7f5a2f88 req-7ace629c-1b29-4ab3-9eaa-006c889ed378 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-dc03304c-f076-4627-9b00-265c7d559784" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:44 compute-1 nova_compute[192795]: 2025-09-30 21:24:44.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:45 compute-1 nova_compute[192795]: 2025-09-30 21:24:45.860 2 DEBUG nova.compute.manager [req-9867d712-23a0-40f6-a846-58e0660be311 req-aa562e17-be4e-47f3-a806-040bce6fedd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received event network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:45 compute-1 nova_compute[192795]: 2025-09-30 21:24:45.860 2 DEBUG oslo_concurrency.lockutils [req-9867d712-23a0-40f6-a846-58e0660be311 req-aa562e17-be4e-47f3-a806-040bce6fedd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:45 compute-1 nova_compute[192795]: 2025-09-30 21:24:45.860 2 DEBUG oslo_concurrency.lockutils [req-9867d712-23a0-40f6-a846-58e0660be311 req-aa562e17-be4e-47f3-a806-040bce6fedd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:45 compute-1 nova_compute[192795]: 2025-09-30 21:24:45.861 2 DEBUG oslo_concurrency.lockutils [req-9867d712-23a0-40f6-a846-58e0660be311 req-aa562e17-be4e-47f3-a806-040bce6fedd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:45 compute-1 nova_compute[192795]: 2025-09-30 21:24:45.861 2 DEBUG nova.compute.manager [req-9867d712-23a0-40f6-a846-58e0660be311 req-aa562e17-be4e-47f3-a806-040bce6fedd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] No waiting events found dispatching network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:24:45 compute-1 nova_compute[192795]: 2025-09-30 21:24:45.861 2 WARNING nova.compute.manager [req-9867d712-23a0-40f6-a846-58e0660be311 req-aa562e17-be4e-47f3-a806-040bce6fedd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received unexpected event network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 for instance with vm_state active and task_state None.
Sep 30 21:24:46 compute-1 nova_compute[192795]: 2025-09-30 21:24:46.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:46 compute-1 nova_compute[192795]: 2025-09-30 21:24:46.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:24:46 compute-1 nova_compute[192795]: 2025-09-30 21:24:46.725 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:24:46 compute-1 nova_compute[192795]: 2025-09-30 21:24:46.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:46 compute-1 nova_compute[192795]: 2025-09-30 21:24:46.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.277 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.278 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.279 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.279 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.280 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.296 2 INFO nova.compute.manager [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Terminating instance
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.316 2 DEBUG nova.compute.manager [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:24:48 compute-1 kernel: tap12aa3fda-29 (unregistering): left promiscuous mode
Sep 30 21:24:48 compute-1 NetworkManager[51724]: <info>  [1759267488.3415] device (tap12aa3fda-29): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:24:48 compute-1 ovn_controller[94902]: 2025-09-30T21:24:48Z|00195|binding|INFO|Releasing lport 12aa3fda-2986-4025-bda8-bcc4b6872810 from this chassis (sb_readonly=0)
Sep 30 21:24:48 compute-1 ovn_controller[94902]: 2025-09-30T21:24:48Z|00196|binding|INFO|Setting lport 12aa3fda-2986-4025-bda8-bcc4b6872810 down in Southbound
Sep 30 21:24:48 compute-1 ovn_controller[94902]: 2025-09-30T21:24:48Z|00197|binding|INFO|Removing iface tap12aa3fda-29 ovn-installed in OVS
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.366 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:40:d4 10.100.0.11'], port_security=['fa:16:3e:39:40:d4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'dc03304c-f076-4627-9b00-265c7d559784', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34c754fa0f364622a4433b9ba5718857', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1c375ca4-4abf-4405-95d5-43748e715058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f21a560c-012c-4374-b9ef-0dc124a433df, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=12aa3fda-2986-4025-bda8-bcc4b6872810) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.368 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 12aa3fda-2986-4025-bda8-bcc4b6872810 in datapath f4180897-fb47-4ee3-b86e-380da38f2ec5 unbound from our chassis
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.371 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4180897-fb47-4ee3-b86e-380da38f2ec5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.372 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a273b868-362b-4390-8f49-8a411bf2b375]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.373 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 namespace which is not needed anymore
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000031.scope: Deactivated successfully.
Sep 30 21:24:48 compute-1 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000031.scope: Consumed 5.268s CPU time.
Sep 30 21:24:48 compute-1 systemd-machined[152783]: Machine qemu-24-instance-00000031 terminated.
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.610 2 INFO nova.virt.libvirt.driver [-] [instance: dc03304c-f076-4627-9b00-265c7d559784] Instance destroyed successfully.
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.611 2 DEBUG nova.objects.instance [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'resources' on Instance uuid dc03304c-f076-4627-9b00-265c7d559784 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:48 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [NOTICE]   (227755) : haproxy version is 2.8.14-c23fe91
Sep 30 21:24:48 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [NOTICE]   (227755) : path to executable is /usr/sbin/haproxy
Sep 30 21:24:48 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [WARNING]  (227755) : Exiting Master process...
Sep 30 21:24:48 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [WARNING]  (227755) : Exiting Master process...
Sep 30 21:24:48 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [ALERT]    (227755) : Current worker (227770) exited with code 143 (Terminated)
Sep 30 21:24:48 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227722]: [WARNING]  (227755) : All workers exited. Exiting... (0)
Sep 30 21:24:48 compute-1 systemd[1]: libpod-621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63.scope: Deactivated successfully.
Sep 30 21:24:48 compute-1 podman[227803]: 2025-09-30 21:24:48.635083359 +0000 UTC m=+0.070132685 container died 621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.643 2 DEBUG nova.virt.libvirt.vif [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-966346934',display_name='tempest-tempest.common.compute-instance-966346934-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-966346934-2',id=49,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-09-30T21:24:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-zs1xvy19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:24:44Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=dc03304c-f076-4627-9b00-265c7d559784,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.643 2 DEBUG nova.network.os_vif_util [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "12aa3fda-2986-4025-bda8-bcc4b6872810", "address": "fa:16:3e:39:40:d4", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12aa3fda-29", "ovs_interfaceid": "12aa3fda-2986-4025-bda8-bcc4b6872810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.644 2 DEBUG nova.network.os_vif_util [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:40:d4,bridge_name='br-int',has_traffic_filtering=True,id=12aa3fda-2986-4025-bda8-bcc4b6872810,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12aa3fda-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.644 2 DEBUG os_vif [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:40:d4,bridge_name='br-int',has_traffic_filtering=True,id=12aa3fda-2986-4025-bda8-bcc4b6872810,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12aa3fda-29') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.649 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12aa3fda-29, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.659 2 INFO os_vif [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:40:d4,bridge_name='br-int',has_traffic_filtering=True,id=12aa3fda-2986-4025-bda8-bcc4b6872810,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12aa3fda-29')
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.661 2 INFO nova.virt.libvirt.driver [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Deleting instance files /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784_del
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.663 2 INFO nova.virt.libvirt.driver [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Deletion of /var/lib/nova/instances/dc03304c-f076-4627-9b00-265c7d559784_del complete
Sep 30 21:24:48 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63-userdata-shm.mount: Deactivated successfully.
Sep 30 21:24:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-68cedf5716b1e0efb5bcbb7b579d5bc85681d91d6f528455ba7e3b2372f24dae-merged.mount: Deactivated successfully.
Sep 30 21:24:48 compute-1 podman[227803]: 2025-09-30 21:24:48.692263236 +0000 UTC m=+0.127312562 container cleanup 621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:24:48 compute-1 systemd[1]: libpod-conmon-621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63.scope: Deactivated successfully.
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.725 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.726 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.726 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.775 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.776 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:24:48 compute-1 podman[227849]: 2025-09-30 21:24:48.782799001 +0000 UTC m=+0.058325910 container remove 621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.791 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[70a71461-0c1d-4115-8ab4-2cb7fd0875ba]: (4, ('Tue Sep 30 09:24:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 (621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63)\n621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63\nTue Sep 30 09:24:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 (621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63)\n621ee269fa2e00e4de551d9d7631769e3657b230ad3fe1d5447ec5786986fc63\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.794 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e08222dc-6241-4e33-bc32-a9e846cecde4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.794 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4180897-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 kernel: tapf4180897-f0: left promiscuous mode
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.814 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9b896e-9f03-4a08-b7e6-135b3b1ad82e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.843 2 INFO nova.compute.manager [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Took 0.53 seconds to destroy the instance on the hypervisor.
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.843 2 DEBUG oslo.service.loopingcall [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.844 2 DEBUG nova.compute.manager [-] [instance: dc03304c-f076-4627-9b00-265c7d559784] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:24:48 compute-1 nova_compute[192795]: 2025-09-30 21:24:48.844 2 DEBUG nova.network.neutron [-] [instance: dc03304c-f076-4627-9b00-265c7d559784] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.847 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1769c1-b6c1-4943-adb6-e434f9a00fb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.849 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d8a159-67fb-4ad5-9e20-a14b6866ea18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.864 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a55b419d-556b-4035-b677-8c99e18b86b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417881, 'reachable_time': 36494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227861, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.867 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:24:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:24:48.867 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[25804335-7fe4-48cd-aaec-2efa1737849e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-1 systemd[1]: run-netns-ovnmeta\x2df4180897\x2dfb47\x2d4ee3\x2db86e\x2d380da38f2ec5.mount: Deactivated successfully.
Sep 30 21:24:49 compute-1 nova_compute[192795]: 2025-09-30 21:24:49.587 2 DEBUG nova.compute.manager [req-feaa987d-6427-4ea0-9d59-2e668171ec3c req-92856f2f-c77f-4131-a28d-731c378c31ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received event network-vif-unplugged-12aa3fda-2986-4025-bda8-bcc4b6872810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:49 compute-1 nova_compute[192795]: 2025-09-30 21:24:49.588 2 DEBUG oslo_concurrency.lockutils [req-feaa987d-6427-4ea0-9d59-2e668171ec3c req-92856f2f-c77f-4131-a28d-731c378c31ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:49 compute-1 nova_compute[192795]: 2025-09-30 21:24:49.588 2 DEBUG oslo_concurrency.lockutils [req-feaa987d-6427-4ea0-9d59-2e668171ec3c req-92856f2f-c77f-4131-a28d-731c378c31ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:49 compute-1 nova_compute[192795]: 2025-09-30 21:24:49.589 2 DEBUG oslo_concurrency.lockutils [req-feaa987d-6427-4ea0-9d59-2e668171ec3c req-92856f2f-c77f-4131-a28d-731c378c31ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:49 compute-1 nova_compute[192795]: 2025-09-30 21:24:49.589 2 DEBUG nova.compute.manager [req-feaa987d-6427-4ea0-9d59-2e668171ec3c req-92856f2f-c77f-4131-a28d-731c378c31ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] No waiting events found dispatching network-vif-unplugged-12aa3fda-2986-4025-bda8-bcc4b6872810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:24:49 compute-1 nova_compute[192795]: 2025-09-30 21:24:49.589 2 DEBUG nova.compute.manager [req-feaa987d-6427-4ea0-9d59-2e668171ec3c req-92856f2f-c77f-4131-a28d-731c378c31ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received event network-vif-unplugged-12aa3fda-2986-4025-bda8-bcc4b6872810 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.135 2 DEBUG nova.network.neutron [-] [instance: dc03304c-f076-4627-9b00-265c7d559784] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.243 2 DEBUG nova.compute.manager [req-c06b1cba-f1ec-496d-b7c7-78d3b068643d req-58d67f6e-7dc5-44cc-8b34-980816524c00 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received event network-vif-deleted-12aa3fda-2986-4025-bda8-bcc4b6872810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.244 2 INFO nova.compute.manager [req-c06b1cba-f1ec-496d-b7c7-78d3b068643d req-58d67f6e-7dc5-44cc-8b34-980816524c00 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Neutron deleted interface 12aa3fda-2986-4025-bda8-bcc4b6872810; detaching it from the instance and deleting it from the info cache
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.244 2 DEBUG nova.network.neutron [req-c06b1cba-f1ec-496d-b7c7-78d3b068643d req-58d67f6e-7dc5-44cc-8b34-980816524c00 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.250 2 INFO nova.compute.manager [-] [instance: dc03304c-f076-4627-9b00-265c7d559784] Took 1.41 seconds to deallocate network for instance.
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.274 2 DEBUG nova.compute.manager [req-c06b1cba-f1ec-496d-b7c7-78d3b068643d req-58d67f6e-7dc5-44cc-8b34-980816524c00 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Detach interface failed, port_id=12aa3fda-2986-4025-bda8-bcc4b6872810, reason: Instance dc03304c-f076-4627-9b00-265c7d559784 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.392 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.393 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.606 2 DEBUG nova.compute.provider_tree [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.632 2 DEBUG nova.scheduler.client.report [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.656 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.762 2 INFO nova.scheduler.client.report [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Deleted allocations for instance dc03304c-f076-4627-9b00-265c7d559784
Sep 30 21:24:50 compute-1 nova_compute[192795]: 2025-09-30 21:24:50.875 2 DEBUG oslo_concurrency.lockutils [None req-ac2125da-0da8-41a5-bb73-cf8a15cc2ba8 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.766 2 DEBUG nova.compute.manager [req-a7f8e5d6-eaf0-4019-94bc-1ff84690cf41 req-c9ed3b75-a49e-4db6-9473-f6b1a8123feb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received event network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.767 2 DEBUG oslo_concurrency.lockutils [req-a7f8e5d6-eaf0-4019-94bc-1ff84690cf41 req-c9ed3b75-a49e-4db6-9473-f6b1a8123feb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dc03304c-f076-4627-9b00-265c7d559784-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.767 2 DEBUG oslo_concurrency.lockutils [req-a7f8e5d6-eaf0-4019-94bc-1ff84690cf41 req-c9ed3b75-a49e-4db6-9473-f6b1a8123feb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.768 2 DEBUG oslo_concurrency.lockutils [req-a7f8e5d6-eaf0-4019-94bc-1ff84690cf41 req-c9ed3b75-a49e-4db6-9473-f6b1a8123feb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dc03304c-f076-4627-9b00-265c7d559784-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.768 2 DEBUG nova.compute.manager [req-a7f8e5d6-eaf0-4019-94bc-1ff84690cf41 req-c9ed3b75-a49e-4db6-9473-f6b1a8123feb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] No waiting events found dispatching network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.768 2 WARNING nova.compute.manager [req-a7f8e5d6-eaf0-4019-94bc-1ff84690cf41 req-c9ed3b75-a49e-4db6-9473-f6b1a8123feb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dc03304c-f076-4627-9b00-265c7d559784] Received unexpected event network-vif-plugged-12aa3fda-2986-4025-bda8-bcc4b6872810 for instance with vm_state deleted and task_state None.
Sep 30 21:24:51 compute-1 nova_compute[192795]: 2025-09-30 21:24:51.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:52 compute-1 nova_compute[192795]: 2025-09-30 21:24:52.370 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267477.3698654, e704a3db-d970-44f4-8b39-b304bbac4a69 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:52 compute-1 nova_compute[192795]: 2025-09-30 21:24:52.371 2 INFO nova.compute.manager [-] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] VM Stopped (Lifecycle Event)
Sep 30 21:24:52 compute-1 nova_compute[192795]: 2025-09-30 21:24:52.389 2 DEBUG nova.compute.manager [None req-df1dfaa0-d167-44f2-aec9-12e0969054cc - - - - - -] [instance: e704a3db-d970-44f4-8b39-b304bbac4a69] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:53 compute-1 podman[227862]: 2025-09-30 21:24:53.259222195 +0000 UTC m=+0.097358486 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:24:53 compute-1 nova_compute[192795]: 2025-09-30 21:24:53.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.338 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.338 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.364 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.481 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.482 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.491 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.491 2 INFO nova.compute.claims [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.667 2 DEBUG nova.compute.provider_tree [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.699 2 DEBUG nova.scheduler.client.report [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.732 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.733 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.813 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.814 2 DEBUG nova.network.neutron [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.837 2 INFO nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:24:55 compute-1 nova_compute[192795]: 2025-09-30 21:24:55.877 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.028 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.029 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.030 2 INFO nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Creating image(s)
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.030 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "/var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.031 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "/var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.031 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "/var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.048 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.127 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.128 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.129 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.143 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.193 2 DEBUG nova.policy [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.213 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.214 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.253 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.255 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.255 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.319 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.320 2 DEBUG nova.virt.disk.api [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Checking if we can resize image /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.321 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.383 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.385 2 DEBUG nova.virt.disk.api [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Cannot resize image /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.385 2 DEBUG nova.objects.instance [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'migration_context' on Instance uuid 1ecb1653-d3a8-40e1-9547-bc224d6db6b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.423 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.424 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Ensure instance console log exists: /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.424 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.425 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.425 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.957 2 DEBUG nova.network.neutron [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Successfully created port: 63d53aab-c63f-434b-9733-622094f23583 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:24:56 compute-1 nova_compute[192795]: 2025-09-30 21:24:56.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.124 2 DEBUG nova.network.neutron [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Successfully updated port: 63d53aab-c63f-434b-9733-622094f23583 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.145 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "refresh_cache-1ecb1653-d3a8-40e1-9547-bc224d6db6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.145 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquired lock "refresh_cache-1ecb1653-d3a8-40e1-9547-bc224d6db6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.145 2 DEBUG nova.network.neutron [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.305 2 DEBUG nova.compute.manager [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received event network-changed-63d53aab-c63f-434b-9733-622094f23583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.305 2 DEBUG nova.compute.manager [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Refreshing instance network info cache due to event network-changed-63d53aab-c63f-434b-9733-622094f23583. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.306 2 DEBUG oslo_concurrency.lockutils [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-1ecb1653-d3a8-40e1-9547-bc224d6db6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.332 2 DEBUG nova.network.neutron [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:24:58 compute-1 nova_compute[192795]: 2025-09-30 21:24:58.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.647 2 DEBUG nova.network.neutron [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Updating instance_info_cache with network_info: [{"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.671 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Releasing lock "refresh_cache-1ecb1653-d3a8-40e1-9547-bc224d6db6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.672 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Instance network_info: |[{"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.672 2 DEBUG oslo_concurrency.lockutils [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-1ecb1653-d3a8-40e1-9547-bc224d6db6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.673 2 DEBUG nova.network.neutron [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Refreshing network info cache for port 63d53aab-c63f-434b-9733-622094f23583 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.678 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Start _get_guest_xml network_info=[{"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.685 2 WARNING nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.695 2 DEBUG nova.virt.libvirt.host [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.696 2 DEBUG nova.virt.libvirt.host [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.703 2 DEBUG nova.virt.libvirt.host [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.704 2 DEBUG nova.virt.libvirt.host [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.705 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.706 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.707 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.707 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.708 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.708 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.708 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.709 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.709 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.710 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.710 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.711 2 DEBUG nova.virt.hardware [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
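The topology walk above (build possible topologies, then sort by preference) can be approximated by enumerating factorizations of the vCPU count within the logged per-dimension limits of 65536. This is a simplified sketch, not nova's exact `_get_possible_cpu_topologies()`:

```python
def possible_cpu_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals
    vcpus, capped per dimension -- a simplified model of the
    'Build topologies ... Got N possible topologies' steps logged above."""
    topologies = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                topologies.append((sockets, cores, threads))
    return topologies
```

For the 1-vCPU m1.nano flavor this yields only (1, 1, 1), matching the "Got 1 possible topologies" and "Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]" lines.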
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.718 2 DEBUG nova.virt.libvirt.vif [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1673531770',display_name='tempest-MultipleCreateTestJSON-server-1673531770-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1673531770-1',id=52,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-8qkebqs0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:24:55Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=1ecb1653-d3a8-40e1-9547-bc224d6db6b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.718 2 DEBUG nova.network.os_vif_util [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.719 2 DEBUG nova.network.os_vif_util [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:98:f1,bridge_name='br-int',has_traffic_filtering=True,id=63d53aab-c63f-434b-9733-622094f23583,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63d53aab-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.721 2 DEBUG nova.objects.instance [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1ecb1653-d3a8-40e1-9547-bc224d6db6b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.737 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <uuid>1ecb1653-d3a8-40e1-9547-bc224d6db6b9</uuid>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <name>instance-00000034</name>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <nova:name>tempest-MultipleCreateTestJSON-server-1673531770-1</nova:name>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:24:59</nova:creationTime>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:user uuid="1a8900f8597741ad930d414e1db02d76">tempest-MultipleCreateTestJSON-413721927-project-member</nova:user>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:project uuid="34c754fa0f364622a4433b9ba5718857">tempest-MultipleCreateTestJSON-413721927</nova:project>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         <nova:port uuid="63d53aab-c63f-434b-9733-622094f23583">
Sep 30 21:24:59 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <system>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <entry name="serial">1ecb1653-d3a8-40e1-9547-bc224d6db6b9</entry>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <entry name="uuid">1ecb1653-d3a8-40e1-9547-bc224d6db6b9</entry>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </system>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <os>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   </os>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <features>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   </features>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk.config"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:ac:98:f1"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <target dev="tap63d53aab-c6"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/console.log" append="off"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <video>
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </video>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:24:59 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:24:59 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:24:59 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:24:59 compute-1 nova_compute[192795]: </domain>
Sep 30 21:24:59 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
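The finished domain definition can be inspected with the standard library. A sketch against a trimmed copy of the XML logged above (values taken from the log; the full document carries more devices, and `xml.etree` is just one convenient parser):

```python
import xml.etree.ElementTree as ET

# Trimmed subset of the domain XML emitted by _get_guest_xml above.
DOMAIN_XML = """<domain type="kvm">
  <uuid>1ecb1653-d3a8-40e1-9547-bc224d6db6b9</uuid>
  <name>instance-00000034</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <devices>
    <disk type="file" device="disk">
      <driver name="qemu" type="qcow2" cache="none"/>
      <target dev="vda" bus="virtio"/>
    </disk>
    <interface type="ethernet">
      <mac address="fa:16:3e:ac:98:f1"/>
    </interface>
  </devices>
</domain>"""

root = ET.fromstring(DOMAIN_XML)
uuid = root.findtext("uuid")
# libvirt <memory> defaults to KiB: 131072 KiB = the flavor's 128 MiB.
memory_kib = int(root.findtext("memory"))
disk_targets = [d.find("target").get("dev") for d in root.iter("disk")]
mac = root.find(".//interface/mac").get("address")
```
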
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.738 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Preparing to wait for external event network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.739 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.739 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.740 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.740 2 DEBUG nova.virt.libvirt.vif [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1673531770',display_name='tempest-MultipleCreateTestJSON-server-1673531770-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1673531770-1',id=52,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-8qkebqs0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:24:55Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=1ecb1653-d3a8-40e1-9547-bc224d6db6b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.741 2 DEBUG nova.network.os_vif_util [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.741 2 DEBUG nova.network.os_vif_util [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:98:f1,bridge_name='br-int',has_traffic_filtering=True,id=63d53aab-c63f-434b-9733-622094f23583,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63d53aab-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.742 2 DEBUG os_vif [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:98:f1,bridge_name='br-int',has_traffic_filtering=True,id=63d53aab-c63f-434b-9733-622094f23583,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63d53aab-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.747 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63d53aab-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63d53aab-c6, col_values=(('external_ids', {'iface-id': '63d53aab-c63f-434b-9733-622094f23583', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:98:f1', 'vm-uuid': '1ecb1653-d3a8-40e1-9547-bc224d6db6b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
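The `DbSetCommand` above writes the `external_ids` that ovn-controller watches on the Interface table to bind the Neutron logical port to this tap device. A sketch of the mapping being set (keys as logged; the helper name itself is hypothetical):

```python
def ovs_port_external_ids(port_id: str, mac: str, instance_uuid: str) -> dict:
    """external_ids written on the OVS Interface record, as in the logged
    DbSetCommand; ovn-controller matches 'iface-id' against the logical
    switch port to complete the binding."""
    return {
        "iface-id": port_id,
        "iface-status": "active",
        "attached-mac": mac,
        "vm-uuid": instance_uuid,
    }
```

Once ovn-controller observes this binding and programs flows, Neutron emits the `network-vif-plugged` event that nova registered for at 21:24:59.738 above.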
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:59 compute-1 NetworkManager[51724]: <info>  [1759267499.7521] manager: (tap63d53aab-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.762 2 INFO os_vif [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:98:f1,bridge_name='br-int',has_traffic_filtering=True,id=63d53aab-c63f-434b-9733-622094f23583,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63d53aab-c6')
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.827 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.828 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.828 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No VIF found with MAC fa:16:3e:ac:98:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:24:59 compute-1 nova_compute[192795]: 2025-09-30 21:24:59.828 2 INFO nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Using config drive
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.002 2 INFO nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Creating config drive at /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk.config
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.010 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuz9bp347 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:01 compute-1 anacron[105422]: Job `cron.daily' started
Sep 30 21:25:01 compute-1 anacron[105422]: Job `cron.daily' terminated
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.154 2 DEBUG oslo_concurrency.processutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuz9bp347" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:01 compute-1 kernel: tap63d53aab-c6: entered promiscuous mode
Sep 30 21:25:01 compute-1 NetworkManager[51724]: <info>  [1759267501.2432] manager: (tap63d53aab-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Sep 30 21:25:01 compute-1 ovn_controller[94902]: 2025-09-30T21:25:01Z|00198|binding|INFO|Claiming lport 63d53aab-c63f-434b-9733-622094f23583 for this chassis.
Sep 30 21:25:01 compute-1 ovn_controller[94902]: 2025-09-30T21:25:01Z|00199|binding|INFO|63d53aab-c63f-434b-9733-622094f23583: Claiming fa:16:3e:ac:98:f1 10.100.0.14
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.257 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:98:f1 10.100.0.14'], port_security=['fa:16:3e:ac:98:f1 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1ecb1653-d3a8-40e1-9547-bc224d6db6b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34c754fa0f364622a4433b9ba5718857', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1c375ca4-4abf-4405-95d5-43748e715058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f21a560c-012c-4374-b9ef-0dc124a433df, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=63d53aab-c63f-434b-9733-622094f23583) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.258 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 63d53aab-c63f-434b-9733-622094f23583 in datapath f4180897-fb47-4ee3-b86e-380da38f2ec5 bound to our chassis
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.260 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.272 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8f692815-340f-4ca9-9575-b0ab2e357ca4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.273 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4180897-f1 in ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:25:01 compute-1 ovn_controller[94902]: 2025-09-30T21:25:01Z|00200|binding|INFO|Setting lport 63d53aab-c63f-434b-9733-622094f23583 ovn-installed in OVS
Sep 30 21:25:01 compute-1 ovn_controller[94902]: 2025-09-30T21:25:01Z|00201|binding|INFO|Setting lport 63d53aab-c63f-434b-9733-622094f23583 up in Southbound
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.275 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4180897-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.276 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ff82bbab-a2b9-4440-92cb-ea049753cda1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.278 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2d523b1e-d87d-4b6f-8eb9-c13b4a8478ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:01 compute-1 podman[227909]: 2025-09-30 21:25:01.288870658 +0000 UTC m=+0.115290952 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:25:01 compute-1 podman[227907]: 2025-09-30 21:25:01.293117291 +0000 UTC m=+0.120210694 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 21:25:01 compute-1 systemd-udevd[227985]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.298 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[b8270ae3-bd59-4272-b480-ea7f2b0cea10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 podman[227908]: 2025-09-30 21:25:01.306393964 +0000 UTC m=+0.136454045 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:25:01 compute-1 NetworkManager[51724]: <info>  [1759267501.3172] device (tap63d53aab-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:25:01 compute-1 NetworkManager[51724]: <info>  [1759267501.3188] device (tap63d53aab-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.319 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b2946569-a1cd-40e7-b262-c13342545f2d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 systemd-machined[152783]: New machine qemu-25-instance-00000034.
Sep 30 21:25:01 compute-1 systemd[1]: Started Virtual Machine qemu-25-instance-00000034.
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.353 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b79fbec6-9445-45c5-91d9-5245a59de21a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.358 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[11106043-fbc6-4fc4-9b77-2f073392add8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 NetworkManager[51724]: <info>  [1759267501.3610] manager: (tapf4180897-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/100)
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.416 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f4957e74-fcf7-4cb7-8d50-3225854e1adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.420 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5947fb23-0017-4895-9d02-13de36adf9a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 NetworkManager[51724]: <info>  [1759267501.4528] device (tapf4180897-f0): carrier: link connected
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.461 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1d2f2f-c2f4-459d-ba53-e311b40e9dc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.487 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d28cdaaf-ef2f-4c56-933e-0da42f3bb38e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4180897-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:a9:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419712, 'reachable_time': 25227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228019, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.512 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f941cf6c-0598-4fba-9eaa-8283ebabd958]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:a9c4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 419712, 'tstamp': 419712}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228020, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.538 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[55ba7fff-6fd2-4802-b766-b6bd757b9611]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4180897-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:a9:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419712, 'reachable_time': 25227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228021, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.583 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7a987ba8-3223-4b0e-92f8-16218845cbda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.660 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[34b2fb5a-ba30-44e5-b25b-99d701f85a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.662 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4180897-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.663 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.663 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4180897-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:01 compute-1 NetworkManager[51724]: <info>  [1759267501.6665] manager: (tapf4180897-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:01 compute-1 kernel: tapf4180897-f0: entered promiscuous mode
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.670 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4180897-f0, col_values=(('external_ids', {'iface-id': 'f5229ae3-c5a0-4511-b62d-05f7c88709d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:01 compute-1 ovn_controller[94902]: 2025-09-30T21:25:01Z|00202|binding|INFO|Releasing lport f5229ae3-c5a0-4511-b62d-05f7c88709d9 from this chassis (sb_readonly=0)
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.675 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.675 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7078b1d6-2ca0-4b09-b408-62cf6ee3084b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.676 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:25:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:01.677 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'env', 'PROCESS_TAG=haproxy-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4180897-fb47-4ee3-b86e-380da38f2ec5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.963 2 DEBUG nova.compute.manager [req-3b9eeef3-7fff-4e13-9556-61adc7684211 req-fe6f995f-9e41-4bb0-93f3-038f6fef17b5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received event network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.964 2 DEBUG oslo_concurrency.lockutils [req-3b9eeef3-7fff-4e13-9556-61adc7684211 req-fe6f995f-9e41-4bb0-93f3-038f6fef17b5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.965 2 DEBUG oslo_concurrency.lockutils [req-3b9eeef3-7fff-4e13-9556-61adc7684211 req-fe6f995f-9e41-4bb0-93f3-038f6fef17b5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.966 2 DEBUG oslo_concurrency.lockutils [req-3b9eeef3-7fff-4e13-9556-61adc7684211 req-fe6f995f-9e41-4bb0-93f3-038f6fef17b5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.966 2 DEBUG nova.compute.manager [req-3b9eeef3-7fff-4e13-9556-61adc7684211 req-fe6f995f-9e41-4bb0-93f3-038f6fef17b5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Processing event network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:25:01 compute-1 nova_compute[192795]: 2025-09-30 21:25:01.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:02 compute-1 podman[228060]: 2025-09-30 21:25:02.145602419 +0000 UTC m=+0.079474011 container create c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:25:02 compute-1 systemd[1]: Started libpod-conmon-c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30.scope.
Sep 30 21:25:02 compute-1 podman[228060]: 2025-09-30 21:25:02.113771734 +0000 UTC m=+0.047643376 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:25:02 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:25:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8e886ed42644568df68303f022792b2e726639a2d7a10b84afa2ca466667ffd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:25:02 compute-1 podman[228060]: 2025-09-30 21:25:02.243628613 +0000 UTC m=+0.177500225 container init c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:25:02 compute-1 podman[228060]: 2025-09-30 21:25:02.256018192 +0000 UTC m=+0.189889814 container start c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:25:02 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[228075]: [NOTICE]   (228079) : New worker (228081) forked
Sep 30 21:25:02 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[228075]: [NOTICE]   (228079) : Loading success.
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.351 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267502.3501987, 1ecb1653-d3a8-40e1-9547-bc224d6db6b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.353 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] VM Started (Lifecycle Event)
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.357 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.361 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.365 2 INFO nova.virt.libvirt.driver [-] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Instance spawned successfully.
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.365 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.402 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.408 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.412 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.412 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.413 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.413 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.414 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.414 2 DEBUG nova.virt.libvirt.driver [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.419 2 DEBUG nova.network.neutron [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Updated VIF entry in instance network info cache for port 63d53aab-c63f-434b-9733-622094f23583. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.420 2 DEBUG nova.network.neutron [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Updating instance_info_cache with network_info: [{"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.450 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.451 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267502.3504772, 1ecb1653-d3a8-40e1-9547-bc224d6db6b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.451 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] VM Paused (Lifecycle Event)
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.456 2 DEBUG oslo_concurrency.lockutils [req-24f1ea3f-03a9-44cc-8517-f91d730a3afd req-5c1e8e7e-4f30-4759-be4a-9f8b861ca04a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-1ecb1653-d3a8-40e1-9547-bc224d6db6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.480 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.484 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267502.3602517, 1ecb1653-d3a8-40e1-9547-bc224d6db6b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.484 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] VM Resumed (Lifecycle Event)
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.518 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.521 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.567 2 INFO nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Took 6.54 seconds to spawn the instance on the hypervisor.
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.567 2 DEBUG nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.586 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.705 2 INFO nova.compute.manager [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Took 7.26 seconds to build instance.
Sep 30 21:25:02 compute-1 nova_compute[192795]: 2025-09-30 21:25:02.724 2 DEBUG oslo_concurrency.lockutils [None req-8b5ce350-2866-46f3-9dc4-60036534f93e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:03 compute-1 nova_compute[192795]: 2025-09-30 21:25:03.607 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267488.6061528, dc03304c-f076-4627-9b00-265c7d559784 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:03 compute-1 nova_compute[192795]: 2025-09-30 21:25:03.607 2 INFO nova.compute.manager [-] [instance: dc03304c-f076-4627-9b00-265c7d559784] VM Stopped (Lifecycle Event)
Sep 30 21:25:03 compute-1 nova_compute[192795]: 2025-09-30 21:25:03.627 2 DEBUG nova.compute.manager [None req-56431ad2-40d1-4abe-a026-0e1996d0ca77 - - - - - -] [instance: dc03304c-f076-4627-9b00-265c7d559784] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:04 compute-1 nova_compute[192795]: 2025-09-30 21:25:04.058 2 DEBUG nova.compute.manager [req-23a16ca0-b6ef-463d-b9d6-b28a39e9b7ab req-b969b52b-184b-451d-a4ea-820cde97e016 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received event network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:04 compute-1 nova_compute[192795]: 2025-09-30 21:25:04.059 2 DEBUG oslo_concurrency.lockutils [req-23a16ca0-b6ef-463d-b9d6-b28a39e9b7ab req-b969b52b-184b-451d-a4ea-820cde97e016 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:04 compute-1 nova_compute[192795]: 2025-09-30 21:25:04.059 2 DEBUG oslo_concurrency.lockutils [req-23a16ca0-b6ef-463d-b9d6-b28a39e9b7ab req-b969b52b-184b-451d-a4ea-820cde97e016 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:04 compute-1 nova_compute[192795]: 2025-09-30 21:25:04.059 2 DEBUG oslo_concurrency.lockutils [req-23a16ca0-b6ef-463d-b9d6-b28a39e9b7ab req-b969b52b-184b-451d-a4ea-820cde97e016 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:04 compute-1 nova_compute[192795]: 2025-09-30 21:25:04.060 2 DEBUG nova.compute.manager [req-23a16ca0-b6ef-463d-b9d6-b28a39e9b7ab req-b969b52b-184b-451d-a4ea-820cde97e016 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] No waiting events found dispatching network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:25:04 compute-1 nova_compute[192795]: 2025-09-30 21:25:04.060 2 WARNING nova.compute.manager [req-23a16ca0-b6ef-463d-b9d6-b28a39e9b7ab req-b969b52b-184b-451d-a4ea-820cde97e016 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received unexpected event network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 for instance with vm_state active and task_state None.
Sep 30 21:25:04 compute-1 nova_compute[192795]: 2025-09-30 21:25:04.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:05 compute-1 podman[228090]: 2025-09-30 21:25:05.2583125 +0000 UTC m=+0.087709590 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3)
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.367 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "85a25db6-a162-405f-86f3-8b7a4bf007cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.368 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.399 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.568 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.568 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.579 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.580 2 INFO nova.compute.claims [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.815 2 DEBUG nova.compute.provider_tree [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.845 2 DEBUG nova.scheduler.client.report [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.875 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.876 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.959 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.960 2 DEBUG nova.network.neutron [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:25:06 compute-1 nova_compute[192795]: 2025-09-30 21:25:06.975 2 INFO nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.010 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.165 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.167 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.167 2 INFO nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Creating image(s)
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.168 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "/var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.168 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.169 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.181 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.242 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.243 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.244 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.253 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.305 2 DEBUG nova.policy [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfe43dba9d03417182dd245d360568e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.310 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.311 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.347 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.348 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.348 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.406 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.408 2 DEBUG nova.virt.disk.api [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Checking if we can resize image /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.408 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.471 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.472 2 DEBUG nova.virt.disk.api [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Cannot resize image /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.472 2 DEBUG nova.objects.instance [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'migration_context' on Instance uuid 85a25db6-a162-405f-86f3-8b7a4bf007cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.501 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.502 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Ensure instance console log exists: /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.502 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.503 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.503 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.535 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.536 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.536 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.536 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.536 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.546 2 INFO nova.compute.manager [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Terminating instance
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.558 2 DEBUG nova.compute.manager [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:25:07 compute-1 kernel: tap63d53aab-c6 (unregistering): left promiscuous mode
Sep 30 21:25:07 compute-1 NetworkManager[51724]: <info>  [1759267507.5835] device (tap63d53aab-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 ovn_controller[94902]: 2025-09-30T21:25:07Z|00203|binding|INFO|Releasing lport 63d53aab-c63f-434b-9733-622094f23583 from this chassis (sb_readonly=0)
Sep 30 21:25:07 compute-1 ovn_controller[94902]: 2025-09-30T21:25:07Z|00204|binding|INFO|Setting lport 63d53aab-c63f-434b-9733-622094f23583 down in Southbound
Sep 30 21:25:07 compute-1 ovn_controller[94902]: 2025-09-30T21:25:07Z|00205|binding|INFO|Removing iface tap63d53aab-c6 ovn-installed in OVS
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.636 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:98:f1 10.100.0.14'], port_security=['fa:16:3e:ac:98:f1 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1ecb1653-d3a8-40e1-9547-bc224d6db6b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34c754fa0f364622a4433b9ba5718857', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1c375ca4-4abf-4405-95d5-43748e715058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f21a560c-012c-4374-b9ef-0dc124a433df, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=63d53aab-c63f-434b-9733-622094f23583) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.638 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 63d53aab-c63f-434b-9733-622094f23583 in datapath f4180897-fb47-4ee3-b86e-380da38f2ec5 unbound from our chassis
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.639 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4180897-fb47-4ee3-b86e-380da38f2ec5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.640 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[72dac861-79ca-4623-bc98-14931d98d6db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.641 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 namespace which is not needed anymore
Sep 30 21:25:07 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Deactivated successfully.
Sep 30 21:25:07 compute-1 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Consumed 6.220s CPU time.
Sep 30 21:25:07 compute-1 systemd-machined[152783]: Machine qemu-25-instance-00000034 terminated.
Sep 30 21:25:07 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[228075]: [NOTICE]   (228079) : haproxy version is 2.8.14-c23fe91
Sep 30 21:25:07 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[228075]: [NOTICE]   (228079) : path to executable is /usr/sbin/haproxy
Sep 30 21:25:07 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[228075]: [WARNING]  (228079) : Exiting Master process...
Sep 30 21:25:07 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[228075]: [ALERT]    (228079) : Current worker (228081) exited with code 143 (Terminated)
Sep 30 21:25:07 compute-1 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[228075]: [WARNING]  (228079) : All workers exited. Exiting... (0)
Sep 30 21:25:07 compute-1 systemd[1]: libpod-c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30.scope: Deactivated successfully.
Sep 30 21:25:07 compute-1 podman[228149]: 2025-09-30 21:25:07.780940461 +0000 UTC m=+0.046885097 container died c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:25:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30-userdata-shm.mount: Deactivated successfully.
Sep 30 21:25:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-b8e886ed42644568df68303f022792b2e726639a2d7a10b84afa2ca466667ffd-merged.mount: Deactivated successfully.
Sep 30 21:25:07 compute-1 podman[228149]: 2025-09-30 21:25:07.820929212 +0000 UTC m=+0.086873818 container cleanup c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.830 2 INFO nova.virt.libvirt.driver [-] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Instance destroyed successfully.
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.831 2 DEBUG nova.objects.instance [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'resources' on Instance uuid 1ecb1653-d3a8-40e1-9547-bc224d6db6b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:07 compute-1 systemd[1]: libpod-conmon-c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30.scope: Deactivated successfully.
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.861 2 DEBUG nova.virt.libvirt.vif [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:24:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1673531770',display_name='tempest-MultipleCreateTestJSON-server-1673531770-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1673531770-1',id=52,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-8qkebqs0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:25:02Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=1ecb1653-d3a8-40e1-9547-bc224d6db6b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.862 2 DEBUG nova.network.os_vif_util [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "63d53aab-c63f-434b-9733-622094f23583", "address": "fa:16:3e:ac:98:f1", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63d53aab-c6", "ovs_interfaceid": "63d53aab-c63f-434b-9733-622094f23583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.863 2 DEBUG nova.network.os_vif_util [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:98:f1,bridge_name='br-int',has_traffic_filtering=True,id=63d53aab-c63f-434b-9733-622094f23583,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63d53aab-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.863 2 DEBUG os_vif [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:98:f1,bridge_name='br-int',has_traffic_filtering=True,id=63d53aab-c63f-434b-9733-622094f23583,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63d53aab-c6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63d53aab-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.869 2 INFO os_vif [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:98:f1,bridge_name='br-int',has_traffic_filtering=True,id=63d53aab-c63f-434b-9733-622094f23583,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63d53aab-c6')
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.870 2 INFO nova.virt.libvirt.driver [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Deleting instance files /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9_del
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.871 2 INFO nova.virt.libvirt.driver [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Deletion of /var/lib/nova/instances/1ecb1653-d3a8-40e1-9547-bc224d6db6b9_del complete
Sep 30 21:25:07 compute-1 podman[228193]: 2025-09-30 21:25:07.884242554 +0000 UTC m=+0.041028031 container remove c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.889 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5e08ace2-aabf-446f-b099-192bd4e3f84a]: (4, ('Tue Sep 30 09:25:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 (c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30)\nc593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30\nTue Sep 30 09:25:07 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 (c593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30)\nc593d1f8d77ad293808c591bd97565ec37d665ccb8783f938c9b885abce59c30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.891 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5a234c81-3b8c-4e8a-91eb-7216d7e1d8d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.892 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4180897-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:07 compute-1 kernel: tapf4180897-f0: left promiscuous mode
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.899 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e028940d-dbdf-4585-9ddc-45e32725dcbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:07 compute-1 nova_compute[192795]: 2025-09-30 21:25:07.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.939 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0da92928-6fc1-4d9c-9a0d-8d0302eb11aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.940 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[215b1a2b-1ad4-4be9-8df9-7782d0327b40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.955 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5014e88c-120d-4361-bdbb-ff59ecabe525]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419702, 'reachable_time': 37614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228210, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:07 compute-1 systemd[1]: run-netns-ovnmeta\x2df4180897\x2dfb47\x2d4ee3\x2db86e\x2d380da38f2ec5.mount: Deactivated successfully.
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.959 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:25:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:07.960 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[0127757b-e99c-4556-870d-6e867ab7b37d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.004 2 INFO nova.compute.manager [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Took 0.45 seconds to destroy the instance on the hypervisor.
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.004 2 DEBUG oslo.service.loopingcall [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.004 2 DEBUG nova.compute.manager [-] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.004 2 DEBUG nova.network.neutron [-] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.141 2 DEBUG nova.compute.manager [req-5217b4fa-51c6-42fb-8078-919499d041a4 req-754e7641-221f-44bd-9ae7-60e7debe5f95 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received event network-vif-unplugged-63d53aab-c63f-434b-9733-622094f23583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.142 2 DEBUG oslo_concurrency.lockutils [req-5217b4fa-51c6-42fb-8078-919499d041a4 req-754e7641-221f-44bd-9ae7-60e7debe5f95 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.142 2 DEBUG oslo_concurrency.lockutils [req-5217b4fa-51c6-42fb-8078-919499d041a4 req-754e7641-221f-44bd-9ae7-60e7debe5f95 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.142 2 DEBUG oslo_concurrency.lockutils [req-5217b4fa-51c6-42fb-8078-919499d041a4 req-754e7641-221f-44bd-9ae7-60e7debe5f95 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.142 2 DEBUG nova.compute.manager [req-5217b4fa-51c6-42fb-8078-919499d041a4 req-754e7641-221f-44bd-9ae7-60e7debe5f95 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] No waiting events found dispatching network-vif-unplugged-63d53aab-c63f-434b-9733-622094f23583 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:25:08 compute-1 nova_compute[192795]: 2025-09-30 21:25:08.142 2 DEBUG nova.compute.manager [req-5217b4fa-51c6-42fb-8078-919499d041a4 req-754e7641-221f-44bd-9ae7-60e7debe5f95 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received event network-vif-unplugged-63d53aab-c63f-434b-9733-622094f23583 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:25:09 compute-1 nova_compute[192795]: 2025-09-30 21:25:09.734 2 DEBUG nova.network.neutron [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Successfully created port: be314f32-1beb-45ee-8ee5-f3ba6c9da81d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:25:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:10.137 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:10.138 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.203 2 DEBUG nova.network.neutron [-] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.277 2 INFO nova.compute.manager [-] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Took 2.27 seconds to deallocate network for instance.
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.432 2 DEBUG nova.compute.manager [req-9d187480-fbb8-45a9-b36d-30c0d3b6819e req-b3200d42-b086-47ba-a435-a45acff7fddd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received event network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.432 2 DEBUG oslo_concurrency.lockutils [req-9d187480-fbb8-45a9-b36d-30c0d3b6819e req-b3200d42-b086-47ba-a435-a45acff7fddd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.432 2 DEBUG oslo_concurrency.lockutils [req-9d187480-fbb8-45a9-b36d-30c0d3b6819e req-b3200d42-b086-47ba-a435-a45acff7fddd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.432 2 DEBUG oslo_concurrency.lockutils [req-9d187480-fbb8-45a9-b36d-30c0d3b6819e req-b3200d42-b086-47ba-a435-a45acff7fddd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.432 2 DEBUG nova.compute.manager [req-9d187480-fbb8-45a9-b36d-30c0d3b6819e req-b3200d42-b086-47ba-a435-a45acff7fddd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] No waiting events found dispatching network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.433 2 WARNING nova.compute.manager [req-9d187480-fbb8-45a9-b36d-30c0d3b6819e req-b3200d42-b086-47ba-a435-a45acff7fddd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received unexpected event network-vif-plugged-63d53aab-c63f-434b-9733-622094f23583 for instance with vm_state deleted and task_state None.
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.438 2 DEBUG nova.compute.manager [req-9ad394a7-b466-44c3-aff7-45b8f9223228 req-31ebcb0d-4026-40ce-b70d-efe9cb182261 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Received event network-vif-deleted-63d53aab-c63f-434b-9733-622094f23583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.455 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.455 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.749 2 DEBUG nova.compute.provider_tree [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.806 2 DEBUG nova.scheduler.client.report [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.876 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:10 compute-1 nova_compute[192795]: 2025-09-30 21:25:10.943 2 INFO nova.scheduler.client.report [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Deleted allocations for instance 1ecb1653-d3a8-40e1-9547-bc224d6db6b9
Sep 30 21:25:11 compute-1 nova_compute[192795]: 2025-09-30 21:25:11.042 2 DEBUG oslo_concurrency.lockutils [None req-b9d3045b-8925-45d9-ba75-588c2884af43 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "1ecb1653-d3a8-40e1-9547-bc224d6db6b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:11 compute-1 nova_compute[192795]: 2025-09-30 21:25:11.545 2 DEBUG nova.network.neutron [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Successfully updated port: be314f32-1beb-45ee-8ee5-f3ba6c9da81d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:25:11 compute-1 nova_compute[192795]: 2025-09-30 21:25:11.601 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:11 compute-1 nova_compute[192795]: 2025-09-30 21:25:11.601 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:11 compute-1 nova_compute[192795]: 2025-09-30 21:25:11.601 2 DEBUG nova.network.neutron [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:11 compute-1 nova_compute[192795]: 2025-09-30 21:25:11.885 2 DEBUG nova.network.neutron [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:12 compute-1 nova_compute[192795]: 2025-09-30 21:25:12.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:12 compute-1 nova_compute[192795]: 2025-09-30 21:25:12.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:13.141 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:13 compute-1 podman[228211]: 2025-09-30 21:25:13.20494709 +0000 UTC m=+0.051818657 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.338 2 DEBUG nova.network.neutron [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Updating instance_info_cache with network_info: [{"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.372 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.372 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance network_info: |[{"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.374 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Start _get_guest_xml network_info=[{"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.379 2 WARNING nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.384 2 DEBUG nova.virt.libvirt.host [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.385 2 DEBUG nova.virt.libvirt.host [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.397 2 DEBUG nova.virt.libvirt.host [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.398 2 DEBUG nova.virt.libvirt.host [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.399 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.399 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.399 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.399 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.400 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.400 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.400 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.400 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.400 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.401 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.401 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.401 2 DEBUG nova.virt.hardware [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.404 2 DEBUG nova.virt.libvirt.vif [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-847442675',display_name='tempest-DeleteServersTestJSON-server-847442675',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-847442675',id=54,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-cy7a80we',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:25:07Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=85a25db6-a162-405f-86f3-8b7a4bf007cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.405 2 DEBUG nova.network.os_vif_util [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.405 2 DEBUG nova.network.os_vif_util [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:e1:05,bridge_name='br-int',has_traffic_filtering=True,id=be314f32-1beb-45ee-8ee5-f3ba6c9da81d,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe314f32-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.406 2 DEBUG nova.objects.instance [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 85a25db6-a162-405f-86f3-8b7a4bf007cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.445 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <uuid>85a25db6-a162-405f-86f3-8b7a4bf007cb</uuid>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <name>instance-00000036</name>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <nova:name>tempest-DeleteServersTestJSON-server-847442675</nova:name>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:25:13</nova:creationTime>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:user uuid="bfe43dba9d03417182dd245d360568e6">tempest-DeleteServersTestJSON-314554874-project-member</nova:user>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:project uuid="c4bb94b19ac546f195f1f1f35411cce9">tempest-DeleteServersTestJSON-314554874</nova:project>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         <nova:port uuid="be314f32-1beb-45ee-8ee5-f3ba6c9da81d">
Sep 30 21:25:13 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <system>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <entry name="serial">85a25db6-a162-405f-86f3-8b7a4bf007cb</entry>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <entry name="uuid">85a25db6-a162-405f-86f3-8b7a4bf007cb</entry>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </system>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <os>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   </os>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <features>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   </features>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk.config"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:56:e1:05"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <target dev="tapbe314f32-1b"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/console.log" append="off"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <video>
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </video>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:25:13 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:25:13 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:25:13 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:25:13 compute-1 nova_compute[192795]: </domain>
Sep 30 21:25:13 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.446 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Preparing to wait for external event network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.446 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.446 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.447 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.447 2 DEBUG nova.virt.libvirt.vif [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-847442675',display_name='tempest-DeleteServersTestJSON-server-847442675',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-847442675',id=54,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-cy7a80we',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:25:07Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=85a25db6-a162-405f-86f3-8b7a4bf007cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.448 2 DEBUG nova.network.os_vif_util [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.448 2 DEBUG nova.network.os_vif_util [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:e1:05,bridge_name='br-int',has_traffic_filtering=True,id=be314f32-1beb-45ee-8ee5-f3ba6c9da81d,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe314f32-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.449 2 DEBUG os_vif [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:e1:05,bridge_name='br-int',has_traffic_filtering=True,id=be314f32-1beb-45ee-8ee5-f3ba6c9da81d,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe314f32-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.450 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.450 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe314f32-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe314f32-1b, col_values=(('external_ids', {'iface-id': 'be314f32-1beb-45ee-8ee5-f3ba6c9da81d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:e1:05', 'vm-uuid': '85a25db6-a162-405f-86f3-8b7a4bf007cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:13 compute-1 NetworkManager[51724]: <info>  [1759267513.4566] manager: (tapbe314f32-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.462 2 INFO os_vif [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:e1:05,bridge_name='br-int',has_traffic_filtering=True,id=be314f32-1beb-45ee-8ee5-f3ba6c9da81d,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe314f32-1b')
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.480 2 DEBUG nova.compute.manager [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received event network-changed-be314f32-1beb-45ee-8ee5-f3ba6c9da81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.481 2 DEBUG nova.compute.manager [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Refreshing instance network info cache due to event network-changed-be314f32-1beb-45ee-8ee5-f3ba6c9da81d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.481 2 DEBUG oslo_concurrency.lockutils [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.481 2 DEBUG oslo_concurrency.lockutils [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.481 2 DEBUG nova.network.neutron [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Refreshing network info cache for port be314f32-1beb-45ee-8ee5-f3ba6c9da81d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.571 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.571 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.572 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No VIF found with MAC fa:16:3e:56:e1:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:25:13 compute-1 nova_compute[192795]: 2025-09-30 21:25:13.573 2 INFO nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Using config drive
Sep 30 21:25:14 compute-1 podman[228234]: 2025-09-30 21:25:14.218220568 +0000 UTC m=+0.057264672 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.242 2 INFO nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Creating config drive at /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk.config
Sep 30 21:25:14 compute-1 podman[228233]: 2025-09-30 21:25:14.247334631 +0000 UTC m=+0.081866745 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal)
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.252 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpac1e161a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.387 2 DEBUG oslo_concurrency.processutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpac1e161a" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:14 compute-1 kernel: tapbe314f32-1b: entered promiscuous mode
Sep 30 21:25:14 compute-1 NetworkManager[51724]: <info>  [1759267514.4639] manager: (tapbe314f32-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Sep 30 21:25:14 compute-1 ovn_controller[94902]: 2025-09-30T21:25:14Z|00206|binding|INFO|Claiming lport be314f32-1beb-45ee-8ee5-f3ba6c9da81d for this chassis.
Sep 30 21:25:14 compute-1 ovn_controller[94902]: 2025-09-30T21:25:14Z|00207|binding|INFO|be314f32-1beb-45ee-8ee5-f3ba6c9da81d: Claiming fa:16:3e:56:e1:05 10.100.0.9
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.480 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:e1:05 10.100.0.9'], port_security=['fa:16:3e:56:e1:05 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '85a25db6-a162-405f-86f3-8b7a4bf007cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=be314f32-1beb-45ee-8ee5-f3ba6c9da81d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.481 103861 INFO neutron.agent.ovn.metadata.agent [-] Port be314f32-1beb-45ee-8ee5-f3ba6c9da81d in datapath 5569112a-9fb3-4151-add0-95b595cbe309 bound to our chassis
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.482 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.494 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c42b7ccb-a5e1-44f4-9944-d75474df70be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.495 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5569112a-91 in ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:25:14 compute-1 systemd-udevd[228297]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.496 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5569112a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.496 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7eacbc82-047d-43f8-8107-1fe429b9e5c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.497 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e8cde30d-9616-4137-ad09-5123faa83e27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 NetworkManager[51724]: <info>  [1759267514.5060] device (tapbe314f32-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:25:14 compute-1 NetworkManager[51724]: <info>  [1759267514.5068] device (tapbe314f32-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:25:14 compute-1 systemd-machined[152783]: New machine qemu-26-instance-00000036.
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.511 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d95cbd-9e63-4293-a361-b7456cf166ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 ovn_controller[94902]: 2025-09-30T21:25:14Z|00208|binding|INFO|Setting lport be314f32-1beb-45ee-8ee5-f3ba6c9da81d ovn-installed in OVS
Sep 30 21:25:14 compute-1 ovn_controller[94902]: 2025-09-30T21:25:14Z|00209|binding|INFO|Setting lport be314f32-1beb-45ee-8ee5-f3ba6c9da81d up in Southbound
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 systemd[1]: Started Virtual Machine qemu-26-instance-00000036.
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.542 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[35bec25f-52bd-43f9-b1e8-adcf96816361]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.571 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[00a9bb0d-b32b-42f3-8215-9572b545f3fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.575 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3de10e3b-843c-4581-b92f-0eaed271c330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 NetworkManager[51724]: <info>  [1759267514.5773] manager: (tap5569112a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/104)
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.610 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[198de51f-9898-4d67-ae73-e1ad3ea19c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.613 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6b18bf72-56bb-4a2c-8457-823db1be7f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 NetworkManager[51724]: <info>  [1759267514.6341] device (tap5569112a-90): carrier: link connected
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.639 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[02950763-aa07-47cd-a42a-554dffadd4fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.658 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9d017385-3e97-49f4-a3c1-15246417968c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421030, 'reachable_time': 43758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228330, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.674 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[63632f3e-4dae-43f2-aeff-f830faaacad2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 421030, 'tstamp': 421030}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228331, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.691 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0712599b-bbbd-4783-88dd-6a2676cf1f64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421030, 'reachable_time': 43758, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228332, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.718 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d27a1609-85ad-4261-ab89-bdb09cfba4ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.794 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf23021-e643-443f-87ef-271538dcf67b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.796 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.796 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.796 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5569112a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 NetworkManager[51724]: <info>  [1759267514.8377] manager: (tap5569112a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Sep 30 21:25:14 compute-1 kernel: tap5569112a-90: entered promiscuous mode
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.841 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5569112a-90, col_values=(('external_ids', {'iface-id': 'af49dc17-c7c9-4524-8791-14107f2ff34d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 ovn_controller[94902]: 2025-09-30T21:25:14Z|00210|binding|INFO|Releasing lport af49dc17-c7c9-4524-8791-14107f2ff34d from this chassis (sb_readonly=0)
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.843 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.844 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3d0e6e-0d5e-4b56-9b5f-26b96b631d96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.845 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:25:14 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:14.845 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'env', 'PROCESS_TAG=haproxy-5569112a-9fb3-4151-add0-95b595cbe309', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5569112a-9fb3-4151-add0-95b595cbe309.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-1 ovn_controller[94902]: 2025-09-30T21:25:14Z|00211|binding|INFO|Releasing lport af49dc17-c7c9-4524-8791-14107f2ff34d from this chassis (sb_readonly=0)
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.900 2 DEBUG nova.compute.manager [req-a19849c8-e6aa-4203-8f7b-1f9530d45f7f req-42692ca4-690d-4719-bacf-6a19f1ca5210 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received event network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.900 2 DEBUG oslo_concurrency.lockutils [req-a19849c8-e6aa-4203-8f7b-1f9530d45f7f req-42692ca4-690d-4719-bacf-6a19f1ca5210 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.901 2 DEBUG oslo_concurrency.lockutils [req-a19849c8-e6aa-4203-8f7b-1f9530d45f7f req-42692ca4-690d-4719-bacf-6a19f1ca5210 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.901 2 DEBUG oslo_concurrency.lockutils [req-a19849c8-e6aa-4203-8f7b-1f9530d45f7f req-42692ca4-690d-4719-bacf-6a19f1ca5210 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.901 2 DEBUG nova.compute.manager [req-a19849c8-e6aa-4203-8f7b-1f9530d45f7f req-42692ca4-690d-4719-bacf-6a19f1ca5210 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Processing event network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:25:14 compute-1 nova_compute[192795]: 2025-09-30 21:25:14.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:15 compute-1 podman[228365]: 2025-09-30 21:25:15.242427797 +0000 UTC m=+0.063440177 container create e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:25:15 compute-1 systemd[1]: Started libpod-conmon-e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d.scope.
Sep 30 21:25:15 compute-1 podman[228365]: 2025-09-30 21:25:15.203702848 +0000 UTC m=+0.024715288 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:25:15 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:25:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc7a2828899681c17350986db44167a0b1c11f43ae18bd91ba579c5d69943d80/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:25:15 compute-1 podman[228365]: 2025-09-30 21:25:15.351341519 +0000 UTC m=+0.172353919 container init e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:25:15 compute-1 podman[228365]: 2025-09-30 21:25:15.35665905 +0000 UTC m=+0.177671430 container start e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:25:15 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228380]: [NOTICE]   (228384) : New worker (228386) forked
Sep 30 21:25:15 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228380]: [NOTICE]   (228384) : Loading success.
Sep 30 21:25:15 compute-1 nova_compute[192795]: 2025-09-30 21:25:15.386 2 DEBUG nova.network.neutron [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Updated VIF entry in instance network info cache for port be314f32-1beb-45ee-8ee5-f3ba6c9da81d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:25:15 compute-1 nova_compute[192795]: 2025-09-30 21:25:15.386 2 DEBUG nova.network.neutron [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Updating instance_info_cache with network_info: [{"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:15 compute-1 nova_compute[192795]: 2025-09-30 21:25:15.422 2 DEBUG oslo_concurrency.lockutils [req-04bac8e8-4884-463e-bd46-b3959d3695ec req-ca8f5bc8-0a15-4f53-a83d-f7b8ee2846ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.223 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.225 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267516.2225108, 85a25db6-a162-405f-86f3-8b7a4bf007cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.226 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] VM Started (Lifecycle Event)
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.229 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.234 2 INFO nova.virt.libvirt.driver [-] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance spawned successfully.
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.235 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.255 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.262 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.266 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.266 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.267 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.268 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.268 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.269 2 DEBUG nova.virt.libvirt.driver [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.298 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.298 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267516.2228787, 85a25db6-a162-405f-86f3-8b7a4bf007cb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.298 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] VM Paused (Lifecycle Event)
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.320 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.324 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267516.2288878, 85a25db6-a162-405f-86f3-8b7a4bf007cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.324 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] VM Resumed (Lifecycle Event)
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.345 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.347 2 INFO nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Took 9.18 seconds to spawn the instance on the hypervisor.
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.347 2 DEBUG nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.350 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.377 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.430 2 INFO nova.compute.manager [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Took 9.92 seconds to build instance.
Sep 30 21:25:16 compute-1 nova_compute[192795]: 2025-09-30 21:25:16.450 2 DEBUG oslo_concurrency.lockutils [None req-0c1bfa51-7a99-4238-963b-8082cc854160 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:17 compute-1 nova_compute[192795]: 2025-09-30 21:25:17.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:17 compute-1 nova_compute[192795]: 2025-09-30 21:25:17.014 2 DEBUG nova.compute.manager [req-18a42b5a-ec7a-4f5c-b03b-db6393787794 req-bd604742-df68-4aa7-9cc5-12ab70e8721a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received event network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:17 compute-1 nova_compute[192795]: 2025-09-30 21:25:17.015 2 DEBUG oslo_concurrency.lockutils [req-18a42b5a-ec7a-4f5c-b03b-db6393787794 req-bd604742-df68-4aa7-9cc5-12ab70e8721a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:17 compute-1 nova_compute[192795]: 2025-09-30 21:25:17.016 2 DEBUG oslo_concurrency.lockutils [req-18a42b5a-ec7a-4f5c-b03b-db6393787794 req-bd604742-df68-4aa7-9cc5-12ab70e8721a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:17 compute-1 nova_compute[192795]: 2025-09-30 21:25:17.016 2 DEBUG oslo_concurrency.lockutils [req-18a42b5a-ec7a-4f5c-b03b-db6393787794 req-bd604742-df68-4aa7-9cc5-12ab70e8721a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:17 compute-1 nova_compute[192795]: 2025-09-30 21:25:17.017 2 DEBUG nova.compute.manager [req-18a42b5a-ec7a-4f5c-b03b-db6393787794 req-bd604742-df68-4aa7-9cc5-12ab70e8721a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] No waiting events found dispatching network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:25:17 compute-1 nova_compute[192795]: 2025-09-30 21:25:17.017 2 WARNING nova.compute.manager [req-18a42b5a-ec7a-4f5c-b03b-db6393787794 req-bd604742-df68-4aa7-9cc5-12ab70e8721a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received unexpected event network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d for instance with vm_state active and task_state None.
Sep 30 21:25:18 compute-1 nova_compute[192795]: 2025-09-30 21:25:18.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:20 compute-1 nova_compute[192795]: 2025-09-30 21:25:20.302 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "85a25db6-a162-405f-86f3-8b7a4bf007cb" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:20 compute-1 nova_compute[192795]: 2025-09-30 21:25:20.304 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:20 compute-1 nova_compute[192795]: 2025-09-30 21:25:20.304 2 INFO nova.compute.manager [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Shelving
Sep 30 21:25:20 compute-1 nova_compute[192795]: 2025-09-30 21:25:20.359 2 DEBUG nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:25:22 compute-1 nova_compute[192795]: 2025-09-30 21:25:22.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:22 compute-1 nova_compute[192795]: 2025-09-30 21:25:22.828 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267507.8271475, 1ecb1653-d3a8-40e1-9547-bc224d6db6b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:22 compute-1 nova_compute[192795]: 2025-09-30 21:25:22.829 2 INFO nova.compute.manager [-] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] VM Stopped (Lifecycle Event)
Sep 30 21:25:22 compute-1 nova_compute[192795]: 2025-09-30 21:25:22.850 2 DEBUG nova.compute.manager [None req-aa2d5512-8883-4095-bf3c-bceff4858077 - - - - - -] [instance: 1ecb1653-d3a8-40e1-9547-bc224d6db6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:23 compute-1 nova_compute[192795]: 2025-09-30 21:25:23.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:24 compute-1 podman[228402]: 2025-09-30 21:25:24.220108286 +0000 UTC m=+0.061379881 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true)
Sep 30 21:25:27 compute-1 nova_compute[192795]: 2025-09-30 21:25:27.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:28 compute-1 nova_compute[192795]: 2025-09-30 21:25:28.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:29 compute-1 ovn_controller[94902]: 2025-09-30T21:25:29Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:56:e1:05 10.100.0.9
Sep 30 21:25:29 compute-1 ovn_controller[94902]: 2025-09-30T21:25:29Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:56:e1:05 10.100.0.9
Sep 30 21:25:30 compute-1 nova_compute[192795]: 2025-09-30 21:25:30.406 2 DEBUG nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:32 compute-1 podman[228443]: 2025-09-30 21:25:32.263443572 +0000 UTC m=+0.077913660 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:25:32 compute-1 podman[228437]: 2025-09-30 21:25:32.263836553 +0000 UTC m=+0.098994770 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Sep 30 21:25:32 compute-1 podman[228438]: 2025-09-30 21:25:32.295775831 +0000 UTC m=+0.125624247 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:25:32 compute-1 kernel: tapbe314f32-1b (unregistering): left promiscuous mode
Sep 30 21:25:32 compute-1 NetworkManager[51724]: <info>  [1759267532.5718] device (tapbe314f32-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:32 compute-1 ovn_controller[94902]: 2025-09-30T21:25:32Z|00212|binding|INFO|Releasing lport be314f32-1beb-45ee-8ee5-f3ba6c9da81d from this chassis (sb_readonly=0)
Sep 30 21:25:32 compute-1 ovn_controller[94902]: 2025-09-30T21:25:32Z|00213|binding|INFO|Setting lport be314f32-1beb-45ee-8ee5-f3ba6c9da81d down in Southbound
Sep 30 21:25:32 compute-1 ovn_controller[94902]: 2025-09-30T21:25:32Z|00214|binding|INFO|Removing iface tapbe314f32-1b ovn-installed in OVS
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.593 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:e1:05 10.100.0.9'], port_security=['fa:16:3e:56:e1:05 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '85a25db6-a162-405f-86f3-8b7a4bf007cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=be314f32-1beb-45ee-8ee5-f3ba6c9da81d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.595 103861 INFO neutron.agent.ovn.metadata.agent [-] Port be314f32-1beb-45ee-8ee5-f3ba6c9da81d in datapath 5569112a-9fb3-4151-add0-95b595cbe309 unbound from our chassis
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.597 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5569112a-9fb3-4151-add0-95b595cbe309, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.598 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[19fc5a3f-804d-4e81-bbf7-d1a34b90268a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.600 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace which is not needed anymore
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:32 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000036.scope: Deactivated successfully.
Sep 30 21:25:32 compute-1 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000036.scope: Consumed 14.102s CPU time.
Sep 30 21:25:32 compute-1 systemd-machined[152783]: Machine qemu-26-instance-00000036 terminated.
Sep 30 21:25:32 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228380]: [NOTICE]   (228384) : haproxy version is 2.8.14-c23fe91
Sep 30 21:25:32 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228380]: [NOTICE]   (228384) : path to executable is /usr/sbin/haproxy
Sep 30 21:25:32 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228380]: [WARNING]  (228384) : Exiting Master process...
Sep 30 21:25:32 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228380]: [ALERT]    (228384) : Current worker (228386) exited with code 143 (Terminated)
Sep 30 21:25:32 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228380]: [WARNING]  (228384) : All workers exited. Exiting... (0)
Sep 30 21:25:32 compute-1 systemd[1]: libpod-e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d.scope: Deactivated successfully.
Sep 30 21:25:32 compute-1 podman[228527]: 2025-09-30 21:25:32.773225371 +0000 UTC m=+0.053949925 container died e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:25:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d-userdata-shm.mount: Deactivated successfully.
Sep 30 21:25:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-bc7a2828899681c17350986db44167a0b1c11f43ae18bd91ba579c5d69943d80-merged.mount: Deactivated successfully.
Sep 30 21:25:32 compute-1 podman[228527]: 2025-09-30 21:25:32.816203111 +0000 UTC m=+0.096927635 container cleanup e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:25:32 compute-1 systemd[1]: libpod-conmon-e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d.scope: Deactivated successfully.
Sep 30 21:25:32 compute-1 podman[228564]: 2025-09-30 21:25:32.882404909 +0000 UTC m=+0.041808481 container remove e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.888 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7507185b-a2ec-4ab3-91ea-bd792d446478]: (4, ('Tue Sep 30 09:25:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d)\ne01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d\nTue Sep 30 09:25:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (e01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d)\ne01f83fb052be3acafba002d61ecd98752218e894d15638da493cf5201e14a1d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.891 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[99cc38a5-41ce-440c-9eaf-e88acd06c3cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.892 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:32 compute-1 kernel: tap5569112a-90: left promiscuous mode
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.912 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[efb273c2-17db-40ec-97bc-d9467827baba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.939 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5d75ca29-3e64-4eaa-b5a8-f862f3f6497e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.940 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a18460-f1cd-4566-b1ec-95d0e1a54158]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.945 2 DEBUG nova.compute.manager [req-2dc80bae-9ccc-468c-9a83-5a425f30816c req-dc8cea15-7b97-4875-935c-38cf6f61e07d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received event network-vif-unplugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.946 2 DEBUG oslo_concurrency.lockutils [req-2dc80bae-9ccc-468c-9a83-5a425f30816c req-dc8cea15-7b97-4875-935c-38cf6f61e07d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.946 2 DEBUG oslo_concurrency.lockutils [req-2dc80bae-9ccc-468c-9a83-5a425f30816c req-dc8cea15-7b97-4875-935c-38cf6f61e07d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.946 2 DEBUG oslo_concurrency.lockutils [req-2dc80bae-9ccc-468c-9a83-5a425f30816c req-dc8cea15-7b97-4875-935c-38cf6f61e07d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.947 2 DEBUG nova.compute.manager [req-2dc80bae-9ccc-468c-9a83-5a425f30816c req-dc8cea15-7b97-4875-935c-38cf6f61e07d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] No waiting events found dispatching network-vif-unplugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:25:32 compute-1 nova_compute[192795]: 2025-09-30 21:25:32.947 2 WARNING nova.compute.manager [req-2dc80bae-9ccc-468c-9a83-5a425f30816c req-dc8cea15-7b97-4875-935c-38cf6f61e07d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received unexpected event network-vif-unplugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d for instance with vm_state active and task_state shelving.
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.954 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf13271-e99a-423a-8524-e5cb642c028a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 421023, 'reachable_time': 17513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228589, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.957 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:25:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:32.957 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[49ab30b8-af2d-4fe8-b2dc-e757df73d5cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:32 compute-1 systemd[1]: run-netns-ovnmeta\x2d5569112a\x2d9fb3\x2d4151\x2dadd0\x2d95b595cbe309.mount: Deactivated successfully.
Sep 30 21:25:33 compute-1 nova_compute[192795]: 2025-09-30 21:25:33.422 2 INFO nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance shutdown successfully after 13 seconds.
Sep 30 21:25:33 compute-1 nova_compute[192795]: 2025-09-30 21:25:33.430 2 INFO nova.virt.libvirt.driver [-] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance destroyed successfully.
Sep 30 21:25:33 compute-1 nova_compute[192795]: 2025-09-30 21:25:33.431 2 DEBUG nova.objects.instance [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 85a25db6-a162-405f-86f3-8b7a4bf007cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:33 compute-1 nova_compute[192795]: 2025-09-30 21:25:33.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:33 compute-1 nova_compute[192795]: 2025-09-30 21:25:33.797 2 INFO nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Beginning cold snapshot process
Sep 30 21:25:34 compute-1 nova_compute[192795]: 2025-09-30 21:25:34.005 2 DEBUG nova.privsep.utils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:25:34 compute-1 nova_compute[192795]: 2025-09-30 21:25:34.006 2 DEBUG oslo_concurrency.processutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk /var/lib/nova/instances/snapshots/tmppa5b97be/de879833526144b3a0ece431bb76c2d8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:34 compute-1 nova_compute[192795]: 2025-09-30 21:25:34.464 2 DEBUG oslo_concurrency.processutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk /var/lib/nova/instances/snapshots/tmppa5b97be/de879833526144b3a0ece431bb76c2d8" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:34 compute-1 nova_compute[192795]: 2025-09-30 21:25:34.466 2 INFO nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Snapshot extracted, beginning image upload
Sep 30 21:25:35 compute-1 nova_compute[192795]: 2025-09-30 21:25:35.396 2 DEBUG nova.compute.manager [req-417910f7-3ee1-4c8b-a548-dfa03cc1110b req-3086e58f-ac0c-4a47-850e-3cbaa2c6efb3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received event network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:35 compute-1 nova_compute[192795]: 2025-09-30 21:25:35.396 2 DEBUG oslo_concurrency.lockutils [req-417910f7-3ee1-4c8b-a548-dfa03cc1110b req-3086e58f-ac0c-4a47-850e-3cbaa2c6efb3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:35 compute-1 nova_compute[192795]: 2025-09-30 21:25:35.397 2 DEBUG oslo_concurrency.lockutils [req-417910f7-3ee1-4c8b-a548-dfa03cc1110b req-3086e58f-ac0c-4a47-850e-3cbaa2c6efb3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:35 compute-1 nova_compute[192795]: 2025-09-30 21:25:35.397 2 DEBUG oslo_concurrency.lockutils [req-417910f7-3ee1-4c8b-a548-dfa03cc1110b req-3086e58f-ac0c-4a47-850e-3cbaa2c6efb3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:35 compute-1 nova_compute[192795]: 2025-09-30 21:25:35.397 2 DEBUG nova.compute.manager [req-417910f7-3ee1-4c8b-a548-dfa03cc1110b req-3086e58f-ac0c-4a47-850e-3cbaa2c6efb3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] No waiting events found dispatching network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:25:35 compute-1 nova_compute[192795]: 2025-09-30 21:25:35.397 2 WARNING nova.compute.manager [req-417910f7-3ee1-4c8b-a548-dfa03cc1110b req-3086e58f-ac0c-4a47-850e-3cbaa2c6efb3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received unexpected event network-vif-plugged-be314f32-1beb-45ee-8ee5-f3ba6c9da81d for instance with vm_state active and task_state shelving_image_uploading.
Sep 30 21:25:36 compute-1 podman[228599]: 2025-09-30 21:25:36.237608589 +0000 UTC m=+0.065931712 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:25:37 compute-1 nova_compute[192795]: 2025-09-30 21:25:37.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:37 compute-1 nova_compute[192795]: 2025-09-30 21:25:37.989 2 INFO nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Snapshot image upload complete
Sep 30 21:25:37 compute-1 nova_compute[192795]: 2025-09-30 21:25:37.990 2 DEBUG nova.compute.manager [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.100 2 INFO nova.compute.manager [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Shelve offloading
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.114 2 INFO nova.virt.libvirt.driver [-] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance destroyed successfully.
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.114 2 DEBUG nova.compute.manager [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.116 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.116 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.116 2 DEBUG nova.network.neutron [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:38.686 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:38.686 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:38.686 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:38 compute-1 nova_compute[192795]: 2025-09-30 21:25:38.707 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.529 2 DEBUG nova.network.neutron [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Updating instance_info_cache with network_info: [{"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.599 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.725 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.725 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.800 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.883 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.885 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:41 compute-1 nova_compute[192795]: 2025-09-30 21:25:41.949 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.149 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.151 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5718MB free_disk=73.43323135375977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.151 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.151 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.236 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 85a25db6-a162-405f-86f3-8b7a4bf007cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.236 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.236 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.278 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.295 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.322 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.322 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.795 2 INFO nova.virt.libvirt.driver [-] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Instance destroyed successfully.
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.795 2 DEBUG nova.objects.instance [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'resources' on Instance uuid 85a25db6-a162-405f-86f3-8b7a4bf007cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.818 2 DEBUG nova.virt.libvirt.vif [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-847442675',display_name='tempest-DeleteServersTestJSON-server-847442675',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-847442675',id=54,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-cy7a80we',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member',shelved_at='2025-09-30T21:25:37.990137',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='9c427478-2ff8-4e1d-a15f-aa597ff0fc57'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:25:34Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=85a25db6-a162-405f-86f3-8b7a4bf007cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.818 2 DEBUG nova.network.os_vif_util [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe314f32-1b", "ovs_interfaceid": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.819 2 DEBUG nova.network.os_vif_util [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:e1:05,bridge_name='br-int',has_traffic_filtering=True,id=be314f32-1beb-45ee-8ee5-f3ba6c9da81d,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe314f32-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.820 2 DEBUG os_vif [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:e1:05,bridge_name='br-int',has_traffic_filtering=True,id=be314f32-1beb-45ee-8ee5-f3ba6c9da81d,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe314f32-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.822 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe314f32-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.831 2 INFO os_vif [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:e1:05,bridge_name='br-int',has_traffic_filtering=True,id=be314f32-1beb-45ee-8ee5-f3ba6c9da81d,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe314f32-1b')
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.831 2 INFO nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Deleting instance files /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb_del
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.840 2 INFO nova.virt.libvirt.driver [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Deletion of /var/lib/nova/instances/85a25db6-a162-405f-86f3-8b7a4bf007cb_del complete
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.912 2 DEBUG nova.compute.manager [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Received event network-changed-be314f32-1beb-45ee-8ee5-f3ba6c9da81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.913 2 DEBUG nova.compute.manager [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Refreshing instance network info cache due to event network-changed-be314f32-1beb-45ee-8ee5-f3ba6c9da81d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.913 2 DEBUG oslo_concurrency.lockutils [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.913 2 DEBUG oslo_concurrency.lockutils [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.914 2 DEBUG nova.network.neutron [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Refreshing network info cache for port be314f32-1beb-45ee-8ee5-f3ba6c9da81d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:25:42 compute-1 nova_compute[192795]: 2025-09-30 21:25:42.973 2 INFO nova.scheduler.client.report [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Deleted allocations for instance 85a25db6-a162-405f-86f3-8b7a4bf007cb
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.027 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.028 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.059 2 DEBUG nova.compute.provider_tree [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.079 2 DEBUG nova.scheduler.client.report [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.110 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.203 2 DEBUG oslo_concurrency.lockutils [None req-c789e670-9f21-41b7-a0ea-726a01a8910d bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "85a25db6-a162-405f-86f3-8b7a4bf007cb" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 22.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.322 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:43 compute-1 nova_compute[192795]: 2025-09-30 21:25:43.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:44 compute-1 podman[228627]: 2025-09-30 21:25:44.22913176 +0000 UTC m=+0.064280057 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 21:25:44 compute-1 podman[228646]: 2025-09-30 21:25:44.307474601 +0000 UTC m=+0.055616868 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:25:44 compute-1 podman[228671]: 2025-09-30 21:25:44.396374152 +0000 UTC m=+0.058971197 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.buildah.version=1.33.7)
Sep 30 21:25:44 compute-1 nova_compute[192795]: 2025-09-30 21:25:44.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:44 compute-1 nova_compute[192795]: 2025-09-30 21:25:44.692 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.429 2 DEBUG nova.network.neutron [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Updated VIF entry in instance network info cache for port be314f32-1beb-45ee-8ee5-f3ba6c9da81d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.430 2 DEBUG nova.network.neutron [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Updating instance_info_cache with network_info: [{"id": "be314f32-1beb-45ee-8ee5-f3ba6c9da81d", "address": "fa:16:3e:56:e1:05", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": null, "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapbe314f32-1b", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.455 2 DEBUG oslo_concurrency.lockutils [req-d050cc9b-edc4-41e4-9cbb-7774a3f00a69 req-d1572c4f-3afa-4e6f-9e35-edba28986176 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-85a25db6-a162-405f-86f3-8b7a4bf007cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.870 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.871 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.893 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.992 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:45 compute-1 nova_compute[192795]: 2025-09-30 21:25:45.993 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.002 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.003 2 INFO nova.compute.claims [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.181 2 DEBUG nova.compute.provider_tree [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.196 2 DEBUG nova.scheduler.client.report [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.231 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.232 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.315 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.316 2 DEBUG nova.network.neutron [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.334 2 INFO nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.365 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.506 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.507 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.508 2 INFO nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Creating image(s)
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.509 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.509 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.510 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.529 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.602 2 DEBUG nova.policy [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.606 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.606 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.607 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.617 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.675 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.676 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.697 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.714 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.715 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.715 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.774 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.775 2 DEBUG nova.virt.disk.api [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.776 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.840 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.842 2 DEBUG nova.virt.disk.api [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.843 2 DEBUG nova.objects.instance [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.856 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.857 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Ensure instance console log exists: /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.858 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.858 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:46 compute-1 nova_compute[192795]: 2025-09-30 21:25:46.858 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:47 compute-1 nova_compute[192795]: 2025-09-30 21:25:47.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:47 compute-1 nova_compute[192795]: 2025-09-30 21:25:47.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:47 compute-1 nova_compute[192795]: 2025-09-30 21:25:47.858 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267532.8574703, 85a25db6-a162-405f-86f3-8b7a4bf007cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:47 compute-1 nova_compute[192795]: 2025-09-30 21:25:47.858 2 INFO nova.compute.manager [-] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] VM Stopped (Lifecycle Event)
Sep 30 21:25:47 compute-1 nova_compute[192795]: 2025-09-30 21:25:47.885 2 DEBUG nova.network.neutron [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Successfully created port: 242fb53f-7c71-48ef-a180-00bad1488d61 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:25:47 compute-1 nova_compute[192795]: 2025-09-30 21:25:47.892 2 DEBUG nova.compute.manager [None req-9d449388-84fe-46b5-a232-0881b587a8e6 - - - - - -] [instance: 85a25db6-a162-405f-86f3-8b7a4bf007cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:48 compute-1 nova_compute[192795]: 2025-09-30 21:25:48.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:48 compute-1 nova_compute[192795]: 2025-09-30 21:25:48.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:25:48 compute-1 nova_compute[192795]: 2025-09-30 21:25:48.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:25:48 compute-1 nova_compute[192795]: 2025-09-30 21:25:48.710 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:25:48 compute-1 nova_compute[192795]: 2025-09-30 21:25:48.710 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:25:49 compute-1 nova_compute[192795]: 2025-09-30 21:25:49.614 2 DEBUG nova.network.neutron [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Successfully updated port: 242fb53f-7c71-48ef-a180-00bad1488d61 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:25:49 compute-1 nova_compute[192795]: 2025-09-30 21:25:49.646 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:49 compute-1 nova_compute[192795]: 2025-09-30 21:25:49.646 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:49 compute-1 nova_compute[192795]: 2025-09-30 21:25:49.647 2 DEBUG nova.network.neutron [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:49 compute-1 nova_compute[192795]: 2025-09-30 21:25:49.940 2 DEBUG nova.compute.manager [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-changed-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:49 compute-1 nova_compute[192795]: 2025-09-30 21:25:49.941 2 DEBUG nova.compute.manager [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Refreshing instance network info cache due to event network-changed-242fb53f-7c71-48ef-a180-00bad1488d61. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:25:49 compute-1 nova_compute[192795]: 2025-09-30 21:25:49.941 2 DEBUG oslo_concurrency.lockutils [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:50 compute-1 nova_compute[192795]: 2025-09-30 21:25:50.474 2 DEBUG nova.network.neutron [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.809 2 DEBUG nova.network.neutron [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.836 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.837 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance network_info: |[{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.837 2 DEBUG oslo_concurrency.lockutils [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.837 2 DEBUG nova.network.neutron [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Refreshing network info cache for port 242fb53f-7c71-48ef-a180-00bad1488d61 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.840 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start _get_guest_xml network_info=[{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.845 2 WARNING nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.851 2 DEBUG nova.virt.libvirt.host [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.852 2 DEBUG nova.virt.libvirt.host [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.866 2 DEBUG nova.virt.libvirt.host [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.868 2 DEBUG nova.virt.libvirt.host [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.869 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.869 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.869 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.870 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.870 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.870 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.870 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.870 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.871 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.871 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.871 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.871 2 DEBUG nova.virt.hardware [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.875 2 DEBUG nova.virt.libvirt.vif [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:25:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.875 2 DEBUG nova.network.os_vif_util [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.876 2 DEBUG nova.network.os_vif_util [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.876 2 DEBUG nova.objects.instance [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.889 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <uuid>128bd4be-4a76-4dbb-aef6-65acd9c11cbd</uuid>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <name>instance-00000038</name>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestJSON-server-394601736</nova:name>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:25:51</nova:creationTime>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         <nova:port uuid="242fb53f-7c71-48ef-a180-00bad1488d61">
Sep 30 21:25:51 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <system>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <entry name="serial">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <entry name="uuid">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </system>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <os>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   </os>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <features>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   </features>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:65:e3:f2"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <target dev="tap242fb53f-7c"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/console.log" append="off"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <video>
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </video>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:25:51 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:25:51 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:25:51 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:25:51 compute-1 nova_compute[192795]: </domain>
Sep 30 21:25:51 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.890 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Preparing to wait for external event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.890 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.890 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.890 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.891 2 DEBUG nova.virt.libvirt.vif [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:25:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.891 2 DEBUG nova.network.os_vif_util [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.892 2 DEBUG nova.network.os_vif_util [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.892 2 DEBUG os_vif [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.893 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.893 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.896 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap242fb53f-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.897 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap242fb53f-7c, col_values=(('external_ids', {'iface-id': '242fb53f-7c71-48ef-a180-00bad1488d61', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:e3:f2', 'vm-uuid': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:51 compute-1 NetworkManager[51724]: <info>  [1759267551.8992] manager: (tap242fb53f-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.905 2 INFO os_vif [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.959 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.959 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.959 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No VIF found with MAC fa:16:3e:65:e3:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:25:51 compute-1 nova_compute[192795]: 2025-09-30 21:25:51.960 2 INFO nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Using config drive
Sep 30 21:25:52 compute-1 nova_compute[192795]: 2025-09-30 21:25:52.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.381 2 INFO nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Creating config drive at /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.386 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmwag8712 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.515 2 DEBUG oslo_concurrency.processutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmwag8712" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:53 compute-1 kernel: tap242fb53f-7c: entered promiscuous mode
Sep 30 21:25:53 compute-1 NetworkManager[51724]: <info>  [1759267553.5770] manager: (tap242fb53f-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 ovn_controller[94902]: 2025-09-30T21:25:53Z|00215|binding|INFO|Claiming lport 242fb53f-7c71-48ef-a180-00bad1488d61 for this chassis.
Sep 30 21:25:53 compute-1 ovn_controller[94902]: 2025-09-30T21:25:53Z|00216|binding|INFO|242fb53f-7c71-48ef-a180-00bad1488d61: Claiming fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.593 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.594 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.596 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.609 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e51f6521-fcd0-4529-a59b-747db32f88b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.610 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.612 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.612 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7205b0fe-ffcc-4db5-96ad-f4bb65c021ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.614 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9c440b-773d-40fd-8b9f-718771bbabf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 systemd-udevd[228728]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:25:53 compute-1 NetworkManager[51724]: <info>  [1759267553.6261] device (tap242fb53f-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:25:53 compute-1 NetworkManager[51724]: <info>  [1759267553.6277] device (tap242fb53f-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.626 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[09373900-170f-41c8-a1ee-47b1385f9794]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 ovn_controller[94902]: 2025-09-30T21:25:53Z|00217|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 ovn-installed in OVS
Sep 30 21:25:53 compute-1 ovn_controller[94902]: 2025-09-30T21:25:53Z|00218|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 up in Southbound
Sep 30 21:25:53 compute-1 systemd-machined[152783]: New machine qemu-27-instance-00000038.
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.653 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[94421c4d-da4a-4834-ad35-562748eb308a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 systemd[1]: Started Virtual Machine qemu-27-instance-00000038.
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.681 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d4e3a1-ba07-4d78-89e0-a632b1ac1737]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 NetworkManager[51724]: <info>  [1759267553.6895] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/108)
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.689 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c5635955-838d-47ba-840b-e11557d51b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.705 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.726 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[170a8357-de8a-4433-974e-eb712331f503]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.730 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f62f39fc-b1a2-4aa8-ada3-5ea280e74be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 NetworkManager[51724]: <info>  [1759267553.7578] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.764 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[527fee75-6168-475c-8e56-9b0ef4109550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.783 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[525d3936-ad35-475d-a033-e5b7d702834d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424943, 'reachable_time': 20482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228761, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.802 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0e48c75a-5c10-42d4-b6bf-d0fd2305f10f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424943, 'tstamp': 424943}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228762, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.820 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[988a55e9-abda-48ea-9a9e-3585e122cdae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424943, 'reachable_time': 20482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228763, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.858 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e891db88-2862-40fe-b321-ee2cc7eb3ece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.914 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcbc3d0-37c3-41c9-8771-9ad04d5352cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.916 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.916 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.917 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 NetworkManager[51724]: <info>  [1759267553.9195] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Sep 30 21:25:53 compute-1 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.922 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 ovn_controller[94902]: 2025-09-30T21:25:53Z|00219|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.925 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.935 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ce97e039-536a-4b63-a94a-4546879a15a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.936 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:25:53 compute-1 nova_compute[192795]: 2025-09-30 21:25:53.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:25:53.937 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.133 2 DEBUG nova.network.neutron [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updated VIF entry in instance network info cache for port 242fb53f-7c71-48ef-a180-00bad1488d61. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.134 2 DEBUG nova.network.neutron [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.179 2 DEBUG oslo_concurrency.lockutils [req-b7b324ae-e7aa-49c3-8a17-fd79bd5d0d96 req-e527bbcd-6ce0-4439-8327-55f3ac2f963b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:54 compute-1 podman[228801]: 2025-09-30 21:25:54.273761525 +0000 UTC m=+0.046606619 container create 80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.293 2 DEBUG nova.compute.manager [req-3874a8c3-0fb1-4702-afa5-80b506c0de23 req-4cf43780-7cce-46fd-80b8-9aa6266114ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.294 2 DEBUG oslo_concurrency.lockutils [req-3874a8c3-0fb1-4702-afa5-80b506c0de23 req-4cf43780-7cce-46fd-80b8-9aa6266114ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.295 2 DEBUG oslo_concurrency.lockutils [req-3874a8c3-0fb1-4702-afa5-80b506c0de23 req-4cf43780-7cce-46fd-80b8-9aa6266114ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.295 2 DEBUG oslo_concurrency.lockutils [req-3874a8c3-0fb1-4702-afa5-80b506c0de23 req-4cf43780-7cce-46fd-80b8-9aa6266114ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.296 2 DEBUG nova.compute.manager [req-3874a8c3-0fb1-4702-afa5-80b506c0de23 req-4cf43780-7cce-46fd-80b8-9aa6266114ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Processing event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:25:54 compute-1 systemd[1]: Started libpod-conmon-80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1.scope.
Sep 30 21:25:54 compute-1 podman[228801]: 2025-09-30 21:25:54.24757599 +0000 UTC m=+0.020421084 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:25:54 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:25:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df833756e53b9b23c97071434ccb3573d35170380cd8d1941ae00c4f3e00b3d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:25:54 compute-1 podman[228801]: 2025-09-30 21:25:54.364285349 +0000 UTC m=+0.137130463 container init 80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:25:54 compute-1 podman[228801]: 2025-09-30 21:25:54.372456426 +0000 UTC m=+0.145301520 container start 80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:25:54 compute-1 podman[228815]: 2025-09-30 21:25:54.376685438 +0000 UTC m=+0.057993230 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:25:54 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [NOTICE]   (228840) : New worker (228842) forked
Sep 30 21:25:54 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [NOTICE]   (228840) : Loading success.
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.688 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.689 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267554.6890607, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.690 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Started (Lifecycle Event)
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.693 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.696 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance spawned successfully.
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.696 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.731 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.735 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.736 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.736 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.736 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.737 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.737 2 DEBUG nova.virt.libvirt.driver [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.741 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.800 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.801 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267554.6891928, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.801 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Paused (Lifecycle Event)
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.831 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.835 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267554.6926394, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.835 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Resumed (Lifecycle Event)
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.865 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.869 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:54 compute-1 nova_compute[192795]: 2025-09-30 21:25:54.921 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.060 2 INFO nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Took 8.55 seconds to spawn the instance on the hypervisor.
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.061 2 DEBUG nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.224 2 INFO nova.compute.manager [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Took 9.26 seconds to build instance.
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.248 2 DEBUG oslo_concurrency.lockutils [None req-f10c80a4-ea21-447e-83bc-d69b0d2aa4d3 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.378s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.311 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.311 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.312 2 INFO nova.compute.manager [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Unshelving
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.508 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.509 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.520 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.739 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.818 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:25:55 compute-1 nova_compute[192795]: 2025-09-30 21:25:55.819 2 INFO nova.compute.claims [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.042 2 DEBUG nova.compute.provider_tree [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.057 2 DEBUG nova.scheduler.client.report [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.131 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.469 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.472 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquired lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.473 2 DEBUG nova.network.neutron [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.759 2 DEBUG nova.network.neutron [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:56 compute-1 nova_compute[192795]: 2025-09-30 21:25:56.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.037 2 DEBUG nova.compute.manager [req-70c7453e-1b89-4cdd-ae27-3ffa5630fab5 req-69bd1909-8141-491e-be51-287b3ff32743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.039 2 DEBUG oslo_concurrency.lockutils [req-70c7453e-1b89-4cdd-ae27-3ffa5630fab5 req-69bd1909-8141-491e-be51-287b3ff32743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.039 2 DEBUG oslo_concurrency.lockutils [req-70c7453e-1b89-4cdd-ae27-3ffa5630fab5 req-69bd1909-8141-491e-be51-287b3ff32743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.040 2 DEBUG oslo_concurrency.lockutils [req-70c7453e-1b89-4cdd-ae27-3ffa5630fab5 req-69bd1909-8141-491e-be51-287b3ff32743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.041 2 DEBUG nova.compute.manager [req-70c7453e-1b89-4cdd-ae27-3ffa5630fab5 req-69bd1909-8141-491e-be51-287b3ff32743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.041 2 WARNING nova.compute.manager [req-70c7453e-1b89-4cdd-ae27-3ffa5630fab5 req-69bd1909-8141-491e-be51-287b3ff32743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.398 2 DEBUG nova.network.neutron [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.448 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Releasing lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.451 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.452 2 INFO nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating image(s)
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.453 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.454 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.455 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.456 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.471 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "2129e8818893e7ae30fcedd85140012349e40d60" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:57 compute-1 nova_compute[192795]: 2025-09-30 21:25:57.473 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2129e8818893e7ae30fcedd85140012349e40d60" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:59 compute-1 nova_compute[192795]: 2025-09-30 21:25:59.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:59 compute-1 NetworkManager[51724]: <info>  [1759267559.7479] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Sep 30 21:25:59 compute-1 NetworkManager[51724]: <info>  [1759267559.7492] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Sep 30 21:25:59 compute-1 ovn_controller[94902]: 2025-09-30T21:25:59Z|00220|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:25:59 compute-1 ovn_controller[94902]: 2025-09-30T21:25:59Z|00221|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:25:59 compute-1 nova_compute[192795]: 2025-09-30 21:25:59.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:00 compute-1 nova_compute[192795]: 2025-09-30 21:26:00.698 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:00 compute-1 nova_compute[192795]: 2025-09-30 21:26:00.767 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:00 compute-1 nova_compute[192795]: 2025-09-30 21:26:00.769 2 DEBUG nova.virt.images [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] 12a46b22-714c-4e47-9bb0-a53a7e554c30 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:26:00 compute-1 nova_compute[192795]: 2025-09-30 21:26:00.770 2 DEBUG nova.privsep.utils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:26:00 compute-1 nova_compute[192795]: 2025-09-30 21:26:00.771 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.part /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.046 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.part /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.converted" returned: 0 in 0.275s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.064 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.133 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60.converted --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.135 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2129e8818893e7ae30fcedd85140012349e40d60" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.152 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.252 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.254 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "2129e8818893e7ae30fcedd85140012349e40d60" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.254 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2129e8818893e7ae30fcedd85140012349e40d60" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.267 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.348 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.349 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60,backing_fmt=raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.388 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60,backing_fmt=raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.389 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2129e8818893e7ae30fcedd85140012349e40d60" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.390 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.454 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.455 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'migration_context' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.478 2 INFO nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Rebasing disk image.
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.479 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.556 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.559 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a -F raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:01 compute-1 nova_compute[192795]: 2025-09-30 21:26:01.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:02 compute-1 nova_compute[192795]: 2025-09-30 21:26:02.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:02 compute-1 nova_compute[192795]: 2025-09-30 21:26:02.833 2 DEBUG nova.compute.manager [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-changed-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:02 compute-1 nova_compute[192795]: 2025-09-30 21:26:02.834 2 DEBUG nova.compute.manager [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Refreshing instance network info cache due to event network-changed-242fb53f-7c71-48ef-a180-00bad1488d61. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:26:02 compute-1 nova_compute[192795]: 2025-09-30 21:26:02.834 2 DEBUG oslo_concurrency.lockutils [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:02 compute-1 nova_compute[192795]: 2025-09-30 21:26:02.834 2 DEBUG oslo_concurrency.lockutils [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:02 compute-1 nova_compute[192795]: 2025-09-30 21:26:02.835 2 DEBUG nova.network.neutron [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Refreshing network info cache for port 242fb53f-7c71-48ef-a180-00bad1488d61 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.019 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a -F raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk" returned: 0 in 1.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.020 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.021 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Ensure instance console log exists: /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.021 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.022 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.022 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.024 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='a674519136c2974266d7e074c055c088',container_format='bare',created_at=2025-09-30T21:25:31Z,direct_url=<?>,disk_format='qcow2',id=12a46b22-714c-4e47-9bb0-a53a7e554c30,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1659572675-shelved',owner='1124d87211d04502bce5e44c25ed5d3c',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-09-30T21:25:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.034 2 WARNING nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.039 2 DEBUG nova.virt.libvirt.host [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.040 2 DEBUG nova.virt.libvirt.host [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.043 2 DEBUG nova.virt.libvirt.host [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.044 2 DEBUG nova.virt.libvirt.host [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.048 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.049 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='a674519136c2974266d7e074c055c088',container_format='bare',created_at=2025-09-30T21:25:31Z,direct_url=<?>,disk_format='qcow2',id=12a46b22-714c-4e47-9bb0-a53a7e554c30,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1659572675-shelved',owner='1124d87211d04502bce5e44c25ed5d3c',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-09-30T21:25:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.050 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.050 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.051 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.051 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.051 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.052 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.052 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.052 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.052 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.054 2 DEBUG nova.virt.hardware [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.055 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.122 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.150 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <uuid>2aa71932-03fb-4f75-b359-e1ff961fd8f6</uuid>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <name>instance-00000033</name>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1659572675</nova:name>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:26:03</nova:creationTime>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:26:03 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:26:03 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:26:03 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:26:03 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:26:03 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:26:03 compute-1 nova_compute[192795]:         <nova:user uuid="bd6d636573fd4d899b5e6f4a1f7e1790">tempest-UnshelveToHostMultiNodesTest-98679670-project-member</nova:user>
Sep 30 21:26:03 compute-1 nova_compute[192795]:         <nova:project uuid="1124d87211d04502bce5e44c25ed5d3c">tempest-UnshelveToHostMultiNodesTest-98679670</nova:project>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="12a46b22-714c-4e47-9bb0-a53a7e554c30"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <system>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <entry name="serial">2aa71932-03fb-4f75-b359-e1ff961fd8f6</entry>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <entry name="uuid">2aa71932-03fb-4f75-b359-e1ff961fd8f6</entry>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </system>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <os>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   </os>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <features>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   </features>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/console.log" append="off"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <video>
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </video>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:26:03 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:26:03 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:26:03 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:26:03 compute-1 nova_compute[192795]: </domain>
Sep 30 21:26:03 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.232 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.233 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.234 2 INFO nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Using config drive
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.258 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:03 compute-1 podman[228885]: 2025-09-30 21:26:03.262450885 +0000 UTC m=+0.094794557 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:26:03 compute-1 podman[228887]: 2025-09-30 21:26:03.29314391 +0000 UTC m=+0.108546043 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:26:03 compute-1 podman[228886]: 2025-09-30 21:26:03.319167742 +0000 UTC m=+0.148837993 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.332 2 DEBUG nova.objects.instance [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'keypairs' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.551 2 INFO nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating config drive at /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.557 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa3i4vez_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:03 compute-1 nova_compute[192795]: 2025-09-30 21:26:03.691 2 DEBUG oslo_concurrency.processutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa3i4vez_" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:03 compute-1 systemd-machined[152783]: New machine qemu-28-instance-00000033.
Sep 30 21:26:03 compute-1 systemd[1]: Started Virtual Machine qemu-28-instance-00000033.
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.640 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267564.6405027, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.641 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Resumed (Lifecycle Event)
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.643 2 DEBUG nova.compute.manager [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.644 2 DEBUG nova.virt.libvirt.driver [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.646 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance spawned successfully.
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.663 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.668 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.704 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.705 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267564.6413453, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.705 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Started (Lifecycle Event)
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.732 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.736 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:04 compute-1 nova_compute[192795]: 2025-09-30 21:26:04.764 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:26:05 compute-1 nova_compute[192795]: 2025-09-30 21:26:05.072 2 DEBUG nova.network.neutron [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updated VIF entry in instance network info cache for port 242fb53f-7c71-48ef-a180-00bad1488d61. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:26:05 compute-1 nova_compute[192795]: 2025-09-30 21:26:05.074 2 DEBUG nova.network.neutron [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:05 compute-1 nova_compute[192795]: 2025-09-30 21:26:05.101 2 DEBUG oslo_concurrency.lockutils [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:05 compute-1 nova_compute[192795]: 2025-09-30 21:26:05.470 2 DEBUG nova.compute.manager [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:05 compute-1 nova_compute[192795]: 2025-09-30 21:26:05.643 2 DEBUG oslo_concurrency.lockutils [None req-762a9f83-111e-4829-b83f-8b53a2335173 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 10.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.835 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.836 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.836 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.837 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.837 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.856 2 INFO nova.compute.manager [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Terminating instance
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.867 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.867 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquired lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.868 2 DEBUG nova.network.neutron [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:26:06 compute-1 nova_compute[192795]: 2025-09-30 21:26:06.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:07 compute-1 nova_compute[192795]: 2025-09-30 21:26:07.045 2 DEBUG nova.network.neutron [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:26:07 compute-1 nova_compute[192795]: 2025-09-30 21:26:07.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:07 compute-1 podman[228992]: 2025-09-30 21:26:07.269741382 +0000 UTC m=+0.102758220 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:26:07 compute-1 nova_compute[192795]: 2025-09-30 21:26:07.468 2 DEBUG nova.network.neutron [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:07 compute-1 nova_compute[192795]: 2025-09-30 21:26:07.491 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Releasing lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:07 compute-1 nova_compute[192795]: 2025-09-30 21:26:07.493 2 DEBUG nova.compute.manager [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:26:07 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000033.scope: Deactivated successfully.
Sep 30 21:26:07 compute-1 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000033.scope: Consumed 3.705s CPU time.
Sep 30 21:26:07 compute-1 systemd-machined[152783]: Machine qemu-28-instance-00000033 terminated.
Sep 30 21:26:07 compute-1 ovn_controller[94902]: 2025-09-30T21:26:07Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:26:07 compute-1 ovn_controller[94902]: 2025-09-30T21:26:07Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:26:07 compute-1 nova_compute[192795]: 2025-09-30 21:26:07.756 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance destroyed successfully.
Sep 30 21:26:07 compute-1 nova_compute[192795]: 2025-09-30 21:26:07.757 2 DEBUG nova.objects.instance [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lazy-loading 'resources' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.000 2 INFO nova.virt.libvirt.driver [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Deleting instance files /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6_del
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.012 2 INFO nova.virt.libvirt.driver [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Deletion of /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6_del complete
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.619 2 INFO nova.compute.manager [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Took 2.12 seconds to destroy the instance on the hypervisor.
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.619 2 DEBUG oslo.service.loopingcall [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.620 2 DEBUG nova.compute.manager [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.620 2 DEBUG nova.network.neutron [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.814 2 DEBUG nova.network.neutron [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.837 2 DEBUG nova.network.neutron [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.868 2 INFO nova.compute.manager [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Took 0.25 seconds to deallocate network for instance.
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.953 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:09 compute-1 nova_compute[192795]: 2025-09-30 21:26:09.954 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:10 compute-1 nova_compute[192795]: 2025-09-30 21:26:10.077 2 DEBUG nova.compute.provider_tree [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:10 compute-1 nova_compute[192795]: 2025-09-30 21:26:10.234 2 DEBUG nova.scheduler.client.report [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:10 compute-1 nova_compute[192795]: 2025-09-30 21:26:10.494 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:10 compute-1 nova_compute[192795]: 2025-09-30 21:26:10.563 2 INFO nova.scheduler.client.report [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Deleted allocations for instance 2aa71932-03fb-4f75-b359-e1ff961fd8f6
Sep 30 21:26:10 compute-1 nova_compute[192795]: 2025-09-30 21:26:10.668 2 DEBUG oslo_concurrency.lockutils [None req-1d380f4b-619a-4c43-8b1a-74c547cd7ebc bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:11 compute-1 nova_compute[192795]: 2025-09-30 21:26:11.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:12 compute-1 nova_compute[192795]: 2025-09-30 21:26:12.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:15 compute-1 podman[229024]: 2025-09-30 21:26:15.223245853 +0000 UTC m=+0.057711733 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:26:15 compute-1 podman[229023]: 2025-09-30 21:26:15.22350427 +0000 UTC m=+0.057592720 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:26:15 compute-1 podman[229022]: 2025-09-30 21:26:15.244719193 +0000 UTC m=+0.083162429 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 21:26:16 compute-1 nova_compute[192795]: 2025-09-30 21:26:16.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:17 compute-1 nova_compute[192795]: 2025-09-30 21:26:17.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:19 compute-1 nova_compute[192795]: 2025-09-30 21:26:19.420 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:19 compute-1 nova_compute[192795]: 2025-09-30 21:26:19.421 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:19 compute-1 nova_compute[192795]: 2025-09-30 21:26:19.422 2 INFO nova.compute.manager [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Rebooting instance
Sep 30 21:26:19 compute-1 nova_compute[192795]: 2025-09-30 21:26:19.445 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:19 compute-1 nova_compute[192795]: 2025-09-30 21:26:19.446 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:19 compute-1 nova_compute[192795]: 2025-09-30 21:26:19.446 2 DEBUG nova.network.neutron [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:26:21 compute-1 nova_compute[192795]: 2025-09-30 21:26:21.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.388 2 DEBUG nova.network.neutron [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.408 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.421 2 DEBUG nova.compute.manager [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:22 compute-1 kernel: tap242fb53f-7c (unregistering): left promiscuous mode
Sep 30 21:26:22 compute-1 NetworkManager[51724]: <info>  [1759267582.5914] device (tap242fb53f-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 ovn_controller[94902]: 2025-09-30T21:26:22Z|00222|binding|INFO|Releasing lport 242fb53f-7c71-48ef-a180-00bad1488d61 from this chassis (sb_readonly=0)
Sep 30 21:26:22 compute-1 ovn_controller[94902]: 2025-09-30T21:26:22Z|00223|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 down in Southbound
Sep 30 21:26:22 compute-1 ovn_controller[94902]: 2025-09-30T21:26:22Z|00224|binding|INFO|Removing iface tap242fb53f-7c ovn-installed in OVS
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.612 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.615 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.617 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.619 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fc209878-4b70-4f1c-951b-2511811ec343]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.620 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:26:22 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000038.scope: Deactivated successfully.
Sep 30 21:26:22 compute-1 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000038.scope: Consumed 13.466s CPU time.
Sep 30 21:26:22 compute-1 systemd-machined[152783]: Machine qemu-27-instance-00000038 terminated.
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.753 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267567.7514288, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.753 2 INFO nova.compute.manager [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Stopped (Lifecycle Event)
Sep 30 21:26:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [NOTICE]   (228840) : haproxy version is 2.8.14-c23fe91
Sep 30 21:26:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [NOTICE]   (228840) : path to executable is /usr/sbin/haproxy
Sep 30 21:26:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [WARNING]  (228840) : Exiting Master process...
Sep 30 21:26:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [WARNING]  (228840) : Exiting Master process...
Sep 30 21:26:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [ALERT]    (228840) : Current worker (228842) exited with code 143 (Terminated)
Sep 30 21:26:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[228818]: [WARNING]  (228840) : All workers exited. Exiting... (0)
Sep 30 21:26:22 compute-1 systemd[1]: libpod-80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1.scope: Deactivated successfully.
Sep 30 21:26:22 compute-1 podman[229106]: 2025-09-30 21:26:22.779918749 +0000 UTC m=+0.052970908 container died 80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:26:22 compute-1 kernel: tap242fb53f-7c: entered promiscuous mode
Sep 30 21:26:22 compute-1 kernel: tap242fb53f-7c (unregistering): left promiscuous mode
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.790 2 DEBUG nova.compute.manager [None req-9e7bb450-eabf-4547-9e4b-d085a1c9ad96 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:22 compute-1 NetworkManager[51724]: <info>  [1759267582.7940] manager: (tap242fb53f-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1-userdata-shm.mount: Deactivated successfully.
Sep 30 21:26:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-df833756e53b9b23c97071434ccb3573d35170380cd8d1941ae00c4f3e00b3d4-merged.mount: Deactivated successfully.
Sep 30 21:26:22 compute-1 podman[229106]: 2025-09-30 21:26:22.836177833 +0000 UTC m=+0.109229992 container cleanup 80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:26:22 compute-1 systemd[1]: libpod-conmon-80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1.scope: Deactivated successfully.
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.851 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance destroyed successfully.
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.852 2 DEBUG nova.objects.instance [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.869 2 DEBUG nova.virt.libvirt.vif [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:26:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.871 2 DEBUG nova.network.os_vif_util [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.872 2 DEBUG nova.network.os_vif_util [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.874 2 DEBUG os_vif [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.883 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap242fb53f-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.891 2 INFO os_vif [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.900 2 DEBUG nova.virt.libvirt.driver [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start _get_guest_xml network_info=[{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.905 2 WARNING nova.virt.libvirt.driver [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.911 2 DEBUG nova.virt.libvirt.host [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.912 2 DEBUG nova.virt.libvirt.host [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:26:22 compute-1 podman[229151]: 2025-09-30 21:26:22.915731755 +0000 UTC m=+0.051269932 container remove 80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.916 2 DEBUG nova.virt.libvirt.host [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.917 2 DEBUG nova.virt.libvirt.host [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.919 2 DEBUG nova.virt.libvirt.driver [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.920 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.920 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.921 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.921 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.922 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.922 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.923 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.923 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.924 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.924 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.925 2 DEBUG nova.virt.hardware [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.925 2 DEBUG nova.objects.instance [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.930 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbf5c05-58ac-4aa7-934d-eeaaac2576b7]: (4, ('Tue Sep 30 09:26:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1)\n80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1\nTue Sep 30 09:26:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1)\n80c94e4187e1a01a88870bb6939e0767cd3f729dc7daf61276aa79720f176de1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.932 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[71ca2645-6925-4eb2-99cb-b90acbdbe30b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.933 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.940 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[be2549e7-04d4-456c-adff-fa582bcf9942]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.943 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:22 compute-1 nova_compute[192795]: 2025-09-30 21:26:22.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.982 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ee058bb0-5a98-499f-8650-c4a36f488b1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:22.984 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ac77b2-3983-4e5b-b2e9-7ba8f3af2c29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.005 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a0bb8f5d-35a8-497b-94d5-3f90b3bfefae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424934, 'reachable_time': 23088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229166, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.009 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.010 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[42ca07ad-8841-417c-9991-350c6d93a9e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.054 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config --force-share --output=json" returned: 0 in 0.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.055 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.055 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.056 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.057 2 DEBUG nova.virt.libvirt.vif [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:26:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.057 2 DEBUG nova.network.os_vif_util [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.058 2 DEBUG nova.network.os_vif_util [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.059 2 DEBUG nova.objects.instance [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.072 2 DEBUG nova.virt.libvirt.driver [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <uuid>128bd4be-4a76-4dbb-aef6-65acd9c11cbd</uuid>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <name>instance-00000038</name>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestJSON-server-394601736</nova:name>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:26:22</nova:creationTime>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         <nova:port uuid="242fb53f-7c71-48ef-a180-00bad1488d61">
Sep 30 21:26:23 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <system>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <entry name="serial">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <entry name="uuid">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </system>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <os>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   </os>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <features>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   </features>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:65:e3:f2"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <target dev="tap242fb53f-7c"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/console.log" append="off"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <video>
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </video>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:26:23 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:26:23 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:26:23 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:26:23 compute-1 nova_compute[192795]: </domain>
Sep 30 21:26:23 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.073 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.134 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.135 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.198 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.201 2 DEBUG nova.objects.instance [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.222 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.289 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.291 2 DEBUG nova.virt.disk.api [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.291 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.358 2 DEBUG oslo_concurrency.processutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.359 2 DEBUG nova.virt.disk.api [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.360 2 DEBUG nova.objects.instance [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.380 2 DEBUG nova.virt.libvirt.vif [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.380 2 DEBUG nova.network.os_vif_util [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.382 2 DEBUG nova.network.os_vif_util [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.382 2 DEBUG os_vif [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.384 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.385 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.390 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap242fb53f-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap242fb53f-7c, col_values=(('external_ids', {'iface-id': '242fb53f-7c71-48ef-a180-00bad1488d61', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:e3:f2', 'vm-uuid': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:23 compute-1 NetworkManager[51724]: <info>  [1759267583.4215] manager: (tap242fb53f-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.429 2 INFO os_vif [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:26:23 compute-1 kernel: tap242fb53f-7c: entered promiscuous mode
Sep 30 21:26:23 compute-1 systemd-udevd[229089]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:26:23 compute-1 NetworkManager[51724]: <info>  [1759267583.5585] manager: (tap242fb53f-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Sep 30 21:26:23 compute-1 ovn_controller[94902]: 2025-09-30T21:26:23Z|00225|binding|INFO|Claiming lport 242fb53f-7c71-48ef-a180-00bad1488d61 for this chassis.
Sep 30 21:26:23 compute-1 ovn_controller[94902]: 2025-09-30T21:26:23Z|00226|binding|INFO|242fb53f-7c71-48ef-a180-00bad1488d61: Claiming fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.574 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.576 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:26:23 compute-1 NetworkManager[51724]: <info>  [1759267583.5804] device (tap242fb53f-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.579 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:26:23 compute-1 NetworkManager[51724]: <info>  [1759267583.5821] device (tap242fb53f-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:26:23 compute-1 ovn_controller[94902]: 2025-09-30T21:26:23Z|00227|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 ovn-installed in OVS
Sep 30 21:26:23 compute-1 ovn_controller[94902]: 2025-09-30T21:26:23Z|00228|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 up in Southbound
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.601 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0585b8-61af-48db-b791-167f91183dec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.602 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.605 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.605 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7687ad87-7a1c-40a5-98be-52b0d240792c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.606 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bad2597f-ba1d-4752-b9b3-5c36be7a5ae3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 systemd-machined[152783]: New machine qemu-29-instance-00000038.
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.627 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[07e22050-6e4b-4037-9b96-95baed745983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 systemd[1]: Started Virtual Machine qemu-29-instance-00000038.
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.649 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[de4c5381-3bcb-45e0-87b2-52598647991f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.683 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f4282474-52d8-469c-b462-b9d8462fea00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.693 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8416e648-6908-43da-ab6c-da1b11e156aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 NetworkManager[51724]: <info>  [1759267583.6957] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.732 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6341ec-8ac2-4da9-9a6a-4769d3c80dd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.735 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[fef6951c-10f4-4fe9-83ce-6037f3e98276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.765 2 DEBUG nova.compute.manager [req-f6fce170-cc9f-431a-8fff-a88052c9b8ac req-bed28229-882d-4ebc-96ef-9220d1d98bd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.765 2 DEBUG oslo_concurrency.lockutils [req-f6fce170-cc9f-431a-8fff-a88052c9b8ac req-bed28229-882d-4ebc-96ef-9220d1d98bd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.766 2 DEBUG oslo_concurrency.lockutils [req-f6fce170-cc9f-431a-8fff-a88052c9b8ac req-bed28229-882d-4ebc-96ef-9220d1d98bd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.766 2 DEBUG oslo_concurrency.lockutils [req-f6fce170-cc9f-431a-8fff-a88052c9b8ac req-bed28229-882d-4ebc-96ef-9220d1d98bd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.766 2 DEBUG nova.compute.manager [req-f6fce170-cc9f-431a-8fff-a88052c9b8ac req-bed28229-882d-4ebc-96ef-9220d1d98bd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.766 2 WARNING nova.compute.manager [req-f6fce170-cc9f-431a-8fff-a88052c9b8ac req-bed28229-882d-4ebc-96ef-9220d1d98bd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state reboot_started_hard.
Sep 30 21:26:23 compute-1 NetworkManager[51724]: <info>  [1759267583.7823] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.794 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[dad094f7-f891-444e-a811-423cd9716727]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.824 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[20e4c732-6253-4e59-8897-61fc3e512691]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427945, 'reachable_time': 39017, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229228, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.848 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[42bf48c9-3912-485b-993b-e6b3e92e9206]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 427945, 'tstamp': 427945}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229229, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.873 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:23 compute-1 nova_compute[192795]: 2025-09-30 21:26:23.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.875 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0db47d0d-cba8-44b9-87d4-6c6a5e532637]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427945, 'reachable_time': 39017, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229230, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:23.925 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[937af349-2df4-4ab1-9610-963116280ad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.035 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cec6623f-359a-438d-8fae-6f9c30e193b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.037 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.038 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.039 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:24 compute-1 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:26:24 compute-1 NetworkManager[51724]: <info>  [1759267584.0436] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.049 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:24 compute-1 ovn_controller[94902]: 2025-09-30T21:26:24Z|00229|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.082 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.083 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[212cb44b-e5a4-46f6-88cd-7d0009ac9fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.084 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.086 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:26:24 compute-1 podman[229268]: 2025-09-30 21:26:24.539230018 +0000 UTC m=+0.056436479 container create 7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:26:24 compute-1 systemd[1]: Started libpod-conmon-7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef.scope.
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.594 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 128bd4be-4a76-4dbb-aef6-65acd9c11cbd due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.596 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267584.594196, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.597 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Resumed (Lifecycle Event)
Sep 30 21:26:24 compute-1 podman[229268]: 2025-09-30 21:26:24.510115716 +0000 UTC m=+0.027322157 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.600 2 DEBUG nova.compute.manager [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.610 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance rebooted successfully.
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.611 2 DEBUG nova.compute.manager [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:24 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:26:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b89a964af58518df348e2fcf8f6672797c0e8d0b352efdac9e804a839dd263d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:26:24 compute-1 podman[229268]: 2025-09-30 21:26:24.639236205 +0000 UTC m=+0.156442636 container init 7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.644 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.648 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:24 compute-1 podman[229268]: 2025-09-30 21:26:24.651602273 +0000 UTC m=+0.168808694 container start 7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.687 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.688 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267584.6003952, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.688 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Started (Lifecycle Event)
Sep 30 21:26:24 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229284]: [NOTICE]   (229298) : New worker (229306) forked
Sep 30 21:26:24 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229284]: [NOTICE]   (229298) : Loading success.
Sep 30 21:26:24 compute-1 podman[229281]: 2025-09-30 21:26:24.708147164 +0000 UTC m=+0.111990815 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.720 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.724 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:24.738 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:26:24 compute-1 nova_compute[192795]: 2025-09-30 21:26:24.751 2 DEBUG oslo_concurrency.lockutils [None req-7347472e-4c07-4925-a1fb-caab4cd763d1 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.841 2 DEBUG nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.842 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.842 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.842 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.842 2 DEBUG nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.842 2 WARNING nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.843 2 DEBUG nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.843 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.843 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.843 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.843 2 DEBUG nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.843 2 WARNING nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.843 2 DEBUG nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.844 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.844 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.844 2 DEBUG oslo_concurrency.lockutils [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.844 2 DEBUG nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.844 2 WARNING nova.compute.manager [req-1e64c76f-9ad0-4c59-b407-dfe93c23d8da req-9545d9a5-e909-4ab4-8258-726e6efad2c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.864 2 INFO nova.compute.manager [None req-74236572-b9c9-44d2-a830-f0e5e2a76f34 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Get console output
Sep 30 21:26:25 compute-1 nova_compute[192795]: 2025-09-30 21:26:25.994 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:26:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:26.743 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:27 compute-1 nova_compute[192795]: 2025-09-30 21:26:27.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:27 compute-1 nova_compute[192795]: 2025-09-30 21:26:27.197 2 INFO nova.compute.manager [None req-f15f5459-83bb-4e98-8f61-c28f4136016b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Get console output
Sep 30 21:26:28 compute-1 nova_compute[192795]: 2025-09-30 21:26:28.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.291 2 DEBUG oslo_concurrency.lockutils [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.292 2 DEBUG oslo_concurrency.lockutils [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.293 2 DEBUG nova.compute.manager [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.299 2 DEBUG nova.compute.manager [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.303 2 DEBUG nova.objects.instance [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'flavor' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.335 2 DEBUG nova.objects.instance [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'info_cache' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:32 compute-1 nova_compute[192795]: 2025-09-30 21:26:32.382 2 DEBUG nova.virt.libvirt.driver [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:26:33 compute-1 nova_compute[192795]: 2025-09-30 21:26:33.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-1 podman[229321]: 2025-09-30 21:26:34.253054868 +0000 UTC m=+0.088350068 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:26:34 compute-1 podman[229319]: 2025-09-30 21:26:34.269841033 +0000 UTC m=+0.103283024 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:26:34 compute-1 podman[229320]: 2025-09-30 21:26:34.31038849 +0000 UTC m=+0.138906169 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:26:37 compute-1 nova_compute[192795]: 2025-09-30 21:26:37.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:38 compute-1 ovn_controller[94902]: 2025-09-30T21:26:38Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:26:38 compute-1 podman[229394]: 2025-09-30 21:26:38.261393523 +0000 UTC m=+0.095515158 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible)
Sep 30 21:26:38 compute-1 nova_compute[192795]: 2025-09-30 21:26:38.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:38.687 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:38.688 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:38.689 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:39 compute-1 nova_compute[192795]: 2025-09-30 21:26:39.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:41 compute-1 sshd-session[229415]: Invalid user ubuntu from 167.71.248.239 port 54718
Sep 30 21:26:41 compute-1 sshd-session[229415]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:26:41 compute-1 sshd-session[229415]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.727 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.727 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.728 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.809 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.909 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.910 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:41 compute-1 nova_compute[192795]: 2025-09-30 21:26:41.974 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.150 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.151 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5549MB free_disk=73.36477661132812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.151 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.152 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.227 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 128bd4be-4a76-4dbb-aef6-65acd9c11cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.228 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.228 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.256 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.275 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.275 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.290 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.311 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.353 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.381 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.420 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.421 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:42 compute-1 nova_compute[192795]: 2025-09-30 21:26:42.436 2 DEBUG nova.virt.libvirt.driver [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:26:43 compute-1 nova_compute[192795]: 2025-09-30 21:26:43.421 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:43 compute-1 nova_compute[192795]: 2025-09-30 21:26:43.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:43 compute-1 sshd-session[229415]: Failed password for invalid user ubuntu from 167.71.248.239 port 54718 ssh2
Sep 30 21:26:43 compute-1 nova_compute[192795]: 2025-09-30 21:26:43.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.018 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'name': 'tempest-ServerActionsTestJSON-server-394601736', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2af578a858a44374a3dc027bbf7c69f2', 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'hostId': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.019 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.019 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>]
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.022 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 128bd4be-4a76-4dbb-aef6-65acd9c11cbd / tap242fb53f-7c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.023 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02610d08-5780-40f0-b593-85e0347d9b87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.020334', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '29881172-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '1acf2827fc8d415d14a0a2de24d3eb71df284224a66642427f07e16536becd14'}]}, 'timestamp': '2025-09-30 21:26:44.023583', '_unique_id': '0ffa9a84a8b748bab0a06c5765e50108'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.025 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.041 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.042 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ba7e197-13d9-4da7-99fc-63b5675232ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.026067', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '298afb9e-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.758841835, 'message_signature': 'a998b3e54ecd5dd6b65b670559c87f2428e1571b3424cb685b3587847a553484'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.026067', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '298b0c4c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.758841835, 'message_signature': 'c3c63e7da56d254930bb3a16529367a7825bdc575a40cb443d3299a2c50877a6'}]}, 'timestamp': '2025-09-30 21:26:44.043027', '_unique_id': 'd55e14bfacab4946b3e8db176bb50ad6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 sshd-session[229415]: Connection closed by invalid user ubuntu 167.71.248.239 port 54718 [preauth]
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.045 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.046 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e2bd070-1320-4781-854d-58251ba816cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.046629', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '298ba954-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '65fb50478fc8badd19cb0cf9beec07dac1cb50bb30adc5782b83ac1e3574b864'}]}, 'timestamp': '2025-09-30 21:26:44.047235', '_unique_id': 'e79b9b8e8708413e86d5eb80d23527ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.049 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96db28a3-52a6-432c-bf02-eebf2c5d201c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.049013', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '298c0368-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '49947a8d06a59e5574bc1eef5636640833281efc2c2c39bf359431a94c0a7d6b'}]}, 'timestamp': '2025-09-30 21:26:44.049390', '_unique_id': '7b7c313d1f2a4c838caefbcb3bba9e2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.051 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.051 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f06a6b7-642c-4ae5-8b7d-c660f1fff902', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.051353', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '298c6362-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': 'a5d66c3f54896e60bcb94426313a81640431b7d041cd23ad7cbbe7259e33a823'}]}, 'timestamp': '2025-09-30 21:26:44.051813', '_unique_id': '44cf6e90e28f4f40aee8205d6560a503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.052 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.053 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.053 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>]
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.053 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.053 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.054 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e37e5b91-b893-47d2-b8a1-91a8fc394a42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.053948', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '298cc4e2-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.758841835, 'message_signature': '658c6d867fcd72dcf34c45cd66d58938428cd2c49d1411cde37e10f43962c6c7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.053948', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '298cd1c6-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.758841835, 'message_signature': '20b80a78de428c3017cfced14736a309d0d1e6fe6fcc0f291bc887a2cbc93260'}]}, 'timestamp': '2025-09-30 21:26:44.054612', '_unique_id': '7d6bfb29ca494e3ba97a081826fa8eea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.055 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.056 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.056 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>]
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.056 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.075 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/cpu volume: 11680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eaac8cf-d20a-4627-b339-89a1a304d85d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11680000000, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'timestamp': '2025-09-30T21:26:44.056790', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '29901674-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.807866406, 'message_signature': 'd1c2b810073292ea54a2e586a64d0ced2f2ee594879c0bb330f3f9fede92b1d6'}]}, 'timestamp': '2025-09-30 21:26:44.076176', '_unique_id': 'b09621f79692436abac20dd32ecd9a87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.077 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.078 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.079 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1af4187-94e1-4636-8280-c506bfaa277d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.078693', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29908c44-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.758841835, 'message_signature': '0b37431e6ecd5ec3d3423008e2d05cf36755749fac4b55c4ea8b626f8460a2f4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.078693', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29909932-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.758841835, 'message_signature': '68265e8229e523e56f5867fb063062782c73102dcc6b5330a5c9873adf242237'}]}, 'timestamp': '2025-09-30 21:26:44.079408', '_unique_id': '9fa6de1408c243c09da63300bf31126e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.106 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.bytes volume: 32065536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.107 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dda79310-b400-45f0-ba3f-8f988d9690aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32065536, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.081467', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2994d86c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': '538cd128653f5b4944a5aa2ee93820b66c1c038b5555f30ad99b115123b2ed82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 
'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.081467', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2994e9ba-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': 'adab66c113559faf5db9143008f6973d4a7138d78a5ab4245eb8620cd5b29b66'}]}, 'timestamp': '2025-09-30 21:26:44.107669', '_unique_id': 'd6a82bb292b448808a3c9ab52ffe6457'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.108 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.109 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.109 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb687351-89f2-4755-a356-20ac31709eb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'timestamp': '2025-09-30T21:26:44.109565', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '29953f1e-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.807866406, 'message_signature': 'c388b44dacdff7c443cff5e6ee0abefaf2c92c0643597c5d73967b5116bb0681'}]}, 'timestamp': '2025-09-30 21:26:44.109806', '_unique_id': '8d7b71eb99694f1c8001f8bd45dc52f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.110 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee0990de-6f1e-4257-8708-2be0d25132c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.110915', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '29957380-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '9e987c7844e198d0c7b6a48eabc1cdb9e072e310d90971ad30e38c7a55270ee6'}]}, 'timestamp': '2025-09-30 21:26:44.111183', '_unique_id': 'd9489b39249445e7b9ddcbc3da0d9cdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.112 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.latency volume: 84243603 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.112 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a01250a-c9c6-4f91-aa55-00899777e015', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 84243603, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.112280', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2995a9a4-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': 'bbfe9952ae7ed6cfea22c51c2acddb5842f5c9a3d2b4c54e241bd53658133d69'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.112280', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2995b1c4-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': '91d3be6745e6924cba0b3a794f678ce7e654845c5fed1fdb326c3f6c5b23110c'}]}, 'timestamp': '2025-09-30 21:26:44.112728', '_unique_id': 'cfc3da9bad2246d4bcb499c127c3af34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.113 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.bytes volume: 1287 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '902750d7-b7fe-4230-b0d2-cf98af94c9dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1287, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.113863', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '2995e676-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '1e8c05c241b980d35d9bc1275b8e76c29774ef78a237c432357c138fa25df61d'}]}, 'timestamp': '2025-09-30 21:26:44.114091', '_unique_id': '91c2d79e47dd48c791f0409b4c277ddf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '816947c1-2083-4119-bdc3-75c438b124e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.115201', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '29961b96-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '7f360e365c4bafd9e30215704791e5507b46b86731cc8d9745dd81b7790ccf27'}]}, 'timestamp': '2025-09-30 21:26:44.115450', '_unique_id': '9dac6da9657c4d2ea30846dcd9c849ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.116 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbc5917c-559c-475c-ad3c-dab1168baef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.116573', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '2996507a-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '7ef57d06df0dc32fb14695360831c76cf41b3fe437b47ef7addadc80e6a9650f'}]}, 'timestamp': '2025-09-30 21:26:44.116807', '_unique_id': '29c42025a2d04eaabf3da8fd73458bfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82af959f-0ca8-40d5-9463-e613a0eba421', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.118032', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '299689f0-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '4c9422bd0d7ca9da174056257b11f9d94a850e0a5d42e02a966f92ebe5750a86'}]}, 'timestamp': '2025-09-30 21:26:44.118279', '_unique_id': '155003cafea5468db3ca801ed4fb4f98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.119 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.bytes volume: 462848 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.119 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d92a25f-ff14-405e-be1f-64ec153d95c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 462848, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.119608', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2996c744-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': 'e8ed6c5741737167498ebd54bc1d6f23ba6a1a8069ccef3399444ba3bdb3b610'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.119608', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2996d036-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': '38ee4de8a421da89c34c3a790a52cfc17d61edd38609852a47c1f274bb5a6f2e'}]}, 'timestamp': '2025-09-30 21:26:44.120100', '_unique_id': 'b9480d21b237426eb4dfa02a4555bd4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.121 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.bytes volume: 1054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13d4094d-8b11-4d44-88fc-56307adbf209', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1054, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:26:44.121442', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '2997103c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.753090622, 'message_signature': '4b8c6289da40c647bc3928f34ec164ac304d4e857235fae9d132acc31fe65159'}]}, 'timestamp': '2025-09-30 21:26:44.121757', '_unique_id': 'd95ba1af5f2344b99046af030ae2a7a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.123 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.requests volume: 59 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.123 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e451a65-fe8c-46f4-8c1c-23896de09dc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 59, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.123321', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29975b00-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': 'e396b40457e20774379c900615de6d5ee877f5688e62a10b99be7d44a3427e7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': 
None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.123321', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '299763fc-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': '7bcb18f8ccfdc9c4a7a64d8d886e1ce5b694b0409183b57bddd8e54c1bef5e59'}]}, 'timestamp': '2025-09-30 21:26:44.123847', '_unique_id': 'fb8a42b66a5e4a288ff71fa5553c2f09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.124 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.125 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.latency volume: 689759262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.125 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.latency volume: 46117718 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7564feea-9d9d-4a3c-aa8b-94ca67673751', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 689759262, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.125163', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29979fd4-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': '065a3fb632665f0035ddafec2d4cfc1a7d809fa1204c69a6905b50828bc83b39'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46117718, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': 
None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.125163', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2997a8da-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': 'c2e650d058e30c69cf1d564db9e50c3fe4edbbc5d91c783955e8c508c9881f19'}]}, 'timestamp': '2025-09-30 21:26:44.125607', '_unique_id': '58d17c19e28449de8cb8e02300929f84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.126 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.requests volume: 1216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09f096de-85d6-4e9b-8fdf-d41b5b487616', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1216, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:26:44.126853', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2997e1ce-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': '83098424e340caaa6c17cf960ad0f8ce5e45123cdd5d2855d948f87f238a3e9a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': 
None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:26:44.126853', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2997eaca-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4299.814265227, 'message_signature': '71627e702348375e8e909dc5b9b664c1fbcd9024dc060bb42afa47d9f09062be'}]}, 'timestamp': '2025-09-30 21:26:44.127313', '_unique_id': '48bd11022025499bb12dfe654887c9e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.127 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.128 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:26:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:26:44.128 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-394601736>]
Sep 30 21:26:44 compute-1 kernel: tap242fb53f-7c (unregistering): left promiscuous mode
Sep 30 21:26:44 compute-1 NetworkManager[51724]: <info>  [1759267604.6397] device (tap242fb53f-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:26:44 compute-1 ovn_controller[94902]: 2025-09-30T21:26:44Z|00230|binding|INFO|Releasing lport 242fb53f-7c71-48ef-a180-00bad1488d61 from this chassis (sb_readonly=0)
Sep 30 21:26:44 compute-1 nova_compute[192795]: 2025-09-30 21:26:44.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:44 compute-1 ovn_controller[94902]: 2025-09-30T21:26:44Z|00231|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 down in Southbound
Sep 30 21:26:44 compute-1 ovn_controller[94902]: 2025-09-30T21:26:44Z|00232|binding|INFO|Removing iface tap242fb53f-7c ovn-installed in OVS
Sep 30 21:26:44 compute-1 nova_compute[192795]: 2025-09-30 21:26:44.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.660 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.662 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.665 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:26:44 compute-1 nova_compute[192795]: 2025-09-30 21:26:44.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.668 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7b427e-d658-46b8-bb8b-a048bae6a871]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.671 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:26:44 compute-1 nova_compute[192795]: 2025-09-30 21:26:44.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:44 compute-1 nova_compute[192795]: 2025-09-30 21:26:44.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:26:44 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000038.scope: Deactivated successfully.
Sep 30 21:26:44 compute-1 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000038.scope: Consumed 14.257s CPU time.
Sep 30 21:26:44 compute-1 systemd-machined[152783]: Machine qemu-29-instance-00000038 terminated.
Sep 30 21:26:44 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229284]: [NOTICE]   (229298) : haproxy version is 2.8.14-c23fe91
Sep 30 21:26:44 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229284]: [NOTICE]   (229298) : path to executable is /usr/sbin/haproxy
Sep 30 21:26:44 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229284]: [WARNING]  (229298) : Exiting Master process...
Sep 30 21:26:44 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229284]: [ALERT]    (229298) : Current worker (229306) exited with code 143 (Terminated)
Sep 30 21:26:44 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229284]: [WARNING]  (229298) : All workers exited. Exiting... (0)
Sep 30 21:26:44 compute-1 systemd[1]: libpod-7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef.scope: Deactivated successfully.
Sep 30 21:26:44 compute-1 podman[229448]: 2025-09-30 21:26:44.845633032 +0000 UTC m=+0.058444364 container died 7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:26:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef-userdata-shm.mount: Deactivated successfully.
Sep 30 21:26:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-b89a964af58518df348e2fcf8f6672797c0e8d0b352efdac9e804a839dd263d4-merged.mount: Deactivated successfully.
Sep 30 21:26:44 compute-1 podman[229448]: 2025-09-30 21:26:44.907941336 +0000 UTC m=+0.120752638 container cleanup 7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:26:44 compute-1 systemd[1]: libpod-conmon-7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef.scope: Deactivated successfully.
Sep 30 21:26:44 compute-1 podman[229492]: 2025-09-30 21:26:44.986330978 +0000 UTC m=+0.050743619 container remove 7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.992 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d282c5e4-64cc-4222-a6e8-108893726d6b]: (4, ('Tue Sep 30 09:26:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef)\n7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef\nTue Sep 30 09:26:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef)\n7ff85a2fdbc75750115628d108be6305a752e6f831d4568e924ac197b247e9ef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.994 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b08c248a-a3bd-40b8-9995-452ada9cfff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:44.994 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:44 compute-1 nova_compute[192795]: 2025-09-30 21:26:44.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:44 compute-1 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:26:45 compute-1 nova_compute[192795]: 2025-09-30 21:26:45.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:45.021 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[25500635-98ac-49a1-864d-2a64eea3dd81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:45.053 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[40c768ed-7e0a-4f2c-b6ac-207e76b186c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:45.054 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6a54b472-3097-4471-b4d5-404a267f4726]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:45.073 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bea59353-27ea-48b5-a414-e717706d98a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427935, 'reachable_time': 34980, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229512, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-1 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:26:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:45.079 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:26:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:45.079 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[abfca8d6-9340-4b62-bedc-ded40dee37e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-1 nova_compute[192795]: 2025-09-30 21:26:45.456 2 INFO nova.virt.libvirt.driver [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance shutdown successfully after 13 seconds.
Sep 30 21:26:45 compute-1 nova_compute[192795]: 2025-09-30 21:26:45.465 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance destroyed successfully.
Sep 30 21:26:45 compute-1 nova_compute[192795]: 2025-09-30 21:26:45.466 2 DEBUG nova.objects.instance [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:45 compute-1 nova_compute[192795]: 2025-09-30 21:26:45.480 2 DEBUG nova.compute.manager [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:45 compute-1 nova_compute[192795]: 2025-09-30 21:26:45.574 2 DEBUG oslo_concurrency.lockutils [None req-da3cadcf-a33d-44da-996b-be54d8bc19f0 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:45 compute-1 nova_compute[192795]: 2025-09-30 21:26:45.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.232 2 DEBUG nova.compute.manager [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.233 2 DEBUG oslo_concurrency.lockutils [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.233 2 DEBUG oslo_concurrency.lockutils [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.233 2 DEBUG oslo_concurrency.lockutils [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.234 2 DEBUG nova.compute.manager [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.234 2 WARNING nova.compute.manager [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state stopped and task_state None.
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.234 2 DEBUG nova.compute.manager [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.234 2 DEBUG oslo_concurrency.lockutils [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:46 compute-1 podman[229513]: 2025-09-30 21:26:46.269920154 +0000 UTC m=+0.087394362 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, version=9.6, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible)
Sep 30 21:26:46 compute-1 podman[229515]: 2025-09-30 21:26:46.276002126 +0000 UTC m=+0.082084322 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:26:46 compute-1 podman[229514]: 2025-09-30 21:26:46.292582626 +0000 UTC m=+0.108846942 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.399 2 DEBUG oslo_concurrency.lockutils [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.400 2 DEBUG oslo_concurrency.lockutils [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.401 2 DEBUG nova.compute.manager [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:46 compute-1 nova_compute[192795]: 2025-09-30 21:26:46.401 2 WARNING nova.compute.manager [req-625c40b6-2d28-41a4-961a-a9e700e89654 req-db80b88d-3166-4a3e-b02c-9a67852ebb62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state stopped and task_state None.
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.390 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'flavor' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.418 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'info_cache' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.445 2 DEBUG oslo_concurrency.lockutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.445 2 DEBUG oslo_concurrency.lockutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.445 2 DEBUG nova.network.neutron [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:26:47 compute-1 nova_compute[192795]: 2025-09-30 21:26:47.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:48 compute-1 nova_compute[192795]: 2025-09-30 21:26:48.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:48 compute-1 nova_compute[192795]: 2025-09-30 21:26:48.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.296 2 DEBUG nova.network.neutron [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.325 2 DEBUG oslo_concurrency.lockutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.361 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance destroyed successfully.
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.362 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.378 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.393 2 DEBUG nova.virt.libvirt.vif [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:26:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.394 2 DEBUG nova.network.os_vif_util [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.395 2 DEBUG nova.network.os_vif_util [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.396 2 DEBUG os_vif [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.399 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap242fb53f-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.409 2 INFO os_vif [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.420 2 DEBUG nova.virt.libvirt.driver [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start _get_guest_xml network_info=[{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.427 2 WARNING nova.virt.libvirt.driver [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.436 2 DEBUG nova.virt.libvirt.host [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.437 2 DEBUG nova.virt.libvirt.host [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.444 2 DEBUG nova.virt.libvirt.host [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.445 2 DEBUG nova.virt.libvirt.host [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.449 2 DEBUG nova.virt.libvirt.driver [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.450 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.451 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.451 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.452 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.452 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.453 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.453 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.454 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.455 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.455 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.456 2 DEBUG nova.virt.hardware [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.457 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.480 2 DEBUG nova.virt.libvirt.vif [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:26:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.481 2 DEBUG nova.network.os_vif_util [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.482 2 DEBUG nova.network.os_vif_util [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.484 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.498 2 DEBUG nova.virt.libvirt.driver [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <uuid>128bd4be-4a76-4dbb-aef6-65acd9c11cbd</uuid>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <name>instance-00000038</name>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestJSON-server-394601736</nova:name>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:26:49</nova:creationTime>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         <nova:port uuid="242fb53f-7c71-48ef-a180-00bad1488d61">
Sep 30 21:26:49 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <system>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <entry name="serial">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <entry name="uuid">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </system>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <os>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   </os>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <features>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   </features>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:65:e3:f2"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <target dev="tap242fb53f-7c"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/console.log" append="off"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <video>
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </video>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:26:49 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:26:49 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:26:49 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:26:49 compute-1 nova_compute[192795]: </domain>
Sep 30 21:26:49 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.501 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.592 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.595 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.664 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.667 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.692 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.763 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.765 2 DEBUG nova.virt.disk.api [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.765 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.838 2 DEBUG oslo_concurrency.processutils [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.840 2 DEBUG nova.virt.disk.api [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.841 2 DEBUG nova.objects.instance [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.858 2 DEBUG nova.virt.libvirt.vif [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.858 2 DEBUG nova.network.os_vif_util [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.860 2 DEBUG nova.network.os_vif_util [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.860 2 DEBUG os_vif [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.862 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.863 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.867 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap242fb53f-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.868 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap242fb53f-7c, col_values=(('external_ids', {'iface-id': '242fb53f-7c71-48ef-a180-00bad1488d61', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:e3:f2', 'vm-uuid': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:49 compute-1 NetworkManager[51724]: <info>  [1759267609.8709] manager: (tap242fb53f-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.879 2 INFO os_vif [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:26:49 compute-1 kernel: tap242fb53f-7c: entered promiscuous mode
Sep 30 21:26:49 compute-1 nova_compute[192795]: 2025-09-30 21:26:49.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:49 compute-1 ovn_controller[94902]: 2025-09-30T21:26:49Z|00233|binding|INFO|Claiming lport 242fb53f-7c71-48ef-a180-00bad1488d61 for this chassis.
Sep 30 21:26:49 compute-1 ovn_controller[94902]: 2025-09-30T21:26:49Z|00234|binding|INFO|242fb53f-7c71-48ef-a180-00bad1488d61: Claiming fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:26:49 compute-1 NetworkManager[51724]: <info>  [1759267609.9859] manager: (tap242fb53f-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Sep 30 21:26:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:49.991 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '7', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:49.994 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:26:49 compute-1 ovn_controller[94902]: 2025-09-30T21:26:49Z|00235|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 ovn-installed in OVS
Sep 30 21:26:49 compute-1 ovn_controller[94902]: 2025-09-30T21:26:49Z|00236|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 up in Southbound
Sep 30 21:26:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:49.998 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.063 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[60b314bd-02c8-450a-bec3-2cbebf59ea41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.065 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.067 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.067 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6f872960-02e3-4b4d-b023-d905a2c285af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.070 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c588179f-bbf4-4f3c-9796-7f2312999795]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 systemd-machined[152783]: New machine qemu-30-instance-00000038.
Sep 30 21:26:50 compute-1 systemd[1]: Started Virtual Machine qemu-30-instance-00000038.
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.087 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[72bd7e28-63e8-40e7-929c-ee9708449005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 systemd-udevd[229606]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:26:50 compute-1 NetworkManager[51724]: <info>  [1759267610.1211] device (tap242fb53f-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.121 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fed65cdc-6327-42b8-bf9c-d763997c255c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 NetworkManager[51724]: <info>  [1759267610.1263] device (tap242fb53f-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.166 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6acaaa8c-f922-466e-b108-10f5fefabd20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 NetworkManager[51724]: <info>  [1759267610.1743] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.174 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3342197f-39b6-4c72-84be-9f38d34823a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 systemd-udevd[229609]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.218 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3be7276d-0705-4aa8-874b-e5f942fb46a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.222 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[222b90ae-1e95-40bc-8489-32f3a0a61571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 NetworkManager[51724]: <info>  [1759267610.2509] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.258 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1e0ed4-b805-4d0e-aa68-a4c8a259039d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.279 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ff53e7-64fc-4231-ac96-55f5417f0f02]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430592, 'reachable_time': 40240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229639, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.300 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[deed73a8-cb1d-48c7-ac01-01b3b0a21ff7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430592, 'tstamp': 430592}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229644, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.312 2 DEBUG nova.compute.manager [req-9931c0c2-b743-4de6-8f94-07712d104530 req-3a750647-14f8-4f98-850d-292c689afb0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.313 2 DEBUG oslo_concurrency.lockutils [req-9931c0c2-b743-4de6-8f94-07712d104530 req-3a750647-14f8-4f98-850d-292c689afb0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.314 2 DEBUG oslo_concurrency.lockutils [req-9931c0c2-b743-4de6-8f94-07712d104530 req-3a750647-14f8-4f98-850d-292c689afb0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.314 2 DEBUG oslo_concurrency.lockutils [req-9931c0c2-b743-4de6-8f94-07712d104530 req-3a750647-14f8-4f98-850d-292c689afb0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.315 2 DEBUG nova.compute.manager [req-9931c0c2-b743-4de6-8f94-07712d104530 req-3a750647-14f8-4f98-850d-292c689afb0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.315 2 WARNING nova.compute.manager [req-9931c0c2-b743-4de6-8f94-07712d104530 req-3a750647-14f8-4f98-850d-292c689afb0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state stopped and task_state powering-on.
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.323 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6a825bf8-575b-4c17-8be6-a0b8ac5b91a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430592, 'reachable_time': 40240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229645, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.369 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b0da884d-8f9f-4183-beb5-0bca0a97f7f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.441 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf1892f-8c20-4f0e-86c5-30ee1727e741]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.443 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.443 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.444 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:50 compute-1 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:50 compute-1 NetworkManager[51724]: <info>  [1759267610.4470] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.450 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:50 compute-1 ovn_controller[94902]: 2025-09-30T21:26:50Z|00237|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.470 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.472 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b9507e-93ff-4e8b-8962-1b120da19fe2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.473 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:26:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:26:50.473 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.621 2 DEBUG nova.compute.manager [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.712 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.712 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.712 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.712 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.720 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.720 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.745 2 DEBUG nova.objects.instance [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.761 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.762 2 INFO nova.compute.claims [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.762 2 DEBUG nova.objects.instance [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'resources' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.780 2 DEBUG nova.objects.instance [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.817 2 INFO nova.compute.resource_tracker [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating resource usage from migration aa12879e-c6a8-4b86-8b41-bda11bc7ed2a
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.817 2 DEBUG nova.compute.resource_tracker [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Starting to track incoming migration aa12879e-c6a8-4b86-8b41-bda11bc7ed2a with flavor c9779bca-1eb6-4567-a36c-b452abeafc70 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.820 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 128bd4be-4a76-4dbb-aef6-65acd9c11cbd due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.820 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267610.8198574, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.820 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Resumed (Lifecycle Event)
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.824 2 DEBUG nova.compute.manager [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.828 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance rebooted successfully.
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.828 2 DEBUG nova.compute.manager [None req-651bece8-3450-4e13-96bf-4fea984344df 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.836 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.845 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.867 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (powering-on). Skip.
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.868 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267610.823956, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.868 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Started (Lifecycle Event)
Sep 30 21:26:50 compute-1 podman[229678]: 2025-09-30 21:26:50.888906006 +0000 UTC m=+0.053398219 container create bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.891 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.909 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.934 2 DEBUG nova.compute.provider_tree [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.947 2 DEBUG nova.scheduler.client.report [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:50 compute-1 systemd[1]: Started libpod-conmon-bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c.scope.
Sep 30 21:26:50 compute-1 podman[229678]: 2025-09-30 21:26:50.863944342 +0000 UTC m=+0.028436565 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.969 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:50 compute-1 nova_compute[192795]: 2025-09-30 21:26:50.969 2 INFO nova.compute.manager [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Migrating
Sep 30 21:26:50 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:26:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b84c451430759ebb6038e1ecf9f91950ef23915947248e08708a3996611fc4f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:26:51 compute-1 podman[229678]: 2025-09-30 21:26:51.013691729 +0000 UTC m=+0.178183962 container init bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:26:51 compute-1 podman[229678]: 2025-09-30 21:26:51.020783998 +0000 UTC m=+0.185276211 container start bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:26:51 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229693]: [NOTICE]   (229697) : New worker (229699) forked
Sep 30 21:26:51 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229693]: [NOTICE]   (229697) : Loading success.
Sep 30 21:26:51 compute-1 nova_compute[192795]: 2025-09-30 21:26:51.987 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.003 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.004 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.434 2 DEBUG nova.compute.manager [req-c0cf6186-e241-4d56-ad70-711df283c807 req-0459716a-aa46-4047-921c-713e16de6110 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.434 2 DEBUG oslo_concurrency.lockutils [req-c0cf6186-e241-4d56-ad70-711df283c807 req-0459716a-aa46-4047-921c-713e16de6110 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.435 2 DEBUG oslo_concurrency.lockutils [req-c0cf6186-e241-4d56-ad70-711df283c807 req-0459716a-aa46-4047-921c-713e16de6110 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.435 2 DEBUG oslo_concurrency.lockutils [req-c0cf6186-e241-4d56-ad70-711df283c807 req-0459716a-aa46-4047-921c-713e16de6110 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.435 2 DEBUG nova.compute.manager [req-c0cf6186-e241-4d56-ad70-711df283c807 req-0459716a-aa46-4047-921c-713e16de6110 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:52 compute-1 nova_compute[192795]: 2025-09-30 21:26:52.436 2 WARNING nova.compute.manager [req-c0cf6186-e241-4d56-ad70-711df283c807 req-0459716a-aa46-4047-921c-713e16de6110 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:26:52 compute-1 sshd-session[229709]: Accepted publickey for nova from 192.168.122.100 port 51778 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:26:52 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:26:52 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:26:52 compute-1 systemd-logind[793]: New session 42 of user nova.
Sep 30 21:26:52 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:26:52 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:26:52 compute-1 systemd[229713]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:26:52 compute-1 systemd[229713]: Queued start job for default target Main User Target.
Sep 30 21:26:52 compute-1 systemd[229713]: Created slice User Application Slice.
Sep 30 21:26:52 compute-1 systemd[229713]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:26:52 compute-1 systemd[229713]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:26:52 compute-1 systemd[229713]: Reached target Paths.
Sep 30 21:26:52 compute-1 systemd[229713]: Reached target Timers.
Sep 30 21:26:52 compute-1 systemd[229713]: Starting D-Bus User Message Bus Socket...
Sep 30 21:26:52 compute-1 systemd[229713]: Starting Create User's Volatile Files and Directories...
Sep 30 21:26:52 compute-1 systemd[229713]: Finished Create User's Volatile Files and Directories.
Sep 30 21:26:52 compute-1 systemd[229713]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:26:52 compute-1 systemd[229713]: Reached target Sockets.
Sep 30 21:26:52 compute-1 systemd[229713]: Reached target Basic System.
Sep 30 21:26:52 compute-1 systemd[229713]: Reached target Main User Target.
Sep 30 21:26:52 compute-1 systemd[229713]: Startup finished in 171ms.
Sep 30 21:26:52 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:26:52 compute-1 systemd[1]: Started Session 42 of User nova.
Sep 30 21:26:52 compute-1 sshd-session[229709]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:26:52 compute-1 sshd-session[229728]: Received disconnect from 192.168.122.100 port 51778:11: disconnected by user
Sep 30 21:26:52 compute-1 sshd-session[229728]: Disconnected from user nova 192.168.122.100 port 51778
Sep 30 21:26:52 compute-1 sshd-session[229709]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:26:52 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Sep 30 21:26:52 compute-1 systemd-logind[793]: Session 42 logged out. Waiting for processes to exit.
Sep 30 21:26:52 compute-1 systemd-logind[793]: Removed session 42.
Sep 30 21:26:53 compute-1 sshd-session[229730]: Accepted publickey for nova from 192.168.122.100 port 51786 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:26:53 compute-1 systemd-logind[793]: New session 44 of user nova.
Sep 30 21:26:53 compute-1 systemd[1]: Started Session 44 of User nova.
Sep 30 21:26:53 compute-1 sshd-session[229730]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:26:53 compute-1 sshd-session[229733]: Received disconnect from 192.168.122.100 port 51786:11: disconnected by user
Sep 30 21:26:53 compute-1 sshd-session[229733]: Disconnected from user nova 192.168.122.100 port 51786
Sep 30 21:26:53 compute-1 sshd-session[229730]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:26:53 compute-1 systemd-logind[793]: Session 44 logged out. Waiting for processes to exit.
Sep 30 21:26:53 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Sep 30 21:26:53 compute-1 systemd-logind[793]: Removed session 44.
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.545 2 INFO nova.compute.manager [None req-cfbcdd55-8fe2-4627-a880-a0818eaa254f 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Pausing
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.548 2 DEBUG nova.objects.instance [None req-cfbcdd55-8fe2-4627-a880-a0818eaa254f 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'flavor' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.596 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267613.5965965, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.597 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Paused (Lifecycle Event)
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.598 2 DEBUG nova.compute.manager [None req-cfbcdd55-8fe2-4627-a880-a0818eaa254f 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.629 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.633 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.656 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (pausing). Skip.
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.844 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.845 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.866 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.974 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.975 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.983 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:26:53 compute-1 nova_compute[192795]: 2025-09-30 21:26:53.984 2 INFO nova.compute.claims [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.236 2 DEBUG nova.compute.provider_tree [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.255 2 DEBUG nova.scheduler.client.report [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.279 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.280 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.336 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.336 2 DEBUG nova.network.neutron [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.355 2 INFO nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.375 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.487 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.488 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.488 2 INFO nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Creating image(s)
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.489 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "/var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.489 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "/var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.490 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "/var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.503 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.552 2 DEBUG nova.policy [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa01576fa62e4e208b7362e64674479f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1b9086019024e4cbbc8aee7f8972fd1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.585 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.586 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.587 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.599 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.661 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.662 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.718 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.719 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.720 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.791 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.792 2 DEBUG nova.virt.disk.api [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Checking if we can resize image /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.792 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.858 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.859 2 DEBUG nova.virt.disk.api [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Cannot resize image /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.860 2 DEBUG nova.objects.instance [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lazy-loading 'migration_context' on Instance uuid fd0159fd-1f19-45df-a72d-25de1e287dcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.872 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.873 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Ensure instance console log exists: /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.873 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.873 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:54 compute-1 nova_compute[192795]: 2025-09-30 21:26:54.874 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:55 compute-1 podman[229750]: 2025-09-30 21:26:55.224388797 +0000 UTC m=+0.062296654 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:26:56 compute-1 nova_compute[192795]: 2025-09-30 21:26:56.900 2 DEBUG nova.network.neutron [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Successfully created port: 4467e22a-a4e7-4951-b98a-c78e9f926681 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:26:56 compute-1 nova_compute[192795]: 2025-09-30 21:26:56.977 2 INFO nova.compute.manager [None req-54f33a81-da2b-405f-bc79-dc78452317d6 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Unpausing
Sep 30 21:26:56 compute-1 nova_compute[192795]: 2025-09-30 21:26:56.978 2 DEBUG nova.objects.instance [None req-54f33a81-da2b-405f-bc79-dc78452317d6 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'flavor' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.012 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267617.0120797, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.013 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Resumed (Lifecycle Event)
Sep 30 21:26:57 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.017 2 DEBUG nova.virt.libvirt.guest [None req-54f33a81-da2b-405f-bc79-dc78452317d6 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.018 2 DEBUG nova.compute.manager [None req-54f33a81-da2b-405f-bc79-dc78452317d6 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.045 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.050 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.079 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (unpausing). Skip.
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.587 2 DEBUG nova.network.neutron [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Successfully updated port: 4467e22a-a4e7-4951-b98a-c78e9f926681 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.610 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.611 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquired lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.611 2 DEBUG nova.network.neutron [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.811 2 DEBUG nova.compute.manager [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-changed-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.811 2 DEBUG nova.compute.manager [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Refreshing instance network info cache due to event network-changed-4467e22a-a4e7-4951-b98a-c78e9f926681. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:26:57 compute-1 nova_compute[192795]: 2025-09-30 21:26:57.811 2 DEBUG oslo_concurrency.lockutils [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:58 compute-1 nova_compute[192795]: 2025-09-30 21:26:58.550 2 DEBUG nova.network.neutron [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.502 2 DEBUG nova.network.neutron [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updating instance_info_cache with network_info: [{"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.546 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Releasing lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.546 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Instance network_info: |[{"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.547 2 DEBUG oslo_concurrency.lockutils [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.547 2 DEBUG nova.network.neutron [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Refreshing network info cache for port 4467e22a-a4e7-4951-b98a-c78e9f926681 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.552 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Start _get_guest_xml network_info=[{"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.558 2 WARNING nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.565 2 DEBUG nova.virt.libvirt.host [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.565 2 DEBUG nova.virt.libvirt.host [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.572 2 DEBUG nova.virt.libvirt.host [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.573 2 DEBUG nova.virt.libvirt.host [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.574 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.574 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.575 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.575 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.575 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.576 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.576 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.576 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.576 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.576 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.577 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.577 2 DEBUG nova.virt.hardware [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.580 2 DEBUG nova.virt.libvirt.vif [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1949663081',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1949663081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-194966308',id=63,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1b9086019024e4cbbc8aee7f8972fd1',ramdisk_id='',reservation_id='r-74wrhydr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-862544542',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-862544542-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:54Z,user_data=None,user_id='aa01576fa62e4e208b7362e64674479f',uuid=fd0159fd-1f19-45df-a72d-25de1e287dcd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.580 2 DEBUG nova.network.os_vif_util [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Converting VIF {"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.581 2 DEBUG nova.network.os_vif_util [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:e1,bridge_name='br-int',has_traffic_filtering=True,id=4467e22a-a4e7-4951-b98a-c78e9f926681,network=Network(093b4e0f-6b01-42f3-8ea3-902bf0bf0397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4467e22a-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.582 2 DEBUG nova.objects.instance [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lazy-loading 'pci_devices' on Instance uuid fd0159fd-1f19-45df-a72d-25de1e287dcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.600 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <uuid>fd0159fd-1f19-45df-a72d-25de1e287dcd</uuid>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <name>instance-0000003f</name>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-1949663081</nova:name>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:26:59</nova:creationTime>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:user uuid="aa01576fa62e4e208b7362e64674479f">tempest-FloatingIPsAssociationNegativeTestJSON-862544542-project-member</nova:user>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:project uuid="b1b9086019024e4cbbc8aee7f8972fd1">tempest-FloatingIPsAssociationNegativeTestJSON-862544542</nova:project>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         <nova:port uuid="4467e22a-a4e7-4951-b98a-c78e9f926681">
Sep 30 21:26:59 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <system>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <entry name="serial">fd0159fd-1f19-45df-a72d-25de1e287dcd</entry>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <entry name="uuid">fd0159fd-1f19-45df-a72d-25de1e287dcd</entry>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </system>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <os>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   </os>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <features>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   </features>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk.config"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:51:a5:e1"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <target dev="tap4467e22a-a4"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/console.log" append="off"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <video>
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </video>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:26:59 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:26:59 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:26:59 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:26:59 compute-1 nova_compute[192795]: </domain>
Sep 30 21:26:59 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.602 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Preparing to wait for external event network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.602 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.602 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.602 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.603 2 DEBUG nova.virt.libvirt.vif [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1949663081',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1949663081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-194966308',id=63,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1b9086019024e4cbbc8aee7f8972fd1',ramdisk_id='',reservation_id='r-74wrhydr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-862544542',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-862544542-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:54Z,user_data=None,user_id='aa01576fa62e4e208b7362e64674479f',uuid=fd0159fd-1f19-45df-a72d-25de1e287dcd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.604 2 DEBUG nova.network.os_vif_util [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Converting VIF {"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.605 2 DEBUG nova.network.os_vif_util [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:e1,bridge_name='br-int',has_traffic_filtering=True,id=4467e22a-a4e7-4951-b98a-c78e9f926681,network=Network(093b4e0f-6b01-42f3-8ea3-902bf0bf0397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4467e22a-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.605 2 DEBUG os_vif [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:e1,bridge_name='br-int',has_traffic_filtering=True,id=4467e22a-a4e7-4951-b98a-c78e9f926681,network=Network(093b4e0f-6b01-42f3-8ea3-902bf0bf0397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4467e22a-a4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.606 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.607 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.611 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4467e22a-a4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.612 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4467e22a-a4, col_values=(('external_ids', {'iface-id': '4467e22a-a4e7-4951-b98a-c78e9f926681', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:a5:e1', 'vm-uuid': 'fd0159fd-1f19-45df-a72d-25de1e287dcd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:59 compute-1 NetworkManager[51724]: <info>  [1759267619.6148] manager: (tap4467e22a-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.622 2 INFO os_vif [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:a5:e1,bridge_name='br-int',has_traffic_filtering=True,id=4467e22a-a4e7-4951-b98a-c78e9f926681,network=Network(093b4e0f-6b01-42f3-8ea3-902bf0bf0397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4467e22a-a4')
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.682 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.683 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.683 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] No VIF found with MAC fa:16:3e:51:a5:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:26:59 compute-1 nova_compute[192795]: 2025-09-30 21:26:59.684 2 INFO nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Using config drive
Sep 30 21:27:00 compute-1 nova_compute[192795]: 2025-09-30 21:27:00.589 2 INFO nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Creating config drive at /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk.config
Sep 30 21:27:00 compute-1 nova_compute[192795]: 2025-09-30 21:27:00.599 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0l_84dsk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:00 compute-1 nova_compute[192795]: 2025-09-30 21:27:00.738 2 DEBUG oslo_concurrency.processutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0l_84dsk" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:00 compute-1 kernel: tap4467e22a-a4: entered promiscuous mode
Sep 30 21:27:00 compute-1 NetworkManager[51724]: <info>  [1759267620.8184] manager: (tap4467e22a-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Sep 30 21:27:00 compute-1 nova_compute[192795]: 2025-09-30 21:27:00.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:00 compute-1 ovn_controller[94902]: 2025-09-30T21:27:00Z|00238|binding|INFO|Claiming lport 4467e22a-a4e7-4951-b98a-c78e9f926681 for this chassis.
Sep 30 21:27:00 compute-1 ovn_controller[94902]: 2025-09-30T21:27:00Z|00239|binding|INFO|4467e22a-a4e7-4951-b98a-c78e9f926681: Claiming fa:16:3e:51:a5:e1 10.100.0.7
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.833 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a5:e1 10.100.0.7'], port_security=['fa:16:3e:51:a5:e1 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fd0159fd-1f19-45df-a72d-25de1e287dcd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1b9086019024e4cbbc8aee7f8972fd1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '466b3098-7b1d-4e9f-af8d-2137ea8ba1ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78f5a320-60ed-4426-bcd7-1ccda3e9f5c7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=4467e22a-a4e7-4951-b98a-c78e9f926681) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.834 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 4467e22a-a4e7-4951-b98a-c78e9f926681 in datapath 093b4e0f-6b01-42f3-8ea3-902bf0bf0397 bound to our chassis
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.836 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 093b4e0f-6b01-42f3-8ea3-902bf0bf0397
Sep 30 21:27:00 compute-1 ovn_controller[94902]: 2025-09-30T21:27:00Z|00240|binding|INFO|Setting lport 4467e22a-a4e7-4951-b98a-c78e9f926681 ovn-installed in OVS
Sep 30 21:27:00 compute-1 ovn_controller[94902]: 2025-09-30T21:27:00Z|00241|binding|INFO|Setting lport 4467e22a-a4e7-4951-b98a-c78e9f926681 up in Southbound
Sep 30 21:27:00 compute-1 nova_compute[192795]: 2025-09-30 21:27:00.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:00 compute-1 nova_compute[192795]: 2025-09-30 21:27:00.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.856 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d72c1e27-1305-44d8-a144-d9d5a2806534]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.857 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap093b4e0f-61 in ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.859 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap093b4e0f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.859 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c9df6f-017a-4e80-8f33-dd69d2d1cef8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.860 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c63ae58c-0860-42a6-b785-02ddbcdfbac0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 systemd-udevd[229792]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.880 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[f913d9b5-c5da-4f33-9498-fd4e93a6fb98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 systemd-machined[152783]: New machine qemu-31-instance-0000003f.
Sep 30 21:27:00 compute-1 NetworkManager[51724]: <info>  [1759267620.8972] device (tap4467e22a-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:27:00 compute-1 NetworkManager[51724]: <info>  [1759267620.8981] device (tap4467e22a-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:27:00 compute-1 systemd[1]: Started Virtual Machine qemu-31-instance-0000003f.
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.901 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ee4384-4833-4d84-b58c-30be922324e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.932 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ee5deec3-5695-4f2e-96fc-3ea122b42d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.939 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[01a48d0d-fade-414e-bda4-dfca6e8daf14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 NetworkManager[51724]: <info>  [1759267620.9407] manager: (tap093b4e0f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.976 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[70561cab-931d-48a5-ba44-00410a7495ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:00.980 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b5531679-51d7-4f37-9f54-994f7540fce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 NetworkManager[51724]: <info>  [1759267621.0162] device (tap093b4e0f-60): carrier: link connected
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.024 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[8c72324b-d844-4975-bd4f-f576d64d1ef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.043 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[73edbb16-4592-4528-92a9-5bb855cdfb32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap093b4e0f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:3b:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431668, 'reachable_time': 34647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229823, 'error': None, 'target': 'ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.063 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d51176c6-cea6-4234-a444-5121660e4933]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:3b31'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431668, 'tstamp': 431668}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229824, 'error': None, 'target': 'ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.082 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0c35c990-9e71-4ed8-9373-94e353172b36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap093b4e0f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:3b:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431668, 'reachable_time': 34647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229825, 'error': None, 'target': 'ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.116 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4489ab17-dedd-4343-aed9-b88bcda6a929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.200 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0e6964-6014-40e3-908c-66c8e1b72aa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.203 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap093b4e0f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.204 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.204 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap093b4e0f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:01 compute-1 kernel: tap093b4e0f-60: entered promiscuous mode
Sep 30 21:27:01 compute-1 NetworkManager[51724]: <info>  [1759267621.2083] manager: (tap093b4e0f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.217 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap093b4e0f-60, col_values=(('external_ids', {'iface-id': '7f73ff06-fed5-4055-89ee-1a2cf9bedacd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:01 compute-1 ovn_controller[94902]: 2025-09-30T21:27:01Z|00242|binding|INFO|Releasing lport 7f73ff06-fed5-4055-89ee-1a2cf9bedacd from this chassis (sb_readonly=0)
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.222 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/093b4e0f-6b01-42f3-8ea3-902bf0bf0397.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/093b4e0f-6b01-42f3-8ea3-902bf0bf0397.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.235 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[14f88849-29fd-4e66-84e3-6cffe3691159]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.236 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-093b4e0f-6b01-42f3-8ea3-902bf0bf0397
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/093b4e0f-6b01-42f3-8ea3-902bf0bf0397.pid.haproxy
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 093b4e0f-6b01-42f3-8ea3-902bf0bf0397
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:27:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:01.236 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'env', 'PROCESS_TAG=haproxy-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/093b4e0f-6b01-42f3-8ea3-902bf0bf0397.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.646 2 DEBUG nova.compute.manager [req-748cc591-0791-46af-ae3c-b9995c26a705 req-75dc45ce-b515-485c-b5d2-b2d7a89bf839 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.647 2 DEBUG oslo_concurrency.lockutils [req-748cc591-0791-46af-ae3c-b9995c26a705 req-75dc45ce-b515-485c-b5d2-b2d7a89bf839 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.647 2 DEBUG oslo_concurrency.lockutils [req-748cc591-0791-46af-ae3c-b9995c26a705 req-75dc45ce-b515-485c-b5d2-b2d7a89bf839 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.647 2 DEBUG oslo_concurrency.lockutils [req-748cc591-0791-46af-ae3c-b9995c26a705 req-75dc45ce-b515-485c-b5d2-b2d7a89bf839 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.647 2 DEBUG nova.compute.manager [req-748cc591-0791-46af-ae3c-b9995c26a705 req-75dc45ce-b515-485c-b5d2-b2d7a89bf839 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Processing event network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:27:01 compute-1 podman[229864]: 2025-09-30 21:27:01.686103695 +0000 UTC m=+0.069216040 container create 75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.736 2 DEBUG nova.network.neutron [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updated VIF entry in instance network info cache for port 4467e22a-a4e7-4951-b98a-c78e9f926681. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.738 2 DEBUG nova.network.neutron [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updating instance_info_cache with network_info: [{"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:01 compute-1 systemd[1]: Started libpod-conmon-75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82.scope.
Sep 30 21:27:01 compute-1 podman[229864]: 2025-09-30 21:27:01.651288855 +0000 UTC m=+0.034401210 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.761 2 DEBUG oslo_concurrency.lockutils [req-30eed7e9-1d3a-4f8b-b1af-f77eed545f8d req-33626e92-68d6-4c34-abe1-e973307e062e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:01 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:27:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0117203bc0cb1b50d19bb8c0e8772dc08711ebd613d2d25c89be58a9b18f6ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:27:01 compute-1 podman[229864]: 2025-09-30 21:27:01.797044407 +0000 UTC m=+0.180156832 container init 75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:27:01 compute-1 podman[229864]: 2025-09-30 21:27:01.804706422 +0000 UTC m=+0.187818767 container start 75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:27:01 compute-1 neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397[229879]: [NOTICE]   (229883) : New worker (229885) forked
Sep 30 21:27:01 compute-1 neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397[229879]: [NOTICE]   (229883) : Loading success.
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.869 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267621.8691902, fd0159fd-1f19-45df-a72d-25de1e287dcd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.870 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] VM Started (Lifecycle Event)
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.872 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.882 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.885 2 INFO nova.virt.libvirt.driver [-] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Instance spawned successfully.
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.886 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.893 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.896 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.904 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.904 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.905 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.905 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.905 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.907 2 DEBUG nova.virt.libvirt.driver [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.913 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.913 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267621.8693905, fd0159fd-1f19-45df-a72d-25de1e287dcd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.913 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] VM Paused (Lifecycle Event)
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.939 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.942 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267621.8813777, fd0159fd-1f19-45df-a72d-25de1e287dcd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.943 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] VM Resumed (Lifecycle Event)
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.968 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.971 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.996 2 INFO nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Took 7.51 seconds to spawn the instance on the hypervisor.
Sep 30 21:27:01 compute-1 nova_compute[192795]: 2025-09-30 21:27:01.996 2 DEBUG nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:02 compute-1 nova_compute[192795]: 2025-09-30 21:27:02.003 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:27:02 compute-1 nova_compute[192795]: 2025-09-30 21:27:02.098 2 INFO nova.compute.manager [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Took 8.15 seconds to build instance.
Sep 30 21:27:02 compute-1 nova_compute[192795]: 2025-09-30 21:27:02.126 2 DEBUG oslo_concurrency.lockutils [None req-489377ab-6787-4898-a31f-7e41eeab5f64 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:02 compute-1 nova_compute[192795]: 2025-09-30 21:27:02.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:03 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:27:03 compute-1 systemd[229713]: Activating special unit Exit the Session...
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped target Main User Target.
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped target Basic System.
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped target Paths.
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped target Sockets.
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped target Timers.
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:27:03 compute-1 systemd[229713]: Closed D-Bus User Message Bus Socket.
Sep 30 21:27:03 compute-1 systemd[229713]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:27:03 compute-1 systemd[229713]: Removed slice User Application Slice.
Sep 30 21:27:03 compute-1 systemd[229713]: Reached target Shutdown.
Sep 30 21:27:03 compute-1 systemd[229713]: Finished Exit the Session.
Sep 30 21:27:03 compute-1 systemd[229713]: Reached target Exit the Session.
Sep 30 21:27:03 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:27:03 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:27:03 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:27:03 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:27:03 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:27:03 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:27:03 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:27:03 compute-1 nova_compute[192795]: 2025-09-30 21:27:03.934 2 DEBUG nova.compute.manager [req-15f56b74-5582-4aef-9008-e6639376bf9e req-fc65643b-8f71-4af9-8c7d-efeff9bb0743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:03 compute-1 nova_compute[192795]: 2025-09-30 21:27:03.934 2 DEBUG oslo_concurrency.lockutils [req-15f56b74-5582-4aef-9008-e6639376bf9e req-fc65643b-8f71-4af9-8c7d-efeff9bb0743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:03 compute-1 nova_compute[192795]: 2025-09-30 21:27:03.935 2 DEBUG oslo_concurrency.lockutils [req-15f56b74-5582-4aef-9008-e6639376bf9e req-fc65643b-8f71-4af9-8c7d-efeff9bb0743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:03 compute-1 nova_compute[192795]: 2025-09-30 21:27:03.935 2 DEBUG oslo_concurrency.lockutils [req-15f56b74-5582-4aef-9008-e6639376bf9e req-fc65643b-8f71-4af9-8c7d-efeff9bb0743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:03 compute-1 nova_compute[192795]: 2025-09-30 21:27:03.935 2 DEBUG nova.compute.manager [req-15f56b74-5582-4aef-9008-e6639376bf9e req-fc65643b-8f71-4af9-8c7d-efeff9bb0743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] No waiting events found dispatching network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:03 compute-1 nova_compute[192795]: 2025-09-30 21:27:03.935 2 WARNING nova.compute.manager [req-15f56b74-5582-4aef-9008-e6639376bf9e req-fc65643b-8f71-4af9-8c7d-efeff9bb0743 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received unexpected event network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 for instance with vm_state active and task_state None.
Sep 30 21:27:04 compute-1 nova_compute[192795]: 2025-09-30 21:27:04.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:05 compute-1 podman[229897]: 2025-09-30 21:27:05.230589858 +0000 UTC m=+0.064162765 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:27:05 compute-1 podman[229895]: 2025-09-30 21:27:05.239939217 +0000 UTC m=+0.082775321 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:27:05 compute-1 podman[229896]: 2025-09-30 21:27:05.269184228 +0000 UTC m=+0.109082764 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 21:27:05 compute-1 nova_compute[192795]: 2025-09-30 21:27:05.754 2 DEBUG nova.compute.manager [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:05 compute-1 nova_compute[192795]: 2025-09-30 21:27:05.755 2 DEBUG oslo_concurrency.lockutils [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:05 compute-1 nova_compute[192795]: 2025-09-30 21:27:05.755 2 DEBUG oslo_concurrency.lockutils [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:05 compute-1 nova_compute[192795]: 2025-09-30 21:27:05.755 2 DEBUG oslo_concurrency.lockutils [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:05 compute-1 nova_compute[192795]: 2025-09-30 21:27:05.756 2 DEBUG nova.compute.manager [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:05 compute-1 nova_compute[192795]: 2025-09-30 21:27:05.756 2 WARNING nova.compute.manager [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:27:06 compute-1 sshd-session[229971]: Accepted publickey for nova from 192.168.122.100 port 35994 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:27:06 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:27:06 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:27:06 compute-1 systemd-logind[793]: New session 45 of user nova.
Sep 30 21:27:06 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:27:06 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:27:06 compute-1 systemd[229975]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:27:06 compute-1 systemd[229975]: Queued start job for default target Main User Target.
Sep 30 21:27:06 compute-1 systemd[229975]: Created slice User Application Slice.
Sep 30 21:27:06 compute-1 systemd[229975]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:27:06 compute-1 systemd[229975]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:27:06 compute-1 systemd[229975]: Reached target Paths.
Sep 30 21:27:06 compute-1 systemd[229975]: Reached target Timers.
Sep 30 21:27:06 compute-1 systemd[229975]: Starting D-Bus User Message Bus Socket...
Sep 30 21:27:06 compute-1 systemd[229975]: Starting Create User's Volatile Files and Directories...
Sep 30 21:27:06 compute-1 systemd[229975]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:27:06 compute-1 systemd[229975]: Reached target Sockets.
Sep 30 21:27:06 compute-1 systemd[229975]: Finished Create User's Volatile Files and Directories.
Sep 30 21:27:06 compute-1 systemd[229975]: Reached target Basic System.
Sep 30 21:27:06 compute-1 systemd[229975]: Reached target Main User Target.
Sep 30 21:27:06 compute-1 systemd[229975]: Startup finished in 143ms.
Sep 30 21:27:06 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:27:06 compute-1 systemd[1]: Started Session 45 of User nova.
Sep 30 21:27:06 compute-1 sshd-session[229971]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:27:07 compute-1 sshd-session[229990]: Received disconnect from 192.168.122.100 port 35994:11: disconnected by user
Sep 30 21:27:07 compute-1 sshd-session[229990]: Disconnected from user nova 192.168.122.100 port 35994
Sep 30 21:27:07 compute-1 sshd-session[229971]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:27:07 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Sep 30 21:27:07 compute-1 systemd-logind[793]: Session 45 logged out. Waiting for processes to exit.
Sep 30 21:27:07 compute-1 systemd-logind[793]: Removed session 45.
Sep 30 21:27:07 compute-1 ovn_controller[94902]: 2025-09-30T21:27:07Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:07 compute-1 sshd-session[229992]: Accepted publickey for nova from 192.168.122.100 port 42830 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:27:07 compute-1 systemd-logind[793]: New session 47 of user nova.
Sep 30 21:27:07 compute-1 systemd[1]: Started Session 47 of User nova.
Sep 30 21:27:07 compute-1 sshd-session[229992]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:27:07 compute-1 sshd-session[229995]: Received disconnect from 192.168.122.100 port 42830:11: disconnected by user
Sep 30 21:27:07 compute-1 sshd-session[229995]: Disconnected from user nova 192.168.122.100 port 42830
Sep 30 21:27:07 compute-1 sshd-session[229992]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:27:07 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Sep 30 21:27:07 compute-1 systemd-logind[793]: Session 47 logged out. Waiting for processes to exit.
Sep 30 21:27:07 compute-1 systemd-logind[793]: Removed session 47.
Sep 30 21:27:07 compute-1 sshd-session[229997]: Accepted publickey for nova from 192.168.122.100 port 42838 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:27:07 compute-1 systemd-logind[793]: New session 48 of user nova.
Sep 30 21:27:07 compute-1 systemd[1]: Started Session 48 of User nova.
Sep 30 21:27:07 compute-1 sshd-session[229997]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:27:07 compute-1 sshd-session[230000]: Received disconnect from 192.168.122.100 port 42838:11: disconnected by user
Sep 30 21:27:07 compute-1 sshd-session[230000]: Disconnected from user nova 192.168.122.100 port 42838
Sep 30 21:27:07 compute-1 sshd-session[229997]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:27:07 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Sep 30 21:27:07 compute-1 systemd-logind[793]: Session 48 logged out. Waiting for processes to exit.
Sep 30 21:27:07 compute-1 systemd-logind[793]: Removed session 48.
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.851 2 DEBUG nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.852 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.853 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.853 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.853 2 DEBUG nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.853 2 WARNING nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.853 2 DEBUG nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-changed-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.854 2 DEBUG nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Refreshing instance network info cache due to event network-changed-4467e22a-a4e7-4951-b98a-c78e9f926681. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.854 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.854 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:07 compute-1 nova_compute[192795]: 2025-09-30 21:27:07.854 2 DEBUG nova.network.neutron [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Refreshing network info cache for port 4467e22a-a4e7-4951-b98a-c78e9f926681 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:27:08 compute-1 nova_compute[192795]: 2025-09-30 21:27:08.180 2 INFO nova.network.neutron [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating port dd77e849-d522-43c2-94ac-03e8228ad770 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:27:09 compute-1 podman[230002]: 2025-09-30 21:27:09.241459666 +0000 UTC m=+0.082334169 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.342 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.342 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.343 2 DEBUG nova.network.neutron [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.555 2 DEBUG nova.network.neutron [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updated VIF entry in instance network info cache for port 4467e22a-a4e7-4951-b98a-c78e9f926681. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.556 2 DEBUG nova.network.neutron [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updating instance_info_cache with network_info: [{"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.572 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.938 2 DEBUG nova.compute.manager [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-changed-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.939 2 DEBUG nova.compute.manager [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Refreshing instance network info cache due to event network-changed-dd77e849-d522-43c2-94ac-03e8228ad770. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:27:09 compute-1 nova_compute[192795]: 2025-09-30 21:27:09.939 2 DEBUG oslo_concurrency.lockutils [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.635 2 DEBUG nova.network.neutron [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.663 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.668 2 DEBUG oslo_concurrency.lockutils [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.669 2 DEBUG nova.network.neutron [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Refreshing network info cache for port dd77e849-d522-43c2-94ac-03e8228ad770 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.808 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.810 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.810 2 INFO nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Creating image(s)
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.811 2 DEBUG nova.objects.instance [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.828 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.900 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.904 2 DEBUG nova.virt.disk.api [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Checking if we can resize image /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.904 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.970 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.971 2 DEBUG nova.virt.disk.api [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Cannot resize image /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.987 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.988 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Ensure instance console log exists: /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.989 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.989 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.990 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.992 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Start _get_guest_xml network_info=[{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-250418448-network", "vif_mac": "fa:16:3e:7b:d4:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:27:10 compute-1 nova_compute[192795]: 2025-09-30 21:27:10.998 2 WARNING nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.004 2 DEBUG nova.virt.libvirt.host [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.005 2 DEBUG nova.virt.libvirt.host [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.008 2 DEBUG nova.virt.libvirt.host [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.009 2 DEBUG nova.virt.libvirt.host [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.010 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.010 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.011 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.011 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.012 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.012 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.012 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.013 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.013 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.013 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.014 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.014 2 DEBUG nova.virt.hardware [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.014 2 DEBUG nova.objects.instance [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.036 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.119 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.121 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.121 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.123 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.124 2 DEBUG nova.virt.libvirt.vif [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1431604073',display_name='tempest-DeleteServersTestJSON-server-1431604073',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1431604073',id=61,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:26:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-fu0pukiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:27:07Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=3391dcee-6677-46e1-bda2-82f72ebee7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-250418448-network", "vif_mac": "fa:16:3e:7b:d4:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.125 2 DEBUG nova.network.os_vif_util [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-250418448-network", "vif_mac": "fa:16:3e:7b:d4:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.127 2 DEBUG nova.network.os_vif_util [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.129 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <uuid>3391dcee-6677-46e1-bda2-82f72ebee7f2</uuid>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <name>instance-0000003d</name>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <nova:name>tempest-DeleteServersTestJSON-server-1431604073</nova:name>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:27:10</nova:creationTime>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:user uuid="bfe43dba9d03417182dd245d360568e6">tempest-DeleteServersTestJSON-314554874-project-member</nova:user>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:project uuid="c4bb94b19ac546f195f1f1f35411cce9">tempest-DeleteServersTestJSON-314554874</nova:project>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         <nova:port uuid="dd77e849-d522-43c2-94ac-03e8228ad770">
Sep 30 21:27:11 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <system>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <entry name="serial">3391dcee-6677-46e1-bda2-82f72ebee7f2</entry>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <entry name="uuid">3391dcee-6677-46e1-bda2-82f72ebee7f2</entry>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </system>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <os>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   </os>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <features>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   </features>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:7b:d4:2d"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <target dev="tapdd77e849-d5"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/console.log" append="off"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <video>
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </video>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:27:11 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:27:11 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:27:11 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:27:11 compute-1 nova_compute[192795]: </domain>
Sep 30 21:27:11 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.137 2 DEBUG nova.virt.libvirt.vif [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1431604073',display_name='tempest-DeleteServersTestJSON-server-1431604073',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1431604073',id=61,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:26:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-fu0pukiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:27:07Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=3391dcee-6677-46e1-bda2-82f72ebee7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-250418448-network", "vif_mac": "fa:16:3e:7b:d4:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.137 2 DEBUG nova.network.os_vif_util [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-250418448-network", "vif_mac": "fa:16:3e:7b:d4:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.138 2 DEBUG nova.network.os_vif_util [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.139 2 DEBUG os_vif [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.140 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.141 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.145 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd77e849-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.146 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd77e849-d5, col_values=(('external_ids', {'iface-id': 'dd77e849-d522-43c2-94ac-03e8228ad770', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:d4:2d', 'vm-uuid': '3391dcee-6677-46e1-bda2-82f72ebee7f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 NetworkManager[51724]: <info>  [1759267631.1487] manager: (tapdd77e849-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.171 2 INFO os_vif [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5')
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.240 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.241 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.241 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No VIF found with MAC fa:16:3e:7b:d4:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.242 2 INFO nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Using config drive
Sep 30 21:27:11 compute-1 kernel: tapdd77e849-d5: entered promiscuous mode
Sep 30 21:27:11 compute-1 NetworkManager[51724]: <info>  [1759267631.3161] manager: (tapdd77e849-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 ovn_controller[94902]: 2025-09-30T21:27:11Z|00243|binding|INFO|Claiming lport dd77e849-d522-43c2-94ac-03e8228ad770 for this chassis.
Sep 30 21:27:11 compute-1 ovn_controller[94902]: 2025-09-30T21:27:11Z|00244|binding|INFO|dd77e849-d522-43c2-94ac-03e8228ad770: Claiming fa:16:3e:7b:d4:2d 10.100.0.7
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.336 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:d4:2d 10.100.0.7'], port_security=['fa:16:3e:7b:d4:2d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3391dcee-6677-46e1-bda2-82f72ebee7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=dd77e849-d522-43c2-94ac-03e8228ad770) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.337 103861 INFO neutron.agent.ovn.metadata.agent [-] Port dd77e849-d522-43c2-94ac-03e8228ad770 in datapath 5569112a-9fb3-4151-add0-95b595cbe309 bound to our chassis
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.340 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:27:11 compute-1 ovn_controller[94902]: 2025-09-30T21:27:11Z|00245|binding|INFO|Setting lport dd77e849-d522-43c2-94ac-03e8228ad770 ovn-installed in OVS
Sep 30 21:27:11 compute-1 ovn_controller[94902]: 2025-09-30T21:27:11Z|00246|binding|INFO|Setting lport dd77e849-d522-43c2-94ac-03e8228ad770 up in Southbound
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.356 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1d6e47-497e-4e19-9bfa-d600127f7271]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.357 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5569112a-91 in ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.359 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5569112a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.360 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f1aaa2d9-fb85-4d80-a7eb-137b35db439e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.360 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[244f0046-aae5-4934-ac9e-ef28ad29633f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 systemd-udevd[230048]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:27:11 compute-1 NetworkManager[51724]: <info>  [1759267631.3805] device (tapdd77e849-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:27:11 compute-1 systemd-machined[152783]: New machine qemu-32-instance-0000003d.
Sep 30 21:27:11 compute-1 NetworkManager[51724]: <info>  [1759267631.3814] device (tapdd77e849-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.383 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[698890c3-fabe-4e69-8089-e5ebb9f98a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 systemd[1]: Started Virtual Machine qemu-32-instance-0000003d.
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.400 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[00f49d06-1410-47c6-9ea3-a0b606d10fcd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.432 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c97496f4-10b2-4942-bf67-ea18376ded36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 NetworkManager[51724]: <info>  [1759267631.4398] manager: (tap5569112a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.439 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[771097c3-0eca-4723-882f-4fb53077a5c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 systemd-udevd[230052]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.476 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ce2a335e-ee16-4be4-954d-9418bba9f785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.479 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[024b8297-82a6-4365-b440-26a0783e2f2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 NetworkManager[51724]: <info>  [1759267631.5022] device (tap5569112a-90): carrier: link connected
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.507 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7b24be-b020-4d15-a57a-3a267658ea53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.524 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[12d03503-ff73-405d-8989-324ce42c7b51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432717, 'reachable_time': 39710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230081, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.539 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c30c28-919b-4dd6-81a5-5677a10df042]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432717, 'tstamp': 432717}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230082, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.557 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[da470f9a-af75-4d19-8f1a-88269ca197a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432717, 'reachable_time': 39710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230083, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.588 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8e94f06e-081f-4837-a147-9ab10291a992]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.651 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c072b6-ad03-4c8d-bbe8-18d6725a3b7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.653 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.654 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.654 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5569112a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 kernel: tap5569112a-90: entered promiscuous mode
Sep 30 21:27:11 compute-1 NetworkManager[51724]: <info>  [1759267631.6569] manager: (tap5569112a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.665 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5569112a-90, col_values=(('external_ids', {'iface-id': 'af49dc17-c7c9-4524-8791-14107f2ff34d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 ovn_controller[94902]: 2025-09-30T21:27:11Z|00247|binding|INFO|Releasing lport af49dc17-c7c9-4524-8791-14107f2ff34d from this chassis (sb_readonly=0)
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.669 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.670 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[131d07cb-519f-40c9-a4fc-e5af01f8b81d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.670 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:27:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:11.671 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'env', 'PROCESS_TAG=haproxy-5569112a-9fb3-4151-add0-95b595cbe309', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5569112a-9fb3-4151-add0-95b595cbe309.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:27:11 compute-1 nova_compute[192795]: 2025-09-30 21:27:11.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.061 2 DEBUG nova.compute.manager [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.063 2 DEBUG oslo_concurrency.lockutils [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.063 2 DEBUG oslo_concurrency.lockutils [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.063 2 DEBUG oslo_concurrency.lockutils [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.064 2 DEBUG nova.compute.manager [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.064 2 WARNING nova.compute.manager [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state resize_finish.
Sep 30 21:27:12 compute-1 podman[230123]: 2025-09-30 21:27:12.116952123 +0000 UTC m=+0.067330969 container create 526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:27:12 compute-1 podman[230123]: 2025-09-30 21:27:12.080487179 +0000 UTC m=+0.030866025 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:27:12 compute-1 systemd[1]: Started libpod-conmon-526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783.scope.
Sep 30 21:27:12 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:27:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/915aaef63f8f127cbbd30047e3c050dc3179706a64abc000396700ea813c14b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.226 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267632.226176, 3391dcee-6677-46e1-bda2-82f72ebee7f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.227 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] VM Resumed (Lifecycle Event)
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.229 2 DEBUG nova.compute.manager [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:12 compute-1 podman[230123]: 2025-09-30 21:27:12.234594705 +0000 UTC m=+0.184973551 container init 526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.237 2 INFO nova.virt.libvirt.driver [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance running successfully.
Sep 30 21:27:12 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.240 2 DEBUG nova.virt.libvirt.guest [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.241 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:27:12 compute-1 podman[230123]: 2025-09-30 21:27:12.242830405 +0000 UTC m=+0.193209231 container start 526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.245 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.249 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:12 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [NOTICE]   (230143) : New worker (230145) forked
Sep 30 21:27:12 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [NOTICE]   (230143) : Loading success.
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.284 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.287 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267632.2264261, 3391dcee-6677-46e1-bda2-82f72ebee7f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.288 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] VM Started (Lifecycle Event)
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.315 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.320 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.344 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.356 2 DEBUG nova.network.neutron [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updated VIF entry in instance network info cache for port dd77e849-d522-43c2-94ac-03e8228ad770. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.357 2 DEBUG nova.network.neutron [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:12 compute-1 nova_compute[192795]: 2025-09-30 21:27:12.370 2 DEBUG oslo_concurrency.lockutils [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:14 compute-1 nova_compute[192795]: 2025-09-30 21:27:14.225 2 DEBUG nova.compute.manager [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:14 compute-1 nova_compute[192795]: 2025-09-30 21:27:14.226 2 DEBUG oslo_concurrency.lockutils [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:14 compute-1 nova_compute[192795]: 2025-09-30 21:27:14.227 2 DEBUG oslo_concurrency.lockutils [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:14 compute-1 nova_compute[192795]: 2025-09-30 21:27:14.228 2 DEBUG oslo_concurrency.lockutils [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:14 compute-1 nova_compute[192795]: 2025-09-30 21:27:14.228 2 DEBUG nova.compute.manager [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:14 compute-1 nova_compute[192795]: 2025-09-30 21:27:14.228 2 WARNING nova.compute.manager [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state resized and task_state None.
Sep 30 21:27:15 compute-1 ovn_controller[94902]: 2025-09-30T21:27:15Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:51:a5:e1 10.100.0.7
Sep 30 21:27:15 compute-1 ovn_controller[94902]: 2025-09-30T21:27:15Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:51:a5:e1 10.100.0.7
Sep 30 21:27:16 compute-1 nova_compute[192795]: 2025-09-30 21:27:16.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:16 compute-1 nova_compute[192795]: 2025-09-30 21:27:16.326 2 DEBUG nova.compute.manager [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-changed-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:16 compute-1 nova_compute[192795]: 2025-09-30 21:27:16.327 2 DEBUG nova.compute.manager [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Refreshing instance network info cache due to event network-changed-4467e22a-a4e7-4951-b98a-c78e9f926681. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:27:16 compute-1 nova_compute[192795]: 2025-09-30 21:27:16.328 2 DEBUG oslo_concurrency.lockutils [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:16 compute-1 nova_compute[192795]: 2025-09-30 21:27:16.328 2 DEBUG oslo_concurrency.lockutils [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:16 compute-1 nova_compute[192795]: 2025-09-30 21:27:16.328 2 DEBUG nova.network.neutron [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Refreshing network info cache for port 4467e22a-a4e7-4951-b98a-c78e9f926681 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.179 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.180 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.181 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.181 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.181 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.197 2 INFO nova.compute.manager [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Terminating instance
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.210 2 DEBUG nova.compute.manager [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:27:17 compute-1 kernel: tapdd77e849-d5 (unregistering): left promiscuous mode
Sep 30 21:27:17 compute-1 NetworkManager[51724]: <info>  [1759267637.2368] device (tapdd77e849-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:27:17 compute-1 podman[230177]: 2025-09-30 21:27:17.243878799 +0000 UTC m=+0.066672802 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:27:17 compute-1 ovn_controller[94902]: 2025-09-30T21:27:17Z|00248|binding|INFO|Releasing lport dd77e849-d522-43c2-94ac-03e8228ad770 from this chassis (sb_readonly=0)
Sep 30 21:27:17 compute-1 ovn_controller[94902]: 2025-09-30T21:27:17Z|00249|binding|INFO|Setting lport dd77e849-d522-43c2-94ac-03e8228ad770 down in Southbound
Sep 30 21:27:17 compute-1 ovn_controller[94902]: 2025-09-30T21:27:17Z|00250|binding|INFO|Removing iface tapdd77e849-d5 ovn-installed in OVS
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.284 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:d4:2d 10.100.0.7'], port_security=['fa:16:3e:7b:d4:2d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3391dcee-6677-46e1-bda2-82f72ebee7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=dd77e849-d522-43c2-94ac-03e8228ad770) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.285 103861 INFO neutron.agent.ovn.metadata.agent [-] Port dd77e849-d522-43c2-94ac-03e8228ad770 in datapath 5569112a-9fb3-4151-add0-95b595cbe309 unbound from our chassis
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.287 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5569112a-9fb3-4151-add0-95b595cbe309, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.289 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1e71f2c3-70c3-4518-a7b9-99ffecfbc890]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.290 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace which is not needed anymore
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-1 podman[230175]: 2025-09-30 21:27:17.30757557 +0000 UTC m=+0.137650247 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:27:17 compute-1 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Sep 30 21:27:17 compute-1 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000003d.scope: Consumed 5.782s CPU time.
Sep 30 21:27:17 compute-1 podman[230176]: 2025-09-30 21:27:17.31695008 +0000 UTC m=+0.137836273 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:27:17 compute-1 systemd-machined[152783]: Machine qemu-32-instance-0000003d terminated.
Sep 30 21:27:17 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [NOTICE]   (230143) : haproxy version is 2.8.14-c23fe91
Sep 30 21:27:17 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [NOTICE]   (230143) : path to executable is /usr/sbin/haproxy
Sep 30 21:27:17 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [WARNING]  (230143) : Exiting Master process...
Sep 30 21:27:17 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [WARNING]  (230143) : Exiting Master process...
Sep 30 21:27:17 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [ALERT]    (230143) : Current worker (230145) exited with code 143 (Terminated)
Sep 30 21:27:17 compute-1 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[230139]: [WARNING]  (230143) : All workers exited. Exiting... (0)
Sep 30 21:27:17 compute-1 systemd[1]: libpod-526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783.scope: Deactivated successfully.
Sep 30 21:27:17 compute-1 podman[230261]: 2025-09-30 21:27:17.44164387 +0000 UTC m=+0.054075585 container died 526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.485 2 INFO nova.virt.libvirt.driver [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance destroyed successfully.
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.486 2 DEBUG nova.objects.instance [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'resources' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:17 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783-userdata-shm.mount: Deactivated successfully.
Sep 30 21:27:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-915aaef63f8f127cbbd30047e3c050dc3179706a64abc000396700ea813c14b0-merged.mount: Deactivated successfully.
Sep 30 21:27:17 compute-1 podman[230261]: 2025-09-30 21:27:17.499260629 +0000 UTC m=+0.111692304 container cleanup 526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.502 2 DEBUG nova.virt.libvirt.vif [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1431604073',display_name='tempest-DeleteServersTestJSON-server-1431604073',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1431604073',id=61,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:27:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-fu0pukiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:27:12Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=3391dcee-6677-46e1-bda2-82f72ebee7f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.503 2 DEBUG nova.network.os_vif_util [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.503 2 DEBUG nova.network.os_vif_util [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.504 2 DEBUG os_vif [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.506 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd77e849-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.512 2 INFO os_vif [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5')
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.513 2 INFO nova.virt.libvirt.driver [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Deleting instance files /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_del
Sep 30 21:27:17 compute-1 systemd[1]: libpod-conmon-526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783.scope: Deactivated successfully.
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.518 2 INFO nova.virt.libvirt.driver [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Deletion of /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_del complete
Sep 30 21:27:17 compute-1 podman[230308]: 2025-09-30 21:27:17.574259172 +0000 UTC m=+0.045731502 container remove 526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.580 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[30a550e6-f996-421d-9f5e-2d7b749eee1c]: (4, ('Tue Sep 30 09:27:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783)\n526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783\nTue Sep 30 09:27:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783)\n526c6cc9e459c245308c6c7b376f7a20a639f4243844a3db0b667656384b9783\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.583 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0e71626d-225b-4a76-ae69-01fc91d049f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.584 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-1 kernel: tap5569112a-90: left promiscuous mode
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.607 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[11b62716-6c59-4ae0-85c1-681b554150ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.617 2 INFO nova.compute.manager [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.618 2 DEBUG oslo.service.loopingcall [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.618 2 DEBUG nova.compute.manager [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.619 2 DEBUG nova.network.neutron [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:27:17 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:27:17 compute-1 systemd[229975]: Activating special unit Exit the Session...
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped target Main User Target.
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped target Basic System.
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped target Paths.
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped target Sockets.
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped target Timers.
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:27:17 compute-1 systemd[229975]: Closed D-Bus User Message Bus Socket.
Sep 30 21:27:17 compute-1 systemd[229975]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:27:17 compute-1 systemd[229975]: Removed slice User Application Slice.
Sep 30 21:27:17 compute-1 systemd[229975]: Reached target Shutdown.
Sep 30 21:27:17 compute-1 systemd[229975]: Finished Exit the Session.
Sep 30 21:27:17 compute-1 systemd[229975]: Reached target Exit the Session.
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.641 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[37016e64-7310-4ec6-b55e-f339fd786132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.645 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2724ec-2a89-41e9-b769-622ab57adfbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:27:17 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.664 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3c27ee-9179-4300-90a5-2cc5e46de231]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432710, 'reachable_time': 16809, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230323, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.667 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:27:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:17.667 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[6c75405c-10cc-42a5-a25f-6f155836855d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:17 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:27:17 compute-1 systemd[1]: run-netns-ovnmeta\x2d5569112a\x2d9fb3\x2d4151\x2dadd0\x2d95b595cbe309.mount: Deactivated successfully.
Sep 30 21:27:17 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:27:17 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:27:17 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:27:17 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.773 2 DEBUG nova.network.neutron [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updated VIF entry in instance network info cache for port 4467e22a-a4e7-4951-b98a-c78e9f926681. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.773 2 DEBUG nova.network.neutron [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updating instance_info_cache with network_info: [{"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:17 compute-1 nova_compute[192795]: 2025-09-30 21:27:17.789 2 DEBUG oslo_concurrency.lockutils [req-064d1572-f3d3-48f4-95fd-3266bf434093 req-f62201af-3ade-45f1-a42c-51230d5fe320 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-fd0159fd-1f19-45df-a72d-25de1e287dcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.423 2 DEBUG nova.compute.manager [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.424 2 DEBUG oslo_concurrency.lockutils [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.424 2 DEBUG oslo_concurrency.lockutils [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.424 2 DEBUG oslo_concurrency.lockutils [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.424 2 DEBUG nova.compute.manager [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.424 2 WARNING nova.compute.manager [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state None.
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.424 2 DEBUG nova.compute.manager [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.425 2 DEBUG oslo_concurrency.lockutils [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.425 2 DEBUG oslo_concurrency.lockutils [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.425 2 DEBUG oslo_concurrency.lockutils [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.425 2 DEBUG nova.compute.manager [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.425 2 WARNING nova.compute.manager [req-abb2a1e7-e930-48ab-bd10-39d6f3f87a93 req-7d9839a9-9cd9-4a29-bf3c-3a7447010f47 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state None.
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.884 2 DEBUG nova.network.neutron [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:18 compute-1 nova_compute[192795]: 2025-09-30 21:27:18.904 2 INFO nova.compute.manager [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Took 1.29 seconds to deallocate network for instance.
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.025 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.026 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.032 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.075 2 DEBUG oslo_concurrency.lockutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.076 2 DEBUG oslo_concurrency.lockutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.076 2 INFO nova.compute.manager [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Rebooting instance
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.080 2 INFO nova.scheduler.client.report [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Deleted allocations for instance 3391dcee-6677-46e1-bda2-82f72ebee7f2
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.105 2 DEBUG oslo_concurrency.lockutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.105 2 DEBUG oslo_concurrency.lockutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.106 2 DEBUG nova.network.neutron [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.203 2 DEBUG nova.compute.manager [req-b110ab15-c8d2-4d5e-844b-4df9bd057855 req-4323ab81-43d0-46c8-84da-b9d5c4c23ade dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-deleted-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.204 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.377 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.378 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.378 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.379 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.379 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.396 2 INFO nova.compute.manager [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Terminating instance
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.413 2 DEBUG nova.compute.manager [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:27:19 compute-1 kernel: tap4467e22a-a4 (unregistering): left promiscuous mode
Sep 30 21:27:19 compute-1 NetworkManager[51724]: <info>  [1759267639.4495] device (tap4467e22a-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:27:19 compute-1 ovn_controller[94902]: 2025-09-30T21:27:19Z|00251|binding|INFO|Releasing lport 4467e22a-a4e7-4951-b98a-c78e9f926681 from this chassis (sb_readonly=0)
Sep 30 21:27:19 compute-1 ovn_controller[94902]: 2025-09-30T21:27:19Z|00252|binding|INFO|Setting lport 4467e22a-a4e7-4951-b98a-c78e9f926681 down in Southbound
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 ovn_controller[94902]: 2025-09-30T21:27:19Z|00253|binding|INFO|Removing iface tap4467e22a-a4 ovn-installed in OVS
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.470 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:a5:e1 10.100.0.7'], port_security=['fa:16:3e:51:a5:e1 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fd0159fd-1f19-45df-a72d-25de1e287dcd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1b9086019024e4cbbc8aee7f8972fd1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '466b3098-7b1d-4e9f-af8d-2137ea8ba1ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78f5a320-60ed-4426-bcd7-1ccda3e9f5c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=4467e22a-a4e7-4951-b98a-c78e9f926681) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.472 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 4467e22a-a4e7-4951-b98a-c78e9f926681 in datapath 093b4e0f-6b01-42f3-8ea3-902bf0bf0397 unbound from our chassis
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.473 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 093b4e0f-6b01-42f3-8ea3-902bf0bf0397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.475 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[695083b7-7752-4521-82fb-12bfa48fa3b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.475 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397 namespace which is not needed anymore
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Sep 30 21:27:19 compute-1 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003f.scope: Consumed 13.304s CPU time.
Sep 30 21:27:19 compute-1 systemd-machined[152783]: Machine qemu-31-instance-0000003f terminated.
Sep 30 21:27:19 compute-1 neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397[229879]: [NOTICE]   (229883) : haproxy version is 2.8.14-c23fe91
Sep 30 21:27:19 compute-1 neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397[229879]: [NOTICE]   (229883) : path to executable is /usr/sbin/haproxy
Sep 30 21:27:19 compute-1 neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397[229879]: [WARNING]  (229883) : Exiting Master process...
Sep 30 21:27:19 compute-1 neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397[229879]: [ALERT]    (229883) : Current worker (229885) exited with code 143 (Terminated)
Sep 30 21:27:19 compute-1 neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397[229879]: [WARNING]  (229883) : All workers exited. Exiting... (0)
Sep 30 21:27:19 compute-1 systemd[1]: libpod-75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82.scope: Deactivated successfully.
Sep 30 21:27:19 compute-1 podman[230346]: 2025-09-30 21:27:19.635383899 +0000 UTC m=+0.056053208 container died 75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82-userdata-shm.mount: Deactivated successfully.
Sep 30 21:27:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-d0117203bc0cb1b50d19bb8c0e8772dc08711ebd613d2d25c89be58a9b18f6ea-merged.mount: Deactivated successfully.
Sep 30 21:27:19 compute-1 podman[230346]: 2025-09-30 21:27:19.691838926 +0000 UTC m=+0.112508215 container cleanup 75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.705 2 INFO nova.virt.libvirt.driver [-] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Instance destroyed successfully.
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.707 2 DEBUG nova.objects.instance [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lazy-loading 'resources' on Instance uuid fd0159fd-1f19-45df-a72d-25de1e287dcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:19 compute-1 systemd[1]: libpod-conmon-75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82.scope: Deactivated successfully.
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.731 2 DEBUG nova.virt.libvirt.vif [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1949663081',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1949663081',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-194966308',id=63,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:27:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b1b9086019024e4cbbc8aee7f8972fd1',ramdisk_id='',reservation_id='r-74wrhydr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='
virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-862544542',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-862544542-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:27:02Z,user_data=None,user_id='aa01576fa62e4e208b7362e64674479f',uuid=fd0159fd-1f19-45df-a72d-25de1e287dcd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.732 2 DEBUG nova.network.os_vif_util [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Converting VIF {"id": "4467e22a-a4e7-4951-b98a-c78e9f926681", "address": "fa:16:3e:51:a5:e1", "network": {"id": "093b4e0f-6b01-42f3-8ea3-902bf0bf0397", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1458049522-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1b9086019024e4cbbc8aee7f8972fd1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4467e22a-a4", "ovs_interfaceid": "4467e22a-a4e7-4951-b98a-c78e9f926681", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.733 2 DEBUG nova.network.os_vif_util [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:51:a5:e1,bridge_name='br-int',has_traffic_filtering=True,id=4467e22a-a4e7-4951-b98a-c78e9f926681,network=Network(093b4e0f-6b01-42f3-8ea3-902bf0bf0397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4467e22a-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.734 2 DEBUG os_vif [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:a5:e1,bridge_name='br-int',has_traffic_filtering=True,id=4467e22a-a4e7-4951-b98a-c78e9f926681,network=Network(093b4e0f-6b01-42f3-8ea3-902bf0bf0397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4467e22a-a4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.737 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4467e22a-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.748 2 INFO os_vif [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:a5:e1,bridge_name='br-int',has_traffic_filtering=True,id=4467e22a-a4e7-4951-b98a-c78e9f926681,network=Network(093b4e0f-6b01-42f3-8ea3-902bf0bf0397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4467e22a-a4')
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.749 2 INFO nova.virt.libvirt.driver [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Deleting instance files /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd_del
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.750 2 INFO nova.virt.libvirt.driver [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Deletion of /var/lib/nova/instances/fd0159fd-1f19-45df-a72d-25de1e287dcd_del complete
Sep 30 21:27:19 compute-1 podman[230391]: 2025-09-30 21:27:19.785546519 +0000 UTC m=+0.057391724 container remove 75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.793 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e643c247-37b5-470c-b86f-c6ccf4678bb1]: (4, ('Tue Sep 30 09:27:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397 (75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82)\n75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82\nTue Sep 30 09:27:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397 (75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82)\n75b0491b188c55703d4c61ab1bd3a96fcfc232ec1087a67f24497a0629d3fc82\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.796 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d360e4f9-6001-4117-9bd3-b377f6d9a14c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.797 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap093b4e0f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 kernel: tap093b4e0f-60: left promiscuous mode
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.825 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[834901f0-856d-4f16-b986-7f502d64b965]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.864 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e7195f-a7b0-4732-a739-eee9d4cde2cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.866 2 INFO nova.compute.manager [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Took 0.45 seconds to destroy the instance on the hypervisor.
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.867 2 DEBUG oslo.service.loopingcall [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.866 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cacda925-aa4d-4849-a934-187a350f4ad9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.867 2 DEBUG nova.compute.manager [-] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:27:19 compute-1 nova_compute[192795]: 2025-09-30 21:27:19.867 2 DEBUG nova.network.neutron [-] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.885 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d88ad662-f8db-4cde-a260-3240891f6d6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431660, 'reachable_time': 22144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230406, 'error': None, 'target': 'ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.889 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-093b4e0f-6b01-42f3-8ea3-902bf0bf0397 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:27:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:19.889 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[14776590-3f33-4717-8d06-30a861e9f973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:19 compute-1 systemd[1]: run-netns-ovnmeta\x2d093b4e0f\x2d6b01\x2d42f3\x2d8ea3\x2d902bf0bf0397.mount: Deactivated successfully.
Sep 30 21:27:20 compute-1 nova_compute[192795]: 2025-09-30 21:27:20.920 2 DEBUG nova.network.neutron [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:20 compute-1 nova_compute[192795]: 2025-09-30 21:27:20.937 2 DEBUG oslo_concurrency.lockutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:20 compute-1 nova_compute[192795]: 2025-09-30 21:27:20.957 2 DEBUG nova.compute.manager [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:21 compute-1 kernel: tap242fb53f-7c (unregistering): left promiscuous mode
Sep 30 21:27:21 compute-1 NetworkManager[51724]: <info>  [1759267641.1590] device (tap242fb53f-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 ovn_controller[94902]: 2025-09-30T21:27:21Z|00254|binding|INFO|Releasing lport 242fb53f-7c71-48ef-a180-00bad1488d61 from this chassis (sb_readonly=0)
Sep 30 21:27:21 compute-1 ovn_controller[94902]: 2025-09-30T21:27:21Z|00255|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 down in Southbound
Sep 30 21:27:21 compute-1 ovn_controller[94902]: 2025-09-30T21:27:21Z|00256|binding|INFO|Removing iface tap242fb53f-7c ovn-installed in OVS
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.194 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.199 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.202 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.203 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[23a5bf9c-b28b-409d-b535-f00a86f8e050]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.204 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:27:21 compute-1 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000038.scope: Deactivated successfully.
Sep 30 21:27:21 compute-1 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000038.scope: Consumed 13.963s CPU time.
Sep 30 21:27:21 compute-1 systemd-machined[152783]: Machine qemu-30-instance-00000038 terminated.
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.336 2 DEBUG nova.compute.manager [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-vif-unplugged-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.337 2 DEBUG oslo_concurrency.lockutils [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.337 2 DEBUG oslo_concurrency.lockutils [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.337 2 DEBUG oslo_concurrency.lockutils [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.337 2 DEBUG nova.compute.manager [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] No waiting events found dispatching network-vif-unplugged-4467e22a-a4e7-4951-b98a-c78e9f926681 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.337 2 DEBUG nova.compute.manager [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-vif-unplugged-4467e22a-a4e7-4951-b98a-c78e9f926681 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.338 2 DEBUG nova.compute.manager [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.338 2 DEBUG oslo_concurrency.lockutils [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.338 2 DEBUG oslo_concurrency.lockutils [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.338 2 DEBUG oslo_concurrency.lockutils [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.338 2 DEBUG nova.compute.manager [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] No waiting events found dispatching network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.339 2 WARNING nova.compute.manager [req-c0082b94-cc17-496f-83f6-75ba726ca5ec req-4746549c-ae71-4d17-b37e-31526fe28b56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received unexpected event network-vif-plugged-4467e22a-a4e7-4951-b98a-c78e9f926681 for instance with vm_state active and task_state deleting.
Sep 30 21:27:21 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229693]: [NOTICE]   (229697) : haproxy version is 2.8.14-c23fe91
Sep 30 21:27:21 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229693]: [NOTICE]   (229697) : path to executable is /usr/sbin/haproxy
Sep 30 21:27:21 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229693]: [WARNING]  (229697) : Exiting Master process...
Sep 30 21:27:21 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229693]: [ALERT]    (229697) : Current worker (229699) exited with code 143 (Terminated)
Sep 30 21:27:21 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229693]: [WARNING]  (229697) : All workers exited. Exiting... (0)
Sep 30 21:27:21 compute-1 systemd[1]: libpod-bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c.scope: Deactivated successfully.
Sep 30 21:27:21 compute-1 podman[230431]: 2025-09-30 21:27:21.397239693 +0000 UTC m=+0.075881967 container died bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.398 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance destroyed successfully.
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.399 2 DEBUG nova.objects.instance [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.425 2 DEBUG nova.virt.libvirt.vif [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:27:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.425 2 DEBUG nova.network.os_vif_util [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.431 2 DEBUG nova.network.os_vif_util [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.432 2 DEBUG os_vif [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:27:21 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap242fb53f-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:21 compute-1 systemd[1]: var-lib-containers-storage-overlay-b84c451430759ebb6038e1ecf9f91950ef23915947248e08708a3996611fc4f8-merged.mount: Deactivated successfully.
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 podman[230431]: 2025-09-30 21:27:21.442850112 +0000 UTC m=+0.121492386 container cleanup bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.444 2 INFO os_vif [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.455 2 DEBUG nova.virt.libvirt.driver [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start _get_guest_xml network_info=[{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:27:21 compute-1 systemd[1]: libpod-conmon-bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c.scope: Deactivated successfully.
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.467 2 WARNING nova.virt.libvirt.driver [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.473 2 DEBUG nova.virt.libvirt.host [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.474 2 DEBUG nova.virt.libvirt.host [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.478 2 DEBUG nova.virt.libvirt.host [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.478 2 DEBUG nova.virt.libvirt.host [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.480 2 DEBUG nova.virt.libvirt.driver [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.480 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.480 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.481 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.481 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.481 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.481 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.482 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.482 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.482 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.482 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.483 2 DEBUG nova.virt.hardware [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.483 2 DEBUG nova.objects.instance [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.487 2 DEBUG nova.network.neutron [-] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:21 compute-1 podman[230477]: 2025-09-30 21:27:21.509853552 +0000 UTC m=+0.041097230 container remove bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.513 2 DEBUG nova.virt.libvirt.vif [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:27:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.514 2 DEBUG nova.network.os_vif_util [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.514 2 DEBUG nova.network.os_vif_util [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.516 2 DEBUG nova.objects.instance [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.518 2 INFO nova.compute.manager [-] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Took 1.65 seconds to deallocate network for instance.
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.519 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dc14d555-f6e6-4da5-92d2-8578af8a5369]: (4, ('Tue Sep 30 09:27:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c)\nbbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c\nTue Sep 30 09:27:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (bbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c)\nbbda3c8373b80302788ddfd53fbeaea0e48f5abc79b78debb1cbead2078d518c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.522 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e724ae0e-c788-49c3-b01f-15181b0325fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.524 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.549 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb5eebf-4b21-417c-8aec-8fe1d939f69f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.552 2 DEBUG nova.virt.libvirt.driver [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <uuid>128bd4be-4a76-4dbb-aef6-65acd9c11cbd</uuid>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <name>instance-00000038</name>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestJSON-server-394601736</nova:name>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:27:21</nova:creationTime>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         <nova:port uuid="242fb53f-7c71-48ef-a180-00bad1488d61">
Sep 30 21:27:21 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <system>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <entry name="serial">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <entry name="uuid">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </system>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <os>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   </os>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <features>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   </features>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:65:e3:f2"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <target dev="tap242fb53f-7c"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/console.log" append="off"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <video>
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </video>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:27:21 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:27:21 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:27:21 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:27:21 compute-1 nova_compute[192795]: </domain>
Sep 30 21:27:21 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.553 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.580 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1b687d4c-b5a5-4a62-8365-76768e74830d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.582 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[14d444cb-a78b-4f60-91f0-5795c2211173]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.596 2 DEBUG nova.compute.manager [req-d819ba53-5957-4cc6-ae76-6161436ba8d6 req-086802ee-627e-4349-9c6e-1879008e29f5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Received event network-vif-deleted-4467e22a-a4e7-4951-b98a-c78e9f926681 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.603 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[74e912ed-2207-4eef-b716-35b092f74099]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430582, 'reachable_time': 33636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230494, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.606 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.606 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[824c9974-e54f-4db4-aad8-dd35717b1170]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.619 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.620 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.645 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.646 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.687 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.689 2 DEBUG nova.objects.instance [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.707 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.756 2 DEBUG nova.compute.provider_tree [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.769 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.770 2 DEBUG nova.virt.disk.api [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.770 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.793 2 DEBUG nova.scheduler.client.report [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.822 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.831 2 DEBUG oslo_concurrency.processutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.834 2 DEBUG nova.virt.disk.api [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.834 2 DEBUG nova.objects.instance [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.849 2 INFO nova.scheduler.client.report [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Deleted allocations for instance fd0159fd-1f19-45df-a72d-25de1e287dcd
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.852 2 DEBUG nova.virt.libvirt.vif [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:27:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.853 2 DEBUG nova.network.os_vif_util [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.853 2 DEBUG nova.network.os_vif_util [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.854 2 DEBUG os_vif [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.856 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap242fb53f-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap242fb53f-7c, col_values=(('external_ids', {'iface-id': '242fb53f-7c71-48ef-a180-00bad1488d61', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:e3:f2', 'vm-uuid': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 NetworkManager[51724]: <info>  [1759267641.8621] manager: (tap242fb53f-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.868 2 INFO os_vif [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.916 2 DEBUG oslo_concurrency.lockutils [None req-3d01fb72-4ca1-4e7a-9902-48e918ace765 aa01576fa62e4e208b7362e64674479f b1b9086019024e4cbbc8aee7f8972fd1 - - default default] Lock "fd0159fd-1f19-45df-a72d-25de1e287dcd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:21 compute-1 kernel: tap242fb53f-7c: entered promiscuous mode
Sep 30 21:27:21 compute-1 NetworkManager[51724]: <info>  [1759267641.9505] manager: (tap242fb53f-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Sep 30 21:27:21 compute-1 systemd-udevd[230411]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 ovn_controller[94902]: 2025-09-30T21:27:21Z|00257|binding|INFO|Claiming lport 242fb53f-7c71-48ef-a180-00bad1488d61 for this chassis.
Sep 30 21:27:21 compute-1 ovn_controller[94902]: 2025-09-30T21:27:21Z|00258|binding|INFO|242fb53f-7c71-48ef-a180-00bad1488d61: Claiming fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.960 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '9', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.961 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.963 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:27:21 compute-1 ovn_controller[94902]: 2025-09-30T21:27:21Z|00259|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 ovn-installed in OVS
Sep 30 21:27:21 compute-1 ovn_controller[94902]: 2025-09-30T21:27:21Z|00260|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 up in Southbound
Sep 30 21:27:21 compute-1 NetworkManager[51724]: <info>  [1759267641.9700] device (tap242fb53f-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:27:21 compute-1 NetworkManager[51724]: <info>  [1759267641.9713] device (tap242fb53f-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 nova_compute[192795]: 2025-09-30 21:27:21.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.976 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d0c15b-78c0-4e98-a172-18aa665b2de9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.977 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.979 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.979 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[57a63e1e-f44d-4efb-9159-26fea6655243]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.980 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[679f345c-5bbc-4ce1-8652-de4e01fb02ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:21.995 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[d00a2eb6-171f-4e01-b42c-c0e96445eddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 systemd-machined[152783]: New machine qemu-33-instance-00000038.
Sep 30 21:27:22 compute-1 systemd[1]: Started Virtual Machine qemu-33-instance-00000038.
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.023 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9d2815-c8c1-4d41-8e69-442c4dbc4fb1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.061 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[819884da-731f-4b05-b118-4ffee6c7f5d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.068 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[37325bce-6cf0-4e68-adf3-4cdf63c9370e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 NetworkManager[51724]: <info>  [1759267642.0706] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.107 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0843f8a5-9688-4234-b5f3-d42d5c0aef8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.112 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd8c934-3b1b-4041-a838-363aae1eb3ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 NetworkManager[51724]: <info>  [1759267642.1440] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.149 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b15eec15-9429-4206-920e-7bb93db4d030]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.172 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6f133b0e-b8f9-4435-914e-f890a032286e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433781, 'reachable_time': 17334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230553, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.195 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[38ae2349-231c-42f3-afd1-bf29e81a8d69]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433781, 'tstamp': 433781}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230554, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.223 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[594380be-4270-41b4-aa8b-e50d3904fcc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433781, 'reachable_time': 17334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230555, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.272 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[90cc439f-43ad-4cb4-a9d8-85eadce627ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 nova_compute[192795]: 2025-09-30 21:27:22.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.374 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3df6f7a9-9252-4f9d-8fd6-efdc896a2563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.375 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.376 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.376 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:22 compute-1 nova_compute[192795]: 2025-09-30 21:27:22.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:22 compute-1 NetworkManager[51724]: <info>  [1759267642.3794] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Sep 30 21:27:22 compute-1 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.388 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:22 compute-1 nova_compute[192795]: 2025-09-30 21:27:22.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:22 compute-1 ovn_controller[94902]: 2025-09-30T21:27:22Z|00261|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:27:22 compute-1 nova_compute[192795]: 2025-09-30 21:27:22.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:22 compute-1 nova_compute[192795]: 2025-09-30 21:27:22.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.410 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.412 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[47807bd4-219b-43c9-979f-d026e3281073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.413 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:27:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:22.414 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:27:22 compute-1 podman[230591]: 2025-09-30 21:27:22.85701919 +0000 UTC m=+0.057695432 container create c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:27:22 compute-1 systemd[1]: Started libpod-conmon-c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa.scope.
Sep 30 21:27:22 compute-1 podman[230591]: 2025-09-30 21:27:22.825996712 +0000 UTC m=+0.026672984 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:27:22 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:27:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cdad755e7cb883770076668c4f4a8150bdda80301b6556ce0e64fc280ae75c3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:27:22 compute-1 podman[230591]: 2025-09-30 21:27:22.957356079 +0000 UTC m=+0.158032351 container init c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:27:22 compute-1 podman[230591]: 2025-09-30 21:27:22.968122257 +0000 UTC m=+0.168798489 container start c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:27:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [NOTICE]   (230612) : New worker (230614) forked
Sep 30 21:27:22 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [NOTICE]   (230612) : Loading success.
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.310 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 128bd4be-4a76-4dbb-aef6-65acd9c11cbd due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.311 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267643.3099265, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.311 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Resumed (Lifecycle Event)
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.315 2 DEBUG nova.compute.manager [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.320 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance rebooted successfully.
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.320 2 DEBUG nova.compute.manager [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.333 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.339 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.369 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.369 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267643.311201, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.370 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Started (Lifecycle Event)
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.392 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.403 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.414 2 DEBUG oslo_concurrency.lockutils [None req-dd80c684-da3b-44ad-ad8f-cfb469d4fc4e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.466 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.467 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.467 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.468 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.468 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.469 2 WARNING nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.469 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.470 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.470 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.471 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.471 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.471 2 WARNING nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.472 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.472 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.473 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.473 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.473 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.474 2 WARNING nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.474 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.474 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.475 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.475 2 DEBUG oslo_concurrency.lockutils [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.475 2 DEBUG nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:23 compute-1 nova_compute[192795]: 2025-09-30 21:27:23.476 2 WARNING nova.compute.manager [req-a5d1ea5a-56f6-4d88-9e66-2a5e72e64fbd req-d491f0f0-2dce-4d48-8d2f-6a5e0d7afe6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state None.
Sep 30 21:27:26 compute-1 podman[230624]: 2025-09-30 21:27:26.26588417 +0000 UTC m=+0.099680782 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Sep 30 21:27:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:26.291 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:26 compute-1 nova_compute[192795]: 2025-09-30 21:27:26.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:26.293 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:27:26 compute-1 ovn_controller[94902]: 2025-09-30T21:27:26Z|00262|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:27:26 compute-1 nova_compute[192795]: 2025-09-30 21:27:26.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-1 nova_compute[192795]: 2025-09-30 21:27:26.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-1 nova_compute[192795]: 2025-09-30 21:27:27.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:31 compute-1 nova_compute[192795]: 2025-09-30 21:27:31.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:32 compute-1 nova_compute[192795]: 2025-09-30 21:27:32.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:32 compute-1 nova_compute[192795]: 2025-09-30 21:27:32.483 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267637.4831557, 3391dcee-6677-46e1-bda2-82f72ebee7f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:32 compute-1 nova_compute[192795]: 2025-09-30 21:27:32.484 2 INFO nova.compute.manager [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] VM Stopped (Lifecycle Event)
Sep 30 21:27:32 compute-1 nova_compute[192795]: 2025-09-30 21:27:32.503 2 DEBUG nova.compute.manager [None req-f93880d5-101f-4c95-ba48-306cd2b5f2ec - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:33 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:33.297 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:33 compute-1 nova_compute[192795]: 2025-09-30 21:27:33.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:34 compute-1 nova_compute[192795]: 2025-09-30 21:27:34.700 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267639.6991906, fd0159fd-1f19-45df-a72d-25de1e287dcd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:34 compute-1 nova_compute[192795]: 2025-09-30 21:27:34.702 2 INFO nova.compute.manager [-] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] VM Stopped (Lifecycle Event)
Sep 30 21:27:34 compute-1 nova_compute[192795]: 2025-09-30 21:27:34.723 2 DEBUG nova.compute.manager [None req-ab86e841-70c3-4c62-bced-46f0caf5ff7b - - - - - -] [instance: fd0159fd-1f19-45df-a72d-25de1e287dcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:35 compute-1 nova_compute[192795]: 2025-09-30 21:27:35.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:35 compute-1 ovn_controller[94902]: 2025-09-30T21:27:35Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:27:36 compute-1 podman[230656]: 2025-09-30 21:27:36.235874931 +0000 UTC m=+0.071328496 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:27:36 compute-1 podman[230658]: 2025-09-30 21:27:36.243139335 +0000 UTC m=+0.071617624 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:27:36 compute-1 podman[230657]: 2025-09-30 21:27:36.297070196 +0000 UTC m=+0.128211766 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 21:27:36 compute-1 nova_compute[192795]: 2025-09-30 21:27:36.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:37 compute-1 nova_compute[192795]: 2025-09-30 21:27:37.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:38.688 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:38.690 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:27:38.691 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:40 compute-1 podman[230721]: 2025-09-30 21:27:40.232676815 +0000 UTC m=+0.076276309 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:27:40 compute-1 nova_compute[192795]: 2025-09-30 21:27:40.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.738 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.738 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.834 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.894 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.895 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:41 compute-1 nova_compute[192795]: 2025-09-30 21:27:41.950 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.151 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.152 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5552MB free_disk=73.35671615600586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.152 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.153 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.227 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 128bd4be-4a76-4dbb-aef6-65acd9c11cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.228 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.228 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.280 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.300 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.320 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:27:42 compute-1 nova_compute[192795]: 2025-09-30 21:27:42.321 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:43 compute-1 nova_compute[192795]: 2025-09-30 21:27:43.321 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:43 compute-1 nova_compute[192795]: 2025-09-30 21:27:43.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:44 compute-1 nova_compute[192795]: 2025-09-30 21:27:44.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:45 compute-1 nova_compute[192795]: 2025-09-30 21:27:45.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:45 compute-1 nova_compute[192795]: 2025-09-30 21:27:45.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:27:46 compute-1 nova_compute[192795]: 2025-09-30 21:27:46.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:47 compute-1 nova_compute[192795]: 2025-09-30 21:27:47.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:47 compute-1 nova_compute[192795]: 2025-09-30 21:27:47.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:47 compute-1 nova_compute[192795]: 2025-09-30 21:27:47.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:48 compute-1 ovn_controller[94902]: 2025-09-30T21:27:48Z|00263|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:27:48 compute-1 nova_compute[192795]: 2025-09-30 21:27:48.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:48 compute-1 podman[230748]: 2025-09-30 21:27:48.224240236 +0000 UTC m=+0.063228140 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git)
Sep 30 21:27:48 compute-1 podman[230749]: 2025-09-30 21:27:48.239666508 +0000 UTC m=+0.062416048 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:27:48 compute-1 podman[230755]: 2025-09-30 21:27:48.239660268 +0000 UTC m=+0.062491350 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:27:48 compute-1 nova_compute[192795]: 2025-09-30 21:27:48.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:50 compute-1 nova_compute[192795]: 2025-09-30 21:27:50.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:50 compute-1 nova_compute[192795]: 2025-09-30 21:27:50.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:27:50 compute-1 nova_compute[192795]: 2025-09-30 21:27:50.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:27:50 compute-1 nova_compute[192795]: 2025-09-30 21:27:50.928 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:50 compute-1 nova_compute[192795]: 2025-09-30 21:27:50.928 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:50 compute-1 nova_compute[192795]: 2025-09-30 21:27:50.928 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:27:50 compute-1 nova_compute[192795]: 2025-09-30 21:27:50.929 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:51 compute-1 nova_compute[192795]: 2025-09-30 21:27:51.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:51 compute-1 nova_compute[192795]: 2025-09-30 21:27:51.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:52 compute-1 nova_compute[192795]: 2025-09-30 21:27:52.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:52 compute-1 nova_compute[192795]: 2025-09-30 21:27:52.892 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:52 compute-1 nova_compute[192795]: 2025-09-30 21:27:52.913 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:52 compute-1 nova_compute[192795]: 2025-09-30 21:27:52.914 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:27:52 compute-1 nova_compute[192795]: 2025-09-30 21:27:52.914 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:53 compute-1 ovn_controller[94902]: 2025-09-30T21:27:53Z|00264|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:27:53 compute-1 nova_compute[192795]: 2025-09-30 21:27:53.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:56 compute-1 nova_compute[192795]: 2025-09-30 21:27:56.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:57 compute-1 podman[230809]: 2025-09-30 21:27:57.212169148 +0000 UTC m=+0.054417305 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:27:57 compute-1 nova_compute[192795]: 2025-09-30 21:27:57.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:57 compute-1 nova_compute[192795]: 2025-09-30 21:27:57.909 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:57 compute-1 ovn_controller[94902]: 2025-09-30T21:27:57Z|00265|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:27:58 compute-1 nova_compute[192795]: 2025-09-30 21:27:58.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:01 compute-1 nova_compute[192795]: 2025-09-30 21:28:01.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:01 compute-1 nova_compute[192795]: 2025-09-30 21:28:01.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:02 compute-1 nova_compute[192795]: 2025-09-30 21:28:02.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:06 compute-1 nova_compute[192795]: 2025-09-30 21:28:06.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:07 compute-1 podman[230830]: 2025-09-30 21:28:07.221408176 +0000 UTC m=+0.058801962 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, config_id=multipathd)
Sep 30 21:28:07 compute-1 podman[230832]: 2025-09-30 21:28:07.224906709 +0000 UTC m=+0.054650140 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:28:07 compute-1 podman[230831]: 2025-09-30 21:28:07.255160447 +0000 UTC m=+0.088680939 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:28:07 compute-1 nova_compute[192795]: 2025-09-30 21:28:07.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:07 compute-1 nova_compute[192795]: 2025-09-30 21:28:07.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-1 nova_compute[192795]: 2025-09-30 21:28:08.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:11 compute-1 podman[230895]: 2025-09-30 21:28:11.210116773 +0000 UTC m=+0.057647060 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:28:11 compute-1 nova_compute[192795]: 2025-09-30 21:28:11.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:12 compute-1 nova_compute[192795]: 2025-09-30 21:28:12.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:16 compute-1 nova_compute[192795]: 2025-09-30 21:28:16.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:17 compute-1 nova_compute[192795]: 2025-09-30 21:28:17.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:17 compute-1 nova_compute[192795]: 2025-09-30 21:28:17.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:19 compute-1 podman[230916]: 2025-09-30 21:28:19.221142025 +0000 UTC m=+0.058787931 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:28:19 compute-1 podman[230915]: 2025-09-30 21:28:19.221259288 +0000 UTC m=+0.064113584 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Sep 30 21:28:19 compute-1 podman[230917]: 2025-09-30 21:28:19.243742598 +0000 UTC m=+0.066875267 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 21:28:21 compute-1 nova_compute[192795]: 2025-09-30 21:28:21.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.235 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.236 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.249 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.344 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.345 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.352 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.352 2 INFO nova.compute.claims [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.559 2 DEBUG nova.compute.provider_tree [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.577 2 DEBUG nova.scheduler.client.report [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.597 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.598 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:28:22 compute-1 ovn_controller[94902]: 2025-09-30T21:28:22Z|00266|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.671 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.672 2 DEBUG nova.network.neutron [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.692 2 INFO nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.721 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.854 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.856 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.856 2 INFO nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Creating image(s)
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.856 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "/var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.857 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "/var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.857 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "/var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.870 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.927 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.928 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.929 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.940 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.959 2 DEBUG nova.policy [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f015ae90e1c4450086ba1ede16312cd8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '56451656d6c84a6dac27d379e7a887a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.996 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:22 compute-1 nova_compute[192795]: 2025-09-30 21:28:22.996 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.031 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.032 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.033 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.089 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.090 2 DEBUG nova.virt.disk.api [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Checking if we can resize image /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.091 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.146 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.148 2 DEBUG nova.virt.disk.api [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Cannot resize image /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.148 2 DEBUG nova.objects.instance [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lazy-loading 'migration_context' on Instance uuid e136dd3c-da47-4b62-aadc-c45739d7f389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.171 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.172 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Ensure instance console log exists: /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.172 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.173 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:23 compute-1 nova_compute[192795]: 2025-09-30 21:28:23.173 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:24 compute-1 nova_compute[192795]: 2025-09-30 21:28:24.840 2 DEBUG nova.network.neutron [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Successfully created port: 0395c6ca-d794-4975-8eae-64ebee202c5e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.744 2 DEBUG nova.network.neutron [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Successfully updated port: 0395c6ca-d794-4975-8eae-64ebee202c5e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.759 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "refresh_cache-e136dd3c-da47-4b62-aadc-c45739d7f389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.759 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquired lock "refresh_cache-e136dd3c-da47-4b62-aadc-c45739d7f389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.759 2 DEBUG nova.network.neutron [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.892 2 DEBUG nova.compute.manager [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received event network-changed-0395c6ca-d794-4975-8eae-64ebee202c5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.892 2 DEBUG nova.compute.manager [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Refreshing instance network info cache due to event network-changed-0395c6ca-d794-4975-8eae-64ebee202c5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.893 2 DEBUG oslo_concurrency.lockutils [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-e136dd3c-da47-4b62-aadc-c45739d7f389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:26 compute-1 nova_compute[192795]: 2025-09-30 21:28:26.956 2 DEBUG nova.network.neutron [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:28:27 compute-1 nova_compute[192795]: 2025-09-30 21:28:27.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:27.842 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:27.843 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:28:27 compute-1 nova_compute[192795]: 2025-09-30 21:28:27.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-1 podman[230993]: 2025-09-30 21:28:28.215279312 +0000 UTC m=+0.053053658 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.248 2 DEBUG nova.network.neutron [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Updating instance_info_cache with network_info: [{"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.281 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Releasing lock "refresh_cache-e136dd3c-da47-4b62-aadc-c45739d7f389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.282 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Instance network_info: |[{"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.282 2 DEBUG oslo_concurrency.lockutils [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-e136dd3c-da47-4b62-aadc-c45739d7f389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.282 2 DEBUG nova.network.neutron [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Refreshing network info cache for port 0395c6ca-d794-4975-8eae-64ebee202c5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.285 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Start _get_guest_xml network_info=[{"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.290 2 WARNING nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.295 2 DEBUG nova.virt.libvirt.host [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.296 2 DEBUG nova.virt.libvirt.host [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.298 2 DEBUG nova.virt.libvirt.host [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.299 2 DEBUG nova.virt.libvirt.host [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.300 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.300 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.301 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.301 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.301 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.302 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.302 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.302 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.302 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.302 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.303 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.303 2 DEBUG nova.virt.hardware [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.306 2 DEBUG nova.virt.libvirt.vif [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1079223107',display_name='tempest-ImagesNegativeTestJSON-server-1079223107',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1079223107',id=69,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='56451656d6c84a6dac27d379e7a887a3',ramdisk_id='',reservation_id='r-7dsomi2v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1419038098',owner_user_name='tempest-ImagesNegativeTestJSON-1419038098-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:22Z,user_data=None,user_id='f015ae90e1c4450086ba1ede16312cd8',uuid=e136dd3c-da47-4b62-aadc-c45739d7f389,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.307 2 DEBUG nova.network.os_vif_util [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Converting VIF {"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.308 2 DEBUG nova.network.os_vif_util [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:a8:ba,bridge_name='br-int',has_traffic_filtering=True,id=0395c6ca-d794-4975-8eae-64ebee202c5e,network=Network(a4a5f176-fa60-4487-8f3f-18165cdaa575),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0395c6ca-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.309 2 DEBUG nova.objects.instance [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid e136dd3c-da47-4b62-aadc-c45739d7f389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.322 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <uuid>e136dd3c-da47-4b62-aadc-c45739d7f389</uuid>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <name>instance-00000045</name>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <nova:name>tempest-ImagesNegativeTestJSON-server-1079223107</nova:name>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:28:28</nova:creationTime>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:user uuid="f015ae90e1c4450086ba1ede16312cd8">tempest-ImagesNegativeTestJSON-1419038098-project-member</nova:user>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:project uuid="56451656d6c84a6dac27d379e7a887a3">tempest-ImagesNegativeTestJSON-1419038098</nova:project>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         <nova:port uuid="0395c6ca-d794-4975-8eae-64ebee202c5e">
Sep 30 21:28:28 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <system>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <entry name="serial">e136dd3c-da47-4b62-aadc-c45739d7f389</entry>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <entry name="uuid">e136dd3c-da47-4b62-aadc-c45739d7f389</entry>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </system>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <os>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   </os>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <features>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   </features>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk.config"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:50:a8:ba"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <target dev="tap0395c6ca-d7"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/console.log" append="off"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <video>
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </video>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:28:28 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:28:28 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:28:28 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:28:28 compute-1 nova_compute[192795]: </domain>
Sep 30 21:28:28 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.324 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Preparing to wait for external event network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.324 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.325 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.325 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.326 2 DEBUG nova.virt.libvirt.vif [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1079223107',display_name='tempest-ImagesNegativeTestJSON-server-1079223107',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1079223107',id=69,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='56451656d6c84a6dac27d379e7a887a3',ramdisk_id='',reservation_id='r-7dsomi2v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1419038098',owner_user_name='tempest-ImagesNegativeTestJSON-1419038098-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:22Z,user_data=None,user_id='f015ae90e1c4450086ba1ede16312cd8',uuid=e136dd3c-da47-4b62-aadc-c45739d7f389,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.326 2 DEBUG nova.network.os_vif_util [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Converting VIF {"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.327 2 DEBUG nova.network.os_vif_util [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:a8:ba,bridge_name='br-int',has_traffic_filtering=True,id=0395c6ca-d794-4975-8eae-64ebee202c5e,network=Network(a4a5f176-fa60-4487-8f3f-18165cdaa575),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0395c6ca-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.327 2 DEBUG os_vif [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:a8:ba,bridge_name='br-int',has_traffic_filtering=True,id=0395c6ca-d794-4975-8eae-64ebee202c5e,network=Network(a4a5f176-fa60-4487-8f3f-18165cdaa575),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0395c6ca-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.328 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.329 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.332 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0395c6ca-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.332 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0395c6ca-d7, col_values=(('external_ids', {'iface-id': '0395c6ca-d794-4975-8eae-64ebee202c5e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:a8:ba', 'vm-uuid': 'e136dd3c-da47-4b62-aadc-c45739d7f389'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:28 compute-1 NetworkManager[51724]: <info>  [1759267708.3351] manager: (tap0395c6ca-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.344 2 INFO os_vif [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:a8:ba,bridge_name='br-int',has_traffic_filtering=True,id=0395c6ca-d794-4975-8eae-64ebee202c5e,network=Network(a4a5f176-fa60-4487-8f3f-18165cdaa575),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0395c6ca-d7')
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.435 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.436 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.436 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] No VIF found with MAC fa:16:3e:50:a8:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.436 2 INFO nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Using config drive
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.809 2 INFO nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Creating config drive at /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk.config
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.814 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp62kigby7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:28 compute-1 nova_compute[192795]: 2025-09-30 21:28:28.945 2 DEBUG oslo_concurrency.processutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp62kigby7" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:29 compute-1 kernel: tap0395c6ca-d7: entered promiscuous mode
Sep 30 21:28:29 compute-1 NetworkManager[51724]: <info>  [1759267709.0673] manager: (tap0395c6ca-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/134)
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-1 ovn_controller[94902]: 2025-09-30T21:28:29Z|00267|binding|INFO|Claiming lport 0395c6ca-d794-4975-8eae-64ebee202c5e for this chassis.
Sep 30 21:28:29 compute-1 ovn_controller[94902]: 2025-09-30T21:28:29Z|00268|binding|INFO|0395c6ca-d794-4975-8eae-64ebee202c5e: Claiming fa:16:3e:50:a8:ba 10.100.0.9
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.109 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:a8:ba 10.100.0.9'], port_security=['fa:16:3e:50:a8:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e136dd3c-da47-4b62-aadc-c45739d7f389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '56451656d6c84a6dac27d379e7a887a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '84332154-96df-4a84-8541-a1f6b650db97', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acc219e8-d95a-4208-a068-161cb33bfd23, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=0395c6ca-d794-4975-8eae-64ebee202c5e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.111 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 0395c6ca-d794-4975-8eae-64ebee202c5e in datapath a4a5f176-fa60-4487-8f3f-18165cdaa575 bound to our chassis
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.112 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a4a5f176-fa60-4487-8f3f-18165cdaa575
Sep 30 21:28:29 compute-1 ovn_controller[94902]: 2025-09-30T21:28:29Z|00269|binding|INFO|Setting lport 0395c6ca-d794-4975-8eae-64ebee202c5e ovn-installed in OVS
Sep 30 21:28:29 compute-1 ovn_controller[94902]: 2025-09-30T21:28:29Z|00270|binding|INFO|Setting lport 0395c6ca-d794-4975-8eae-64ebee202c5e up in Southbound
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-1 systemd-udevd[231029]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:29 compute-1 NetworkManager[51724]: <info>  [1759267709.1418] device (tap0395c6ca-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:28:29 compute-1 NetworkManager[51724]: <info>  [1759267709.1425] device (tap0395c6ca-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:28:29 compute-1 systemd-machined[152783]: New machine qemu-34-instance-00000045.
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.139 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[450be8e9-33e5-4a26-abe3-7ba48ce5c6a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.145 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa4a5f176-f1 in ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.148 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa4a5f176-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.148 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b4818312-8018-4d94-9d6b-0625746ee195]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.149 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[94916579-f54b-4b7d-8eb9-bc8656537a0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.161 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[ab12bb23-2d4c-43f4-bbcb-9d5771996cde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 systemd[1]: Started Virtual Machine qemu-34-instance-00000045.
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.187 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2b67a614-39aa-41cf-98cb-62c2093def0b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.218 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6224de56-01d4-46f5-81c0-7d947334adf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.225 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[54cb2750-ba7b-48a0-a503-0835cdb1a830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 NetworkManager[51724]: <info>  [1759267709.2264] manager: (tapa4a5f176-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/135)
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.258 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[80e62234-ef35-46b1-9abb-169e8894d568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.261 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5961609e-7502-47c0-97ad-dfc5533b985c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 NetworkManager[51724]: <info>  [1759267709.2817] device (tapa4a5f176-f0): carrier: link connected
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.287 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2bdcd24c-4b8f-4646-8ed3-88eefd9e385c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.306 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[115887a4-9e5f-4076-bdb5-de03c48a3e09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa4a5f176-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:2d:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440495, 'reachable_time': 35084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231065, 'error': None, 'target': 'ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.326 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[58231fd6-e068-478b-96d2-7e9694373626]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:2db1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440495, 'tstamp': 440495}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231066, 'error': None, 'target': 'ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.346 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[852f169a-81c2-49ba-82fa-9b4c21bc5f7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa4a5f176-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:2d:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440495, 'reachable_time': 35084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231067, 'error': None, 'target': 'ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.392 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[862a8d01-08f4-429d-947c-3375601e0035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.460 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[58f27f58-250e-476d-b667-691845d29598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.463 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4a5f176-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.463 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.464 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4a5f176-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-1 NetworkManager[51724]: <info>  [1759267709.4663] manager: (tapa4a5f176-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Sep 30 21:28:29 compute-1 kernel: tapa4a5f176-f0: entered promiscuous mode
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.469 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa4a5f176-f0, col_values=(('external_ids', {'iface-id': '86e582db-4037-4c4e-95cb-d23b717fd846'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:29 compute-1 ovn_controller[94902]: 2025-09-30T21:28:29Z|00271|binding|INFO|Releasing lport 86e582db-4037-4c4e-95cb-d23b717fd846 from this chassis (sb_readonly=0)
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.487 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a4a5f176-fa60-4487-8f3f-18165cdaa575.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a4a5f176-fa60-4487-8f3f-18165cdaa575.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.488 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[47abebce-e172-457e-9e89-376811c2339d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.489 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-a4a5f176-fa60-4487-8f3f-18165cdaa575
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/a4a5f176-fa60-4487-8f3f-18165cdaa575.pid.haproxy
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID a4a5f176-fa60-4487-8f3f-18165cdaa575
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:28:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:29.490 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'env', 'PROCESS_TAG=haproxy-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a4a5f176-fa60-4487-8f3f-18165cdaa575.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.752 2 DEBUG nova.network.neutron [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Updated VIF entry in instance network info cache for port 0395c6ca-d794-4975-8eae-64ebee202c5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.753 2 DEBUG nova.network.neutron [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Updating instance_info_cache with network_info: [{"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.771 2 DEBUG oslo_concurrency.lockutils [req-a73edb04-920b-4d12-b885-f4ffba4e9458 req-ec869dbb-ca1b-45f6-a6a0-0bb44704a5b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-e136dd3c-da47-4b62-aadc-c45739d7f389" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.882 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267709.882306, e136dd3c-da47-4b62-aadc-c45739d7f389 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.883 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] VM Started (Lifecycle Event)
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.900 2 DEBUG nova.compute.manager [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received event network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.901 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.901 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.901 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.902 2 DEBUG nova.compute.manager [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Processing event network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.902 2 DEBUG nova.compute.manager [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received event network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.902 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.902 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.902 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.903 2 DEBUG nova.compute.manager [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] No waiting events found dispatching network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.903 2 WARNING nova.compute.manager [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received unexpected event network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e for instance with vm_state building and task_state spawning.
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.903 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.909 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.913 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.917 2 INFO nova.virt.libvirt.driver [-] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Instance spawned successfully.
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.918 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.922 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.941 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.941 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267709.882426, e136dd3c-da47-4b62-aadc-c45739d7f389 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.942 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] VM Paused (Lifecycle Event)
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.945 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.945 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.946 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.946 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.946 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.947 2 DEBUG nova.virt.libvirt.driver [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.984 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.988 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267709.908547, e136dd3c-da47-4b62-aadc-c45739d7f389 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:29 compute-1 nova_compute[192795]: 2025-09-30 21:28:29.988 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] VM Resumed (Lifecycle Event)
Sep 30 21:28:30 compute-1 podman[231106]: 2025-09-30 21:28:29.911047222 +0000 UTC m=+0.040444701 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:28:30 compute-1 nova_compute[192795]: 2025-09-30 21:28:30.011 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:30 compute-1 nova_compute[192795]: 2025-09-30 21:28:30.016 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:30 compute-1 podman[231106]: 2025-09-30 21:28:30.030864971 +0000 UTC m=+0.160262410 container create 7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:28:30 compute-1 nova_compute[192795]: 2025-09-30 21:28:30.031 2 INFO nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Took 7.18 seconds to spawn the instance on the hypervisor.
Sep 30 21:28:30 compute-1 nova_compute[192795]: 2025-09-30 21:28:30.031 2 DEBUG nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:30 compute-1 nova_compute[192795]: 2025-09-30 21:28:30.041 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:28:30 compute-1 systemd[1]: Started libpod-conmon-7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae.scope.
Sep 30 21:28:30 compute-1 nova_compute[192795]: 2025-09-30 21:28:30.119 2 INFO nova.compute.manager [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Took 7.81 seconds to build instance.
Sep 30 21:28:30 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:28:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be46aeae4efbe7d834391a66b4d68a9c47c1348ddfb5797c1d916353330349d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:28:30 compute-1 nova_compute[192795]: 2025-09-30 21:28:30.140 2 DEBUG oslo_concurrency.lockutils [None req-c8697f70-558b-47db-9564-910ed2b94523 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:30 compute-1 podman[231106]: 2025-09-30 21:28:30.15662669 +0000 UTC m=+0.286024119 container init 7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:28:30 compute-1 podman[231106]: 2025-09-30 21:28:30.1629878 +0000 UTC m=+0.292385229 container start 7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:28:30 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [NOTICE]   (231125) : New worker (231127) forked
Sep 30 21:28:30 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [NOTICE]   (231125) : Loading success.
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.379 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.380 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.380 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.380 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.380 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.397 2 INFO nova.compute.manager [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Terminating instance
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.409 2 DEBUG nova.compute.manager [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:28:31 compute-1 kernel: tap0395c6ca-d7 (unregistering): left promiscuous mode
Sep 30 21:28:31 compute-1 NetworkManager[51724]: <info>  [1759267711.4318] device (tap0395c6ca-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 ovn_controller[94902]: 2025-09-30T21:28:31Z|00272|binding|INFO|Releasing lport 0395c6ca-d794-4975-8eae-64ebee202c5e from this chassis (sb_readonly=0)
Sep 30 21:28:31 compute-1 ovn_controller[94902]: 2025-09-30T21:28:31Z|00273|binding|INFO|Setting lport 0395c6ca-d794-4975-8eae-64ebee202c5e down in Southbound
Sep 30 21:28:31 compute-1 ovn_controller[94902]: 2025-09-30T21:28:31Z|00274|binding|INFO|Removing iface tap0395c6ca-d7 ovn-installed in OVS
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.463 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:a8:ba 10.100.0.9'], port_security=['fa:16:3e:50:a8:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e136dd3c-da47-4b62-aadc-c45739d7f389', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '56451656d6c84a6dac27d379e7a887a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '84332154-96df-4a84-8541-a1f6b650db97', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acc219e8-d95a-4208-a068-161cb33bfd23, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=0395c6ca-d794-4975-8eae-64ebee202c5e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.465 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 0395c6ca-d794-4975-8eae-64ebee202c5e in datapath a4a5f176-fa60-4487-8f3f-18165cdaa575 unbound from our chassis
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.467 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a4a5f176-fa60-4487-8f3f-18165cdaa575, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.468 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dde3a42e-1196-40b9-a4b2-2c1945ef25ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.469 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575 namespace which is not needed anymore
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000045.scope: Deactivated successfully.
Sep 30 21:28:31 compute-1 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000045.scope: Consumed 2.144s CPU time.
Sep 30 21:28:31 compute-1 systemd-machined[152783]: Machine qemu-34-instance-00000045 terminated.
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [NOTICE]   (231125) : haproxy version is 2.8.14-c23fe91
Sep 30 21:28:31 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [NOTICE]   (231125) : path to executable is /usr/sbin/haproxy
Sep 30 21:28:31 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [WARNING]  (231125) : Exiting Master process...
Sep 30 21:28:31 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [WARNING]  (231125) : Exiting Master process...
Sep 30 21:28:31 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [ALERT]    (231125) : Current worker (231127) exited with code 143 (Terminated)
Sep 30 21:28:31 compute-1 neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575[231121]: [WARNING]  (231125) : All workers exited. Exiting... (0)
Sep 30 21:28:31 compute-1 systemd[1]: libpod-7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae.scope: Deactivated successfully.
Sep 30 21:28:31 compute-1 podman[231157]: 2025-09-30 21:28:31.66183963 +0000 UTC m=+0.060810555 container died 7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.692 2 INFO nova.virt.libvirt.driver [-] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Instance destroyed successfully.
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.693 2 DEBUG nova.objects.instance [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lazy-loading 'resources' on Instance uuid e136dd3c-da47-4b62-aadc-c45739d7f389 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:31 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae-userdata-shm.mount: Deactivated successfully.
Sep 30 21:28:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-be46aeae4efbe7d834391a66b4d68a9c47c1348ddfb5797c1d916353330349d7-merged.mount: Deactivated successfully.
Sep 30 21:28:31 compute-1 podman[231157]: 2025-09-30 21:28:31.709306718 +0000 UTC m=+0.108277633 container cleanup 7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:28:31 compute-1 systemd[1]: libpod-conmon-7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae.scope: Deactivated successfully.
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.780 2 DEBUG nova.virt.libvirt.vif [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:28:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1079223107',display_name='tempest-ImagesNegativeTestJSON-server-1079223107',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1079223107',id=69,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='56451656d6c84a6dac27d379e7a887a3',ramdisk_id='',reservation_id='r-7dsomi2v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-1419038098',owner_user_name='tempest-ImagesNegativeTestJSON-1419038098-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:30Z,user_data=None,user_id='f015ae90e1c4450086ba1ede16312cd8',uuid=e136dd3c-da47-4b62-aadc-c45739d7f389,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.780 2 DEBUG nova.network.os_vif_util [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Converting VIF {"id": "0395c6ca-d794-4975-8eae-64ebee202c5e", "address": "fa:16:3e:50:a8:ba", "network": {"id": "a4a5f176-fa60-4487-8f3f-18165cdaa575", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-218276210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "56451656d6c84a6dac27d379e7a887a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0395c6ca-d7", "ovs_interfaceid": "0395c6ca-d794-4975-8eae-64ebee202c5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.781 2 DEBUG nova.network.os_vif_util [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:a8:ba,bridge_name='br-int',has_traffic_filtering=True,id=0395c6ca-d794-4975-8eae-64ebee202c5e,network=Network(a4a5f176-fa60-4487-8f3f-18165cdaa575),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0395c6ca-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.781 2 DEBUG os_vif [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:a8:ba,bridge_name='br-int',has_traffic_filtering=True,id=0395c6ca-d794-4975-8eae-64ebee202c5e,network=Network(a4a5f176-fa60-4487-8f3f-18165cdaa575),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0395c6ca-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.784 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0395c6ca-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 podman[231201]: 2025-09-30 21:28:31.79103043 +0000 UTC m=+0.051500247 container remove 7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.791 2 INFO os_vif [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:a8:ba,bridge_name='br-int',has_traffic_filtering=True,id=0395c6ca-d794-4975-8eae-64ebee202c5e,network=Network(a4a5f176-fa60-4487-8f3f-18165cdaa575),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0395c6ca-d7')
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.791 2 INFO nova.virt.libvirt.driver [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Deleting instance files /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389_del
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.792 2 INFO nova.virt.libvirt.driver [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Deletion of /var/lib/nova/instances/e136dd3c-da47-4b62-aadc-c45739d7f389_del complete
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.798 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b7e7b4-8454-4a4f-b1d3-6c417d86bbf4]: (4, ('Tue Sep 30 09:28:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575 (7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae)\n7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae\nTue Sep 30 09:28:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575 (7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae)\n7298db380fab8deff7a998cb7ccee5b1e4920230da4022f8216c291c3bc6b9ae\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.800 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[34fa8ceb-502b-4903-8f4d-33879d684f28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.801 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4a5f176-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 kernel: tapa4a5f176-f0: left promiscuous mode
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.824 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3bdafb61-56d9-49c8-9555-1640f24b0220]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.863 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e58470cc-ea91-4f04-9917-2a2deb259ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.865 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[41ef81c7-7157-4553-bd0f-0e9e53dd0fde]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.883 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f360972c-789f-447e-945d-cf1661c5dc7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440488, 'reachable_time': 41755, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231219, 'error': None, 'target': 'ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.886 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a4a5f176-fa60-4487-8f3f-18165cdaa575 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:28:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:31.886 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[5e72fe5a-8856-4135-beb1-7db9490fa61d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:31 compute-1 systemd[1]: run-netns-ovnmeta\x2da4a5f176\x2dfa60\x2d4487\x2d8f3f\x2d18165cdaa575.mount: Deactivated successfully.
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.909 2 INFO nova.compute.manager [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Took 0.50 seconds to destroy the instance on the hypervisor.
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.910 2 DEBUG oslo.service.loopingcall [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.910 2 DEBUG nova.compute.manager [-] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:28:31 compute-1 nova_compute[192795]: 2025-09-30 21:28:31.910 2 DEBUG nova.network.neutron [-] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:28:32 compute-1 nova_compute[192795]: 2025-09-30 21:28:32.242 2 DEBUG nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received event network-vif-unplugged-0395c6ca-d794-4975-8eae-64ebee202c5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:32 compute-1 nova_compute[192795]: 2025-09-30 21:28:32.242 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:32 compute-1 nova_compute[192795]: 2025-09-30 21:28:32.242 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:32 compute-1 nova_compute[192795]: 2025-09-30 21:28:32.243 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:32 compute-1 nova_compute[192795]: 2025-09-30 21:28:32.243 2 DEBUG nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] No waiting events found dispatching network-vif-unplugged-0395c6ca-d794-4975-8eae-64ebee202c5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:32 compute-1 nova_compute[192795]: 2025-09-30 21:28:32.243 2 DEBUG nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received event network-vif-unplugged-0395c6ca-d794-4975-8eae-64ebee202c5e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:28:32 compute-1 nova_compute[192795]: 2025-09-30 21:28:32.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:32.845 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.061 2 DEBUG nova.network.neutron [-] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.114 2 INFO nova.compute.manager [-] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Took 1.20 seconds to deallocate network for instance.
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.196 2 DEBUG nova.compute.manager [req-8bcd712c-25c5-411d-a05d-60c800987045 req-224ed905-7a66-4649-a083-c4b623f06d56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received event network-vif-deleted-0395c6ca-d794-4975-8eae-64ebee202c5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.200 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.200 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.275 2 DEBUG nova.compute.provider_tree [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.294 2 DEBUG nova.scheduler.client.report [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.329 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.354 2 INFO nova.scheduler.client.report [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Deleted allocations for instance e136dd3c-da47-4b62-aadc-c45739d7f389
Sep 30 21:28:33 compute-1 nova_compute[192795]: 2025-09-30 21:28:33.456 2 DEBUG oslo_concurrency.lockutils [None req-03662aad-bbf2-4aef-aa9b-78aeac9969f5 f015ae90e1c4450086ba1ede16312cd8 56451656d6c84a6dac27d379e7a887a3 - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:34 compute-1 nova_compute[192795]: 2025-09-30 21:28:34.446 2 DEBUG nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received event network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:34 compute-1 nova_compute[192795]: 2025-09-30 21:28:34.446 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:34 compute-1 nova_compute[192795]: 2025-09-30 21:28:34.446 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:34 compute-1 nova_compute[192795]: 2025-09-30 21:28:34.447 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e136dd3c-da47-4b62-aadc-c45739d7f389-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:34 compute-1 nova_compute[192795]: 2025-09-30 21:28:34.447 2 DEBUG nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] No waiting events found dispatching network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:34 compute-1 nova_compute[192795]: 2025-09-30 21:28:34.447 2 WARNING nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Received unexpected event network-vif-plugged-0395c6ca-d794-4975-8eae-64ebee202c5e for instance with vm_state deleted and task_state None.
Sep 30 21:28:35 compute-1 nova_compute[192795]: 2025-09-30 21:28:35.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:36 compute-1 nova_compute[192795]: 2025-09-30 21:28:36.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:37 compute-1 nova_compute[192795]: 2025-09-30 21:28:37.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:37 compute-1 ovn_controller[94902]: 2025-09-30T21:28:37Z|00275|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:28:37 compute-1 nova_compute[192795]: 2025-09-30 21:28:37.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:38 compute-1 podman[231220]: 2025-09-30 21:28:38.216676511 +0000 UTC m=+0.057657271 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:28:38 compute-1 podman[231222]: 2025-09-30 21:28:38.228608499 +0000 UTC m=+0.061383810 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:28:38 compute-1 podman[231221]: 2025-09-30 21:28:38.263653595 +0000 UTC m=+0.100241178 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:28:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:38.689 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:38.690 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:38.691 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:41 compute-1 nova_compute[192795]: 2025-09-30 21:28:41.713 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:41 compute-1 podman[231287]: 2025-09-30 21:28:41.809926776 +0000 UTC m=+0.064168755 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:28:41 compute-1 nova_compute[192795]: 2025-09-30 21:28:41.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.210 2 DEBUG nova.compute.manager [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.340 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.340 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.361 2 DEBUG nova.objects.instance [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.379 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.380 2 INFO nova.compute.claims [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.380 2 DEBUG nova.objects.instance [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.395 2 DEBUG nova.objects.instance [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.454 2 INFO nova.compute.resource_tracker [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating resource usage from migration eab66b74-3841-4dad-bf4d-d9d24a213f7f
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.537 2 DEBUG nova.compute.provider_tree [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.550 2 DEBUG nova.scheduler.client.report [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.569 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.570 2 INFO nova.compute.manager [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Migrating
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.609 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.610 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:42 compute-1 nova_compute[192795]: 2025-09-30 21:28:42.610 2 DEBUG nova.network.neutron [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.738 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.738 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.739 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.822 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.887 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.888 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:43 compute-1 nova_compute[192795]: 2025-09-30 21:28:43.952 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.017 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'name': 'tempest-ServerActionsTestJSON-server-394601736', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2af578a858a44374a3dc027bbf7c69f2', 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'hostId': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.042 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.requests volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.043 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27b8d4ce-9b81-4599-b45c-786792aa0372', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 48, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.018767', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7111a364-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': 'ba8df8bbb753e042fbd8c737bdce6b1fdf39389a7a9cc263e6e4f347b0ccb897'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.018767', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7111b2a0-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': 'e2536d5aa9bc1d4652f7185898deeed6c3d89c22ef955bc952c43e959b5626c2'}]}, 'timestamp': '2025-09-30 21:28:44.043658', '_unique_id': '2df6d8a53917448d8db5e0d9f98fcbae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.045 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.048 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.packets volume: 43 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46d3724d-c7fa-4557-b806-a8a7b65199b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 43, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.046160', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '71127aa0-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': '4f0fc564a68dda19a8fdde4d33ebc3735820db205a70d81cc7772a563f2a3ffa'}]}, 'timestamp': '2025-09-30 21:28:44.048830', '_unique_id': '6a7cbeab2aa547ed8bdfe43cf550382e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.049 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.050 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.requests volume: 1216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.050 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '355b7a06-f258-4b38-ad89-82ed7859298d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1216, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.050630', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7112cd2a-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '50fe4e177c7890b719b9739045fef61fcc09844e4d1a91a3988404109c554986'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.050630', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7112d59a-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '1052749e56097fcc987a7981d811aeb78caaf6779a540382be7bc98b7948a5aa'}]}, 'timestamp': '2025-09-30 21:28:44.051082', '_unique_id': '0f3bd55dc70a4768adf77a5f4410ad74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.052 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2792fc5-34ff-45e3-93bc-5cb6d213b466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.052411', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '71131758-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': '420499a4f8e90261dd0c7f6d0c6314961d068ae9805a95a3edb1f33d6edc5bc7'}]}, 'timestamp': '2025-09-30 21:28:44.052783', '_unique_id': '8dcd0715c3144334983b833f01bdb30e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.069 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.usage volume: 30408704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.070 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '384c02e9-0c94-4b56-9a63-b5651ed93b35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30408704, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.054179', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7115d4ca-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.786971052, 'message_signature': '44f12dfa2c51180081ef93f62c7daeba7fe05a660c2ab9d400a66f23c748ac6e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.054179', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7115e8de-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.786971052, 'message_signature': '2e295206cc9e0fbd4c18b03029abdaf98c72946cb8ca29fd92d565b3e3702455'}]}, 'timestamp': '2025-09-30 21:28:44.071323', '_unique_id': 'a983b7b41f164efab1ec8d28c1653a7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.072 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.074 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.bytes.delta volume: 4404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e18f0e6d-34f5-4f17-89e9-76d100e498c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 4404, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.074556', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '71167a06-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': '3987d7ad2223e2446adc36262692a58cfbdf9f1331073167e498435da138dd1c'}]}, 'timestamp': '2025-09-30 21:28:44.075095', '_unique_id': 'c22c6cfe2022482ea1e1fb4569c88534'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.076 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.077 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.095 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/memory.usage volume: 42.40625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40643eca-8456-4aff-90c0-1cb7ecef114a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.40625, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'timestamp': '2025-09-30T21:28:44.077362', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '71199e70-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.827653268, 'message_signature': '03dde358ae1660985d138682278792979293393ad2a5ea79c9ad2a00b58b9190'}]}, 'timestamp': '2025-09-30 21:28:44.095740', '_unique_id': 'ddbb04eef88946a986fb563724216976'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.097 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e922224-5ea0-4f01-9f89-897c5bd7f346', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.097915', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '711a055e-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': '748a243359f465fcc542486d6cdf3a4849c63caf7e17241b01c50558ad4ea8ef'}]}, 'timestamp': '2025-09-30 21:28:44.098215', '_unique_id': '234353fbf79b43f48965df8fb68c4ab0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.099 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.099 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/cpu volume: 12350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fd18c40-8188-4e0d-99b8-45da8aa195cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12350000000, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'timestamp': '2025-09-30T21:28:44.099834', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '711a4f0a-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.827653268, 'message_signature': '84f264894c50420f8e9af8c73007e8e0782d9bede0c21ecf37122947bc22182c'}]}, 'timestamp': '2025-09-30 21:28:44.100114', '_unique_id': '0c81ef1454674ba3a2e9256df177756b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.100 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.101 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.packets volume: 43 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '190f550b-1fe5-450e-90ba-75e51c059d9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 43, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.101525', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '711a90fa-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': '50a8ab3345406d1e9e9eee3a5d094bf13b82f5fbef9f2cb82a4fd0ebc56288fb'}]}, 'timestamp': '2025-09-30 21:28:44.101769', '_unique_id': 'e8ae6743be7b4a4bbf7eb93bf583bddc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.103 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.103 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d382929-3e05-4b75-adde-b8facfbe05ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.103176', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711ad20e-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.786971052, 'message_signature': '60dede9551ff16e518cd7d4071cf6fc62623054a82e8b9c8e42b03b3e981e505'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.103176', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711adb32-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.786971052, 'message_signature': '86ccd1bb1622281be4a9e27d8786da30afa2fa24ceb079a068e4c174a87fae41'}]}, 'timestamp': '2025-09-30 21:28:44.103651', '_unique_id': '08dd112a64ef48c0838648106db3b7da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.104 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.bytes volume: 32081920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd58891ee-d80a-4b2d-a689-d17146c87399', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32081920, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.104854', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711b17d2-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '28c63b1e08de236114e0b10d887bc68586142c8e3cca52889917d5ca44799e47'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.104854', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711b20ba-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '229b6c985099a537e548322bff9748aeeca3f4032c2247bd1396bafdab0e66a9'}]}, 'timestamp': '2025-09-30 21:28:44.105461', '_unique_id': '594f907f6de04fcc9b6d5bb8a4b2e29b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b6f61d6-ec07-40b0-a31d-1791cb5883df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.106990', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '711b66a6-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': 'c93c97d8344fafb545f40aafee5ee0dadb5a21973e98b6582ca36e371535b3c5'}]}, 'timestamp': '2025-09-30 21:28:44.107242', '_unique_id': 'ae4d7db43a8b4d7dbbbcaa34a8d78041'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.108 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.108 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d303084-a8a0-4174-8607-e09158a9fc65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.108555', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711ba3d2-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.786971052, 'message_signature': '539f3b857722e4255c32e6150561575eebe9df8ddc8f9f43a7030d98544240b3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.108555', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711bb0fc-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.786971052, 'message_signature': '01bc5f65f258ce16ea7eace358ce2b621b2a844c4229273392d58e59680299e3'}]}, 'timestamp': '2025-09-30 21:28:44.109126', '_unique_id': '31c77026fdee4211a2db06157a5c7bb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.110 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.latency volume: 705592195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.110 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.read.latency volume: 65402834 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3959e8fc-85b7-4b78-a224-0776c1471e28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 705592195, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.110483', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711beef0-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '8b69d5413b74537b9534ab20afa3e520a7ed734903579ce12c5bcacdf3834d12'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 65402834, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': 
None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.110483', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711bf77e-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '03c5d3922de68af77c5986990f4bed966c962a21a081d8ca258f7f0185a6ef96'}]}, 'timestamp': '2025-09-30 21:28:44.110928', '_unique_id': 'c250a6732d5141edb25b28ae5de03fdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.112 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.latency volume: 75925690 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.112 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16da88f9-2ee3-4a2c-a05d-ad54b7261a82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75925690, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.112091', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711c2d20-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': 'd84ed3054a819d89e735e659bfb0567d934741d8091de640ef77254228aea284'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 
'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.112091', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711c36c6-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '2a3e5e2cfcff23804b606608e2fc4d3339279f6a8858d0219e069bd88a482554'}]}, 'timestamp': '2025-09-30 21:28:44.112562', '_unique_id': 'f3037c748f3346df8cf715d16f90697a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.113 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.bytes volume: 7227 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c89f2ed-4b12-4ba9-8298-88ebd9a8083a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7227, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.113781', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '711c6fb0-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': 'f44edf2bb0686bc66a2e273a99ed4bc7b3c95d05d5e6d613547080b7a78b558c'}]}, 'timestamp': '2025-09-30 21:28:44.114130', '_unique_id': '35f1c03da5974f09b934daf8cf52198f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.115 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.outgoing.bytes volume: 5458 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4730aff-5f5e-4799-a417-d3d367843dfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5458, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.115259', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '711caf8e-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': 'f68f994e05242bd92d089133ac6f3a6ec82a0e94e959505e895f6b2f35a6cc11'}]}, 'timestamp': '2025-09-30 21:28:44.115660', '_unique_id': '862b2b6973ec4c67aa6165470059ab80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.116 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.bytes.delta volume: 5940 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93670b32-05c6-4646-ab9e-0ddaca0ea3a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 5940, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.116930', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '711cea80-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': '18dc412e97c6060cc803d82b209996b260db433102c605623b2420b21b92ee8d'}]}, 'timestamp': '2025-09-30 21:28:44.117202', '_unique_id': 'e6fa27384daf4751bd245d192baa9023'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.118 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.bytes volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ceaa53e4-55f8-4c53-b62d-8cfb6b3d4897', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-vda', 'timestamp': '2025-09-30T21:28:44.118826', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711d34d6-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '88d23f2b637845f78d530de3a866c3e2b96f5a55b0197fdfd55099b370bfc754'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd-sda', 'timestamp': '2025-09-30T21:28:44.118826', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'instance-00000038', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711d3d50-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.751536575, 'message_signature': '37b3f937ab50ab08fc0b88de1909c51398827feeeb38e1645c4098fb35908624'}]}, 'timestamp': '2025-09-30 21:28:44.119271', '_unique_id': '9b7c177677bf48c98ffe89c41486b62f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.120 12 DEBUG ceilometer.compute.pollsters [-] 128bd4be-4a76-4dbb-aef6-65acd9c11cbd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1938a4fa-25f6-41e6-98b5-8d6eae0f5d8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_name': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_name': None, 'resource_id': 'instance-00000038-128bd4be-4a76-4dbb-aef6-65acd9c11cbd-tap242fb53f-7c', 'timestamp': '2025-09-30T21:28:44.120671', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-394601736', 'name': 'tap242fb53f-7c', 'instance_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'instance_type': 'm1.nano', 'host': 'e9e3e5e6ca5f01a0cdbb7e39dec3026f9d49da5d3efbdee501c90b17', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:e3:f2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap242fb53f-7c'}, 'message_id': '711d7cde-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4419.778954618, 'message_signature': 'd52928db3385ddfd201f69296b5ba5f52982bc7a434e092f4c5c1af52efd884c'}]}, 'timestamp': '2025-09-30 21:28:44.120915', '_unique_id': '5b5b534512a84551808ee8dd4f93f546'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.121 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:28:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:28:44.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.154 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.156 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5594MB free_disk=73.35670852661133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.156 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.157 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.212 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Applying migration context for instance 128bd4be-4a76-4dbb-aef6-65acd9c11cbd as it has an incoming, in-progress migration eab66b74-3841-4dad-bf4d-d9d24a213f7f. Migration status is pre-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.213 2 INFO nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating resource usage from migration eab66b74-3841-4dad-bf4d-d9d24a213f7f
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.247 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Migration eab66b74-3841-4dad-bf4d-d9d24a213f7f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.247 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 128bd4be-4a76-4dbb-aef6-65acd9c11cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.248 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.248 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.298 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.317 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.344 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:28:44 compute-1 nova_compute[192795]: 2025-09-30 21:28:44.345 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:45 compute-1 nova_compute[192795]: 2025-09-30 21:28:45.347 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:45 compute-1 nova_compute[192795]: 2025-09-30 21:28:45.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:45 compute-1 nova_compute[192795]: 2025-09-30 21:28:45.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.122 2 DEBUG nova.network.neutron [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.185 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.485 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.490 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.691 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267711.689523, e136dd3c-da47-4b62-aadc-c45739d7f389 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.692 2 INFO nova.compute.manager [-] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] VM Stopped (Lifecycle Event)
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.719 2 DEBUG nova.compute.manager [None req-1cb486bf-c1bc-411f-87b7-83a0c03f25f1 - - - - - -] [instance: e136dd3c-da47-4b62-aadc-c45739d7f389] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:46 compute-1 nova_compute[192795]: 2025-09-30 21:28:46.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:47 compute-1 nova_compute[192795]: 2025-09-30 21:28:47.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:47 compute-1 nova_compute[192795]: 2025-09-30 21:28:47.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:47 compute-1 nova_compute[192795]: 2025-09-30 21:28:47.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:48 compute-1 kernel: tap242fb53f-7c (unregistering): left promiscuous mode
Sep 30 21:28:48 compute-1 NetworkManager[51724]: <info>  [1759267728.6485] device (tap242fb53f-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:28:48 compute-1 ovn_controller[94902]: 2025-09-30T21:28:48Z|00276|binding|INFO|Releasing lport 242fb53f-7c71-48ef-a180-00bad1488d61 from this chassis (sb_readonly=0)
Sep 30 21:28:48 compute-1 ovn_controller[94902]: 2025-09-30T21:28:48Z|00277|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 down in Southbound
Sep 30 21:28:48 compute-1 ovn_controller[94902]: 2025-09-30T21:28:48Z|00278|binding|INFO|Removing iface tap242fb53f-7c ovn-installed in OVS
Sep 30 21:28:48 compute-1 nova_compute[192795]: 2025-09-30 21:28:48.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:48.671 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:48.672 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:28:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:48.675 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:28:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:48.676 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6e8ad7-4047-484e-81e9-6513b2e64ff5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:48.677 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:28:48 compute-1 nova_compute[192795]: 2025-09-30 21:28:48.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:48 compute-1 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000038.scope: Deactivated successfully.
Sep 30 21:28:48 compute-1 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000038.scope: Consumed 17.448s CPU time.
Sep 30 21:28:48 compute-1 systemd-machined[152783]: Machine qemu-33-instance-00000038 terminated.
Sep 30 21:28:48 compute-1 nova_compute[192795]: 2025-09-30 21:28:48.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:48 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [NOTICE]   (230612) : haproxy version is 2.8.14-c23fe91
Sep 30 21:28:48 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [NOTICE]   (230612) : path to executable is /usr/sbin/haproxy
Sep 30 21:28:48 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [WARNING]  (230612) : Exiting Master process...
Sep 30 21:28:48 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [WARNING]  (230612) : Exiting Master process...
Sep 30 21:28:48 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [ALERT]    (230612) : Current worker (230614) exited with code 143 (Terminated)
Sep 30 21:28:48 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230608]: [WARNING]  (230612) : All workers exited. Exiting... (0)
Sep 30 21:28:48 compute-1 systemd[1]: libpod-c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa.scope: Deactivated successfully.
Sep 30 21:28:48 compute-1 podman[231336]: 2025-09-30 21:28:48.849055051 +0000 UTC m=+0.063788584 container died c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:48 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa-userdata-shm.mount: Deactivated successfully.
Sep 30 21:28:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-8cdad755e7cb883770076668c4f4a8150bdda80301b6556ce0e64fc280ae75c3-merged.mount: Deactivated successfully.
Sep 30 21:28:48 compute-1 podman[231336]: 2025-09-30 21:28:48.902132379 +0000 UTC m=+0.116865902 container cleanup c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:28:48 compute-1 systemd[1]: libpod-conmon-c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa.scope: Deactivated successfully.
Sep 30 21:28:48 compute-1 podman[231374]: 2025-09-30 21:28:48.992626616 +0000 UTC m=+0.058000051 container remove c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.000 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[52124008-333e-48eb-9b0c-8d28060c430d]: (4, ('Tue Sep 30 09:28:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa)\nc6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa\nTue Sep 30 09:28:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (c6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa)\nc6ece8c1a826bfe7043f229aaff5e63bdcd5d07f620c72e714117b12a107bbfa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.002 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[68f2bbc5-9a2b-409f-b3c0-647406ca30a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.003 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:49 compute-1 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.032 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[58e15f79-84b5-4791-a52c-52b3b21f9b63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.059 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0d378c-7f30-4802-a4a2-c86d5b8b9615]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.061 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[da939a45-0907-45ad-9357-fd456d15b417]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.080 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[764d278b-751a-4f72-b985-b0a606ba751c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433772, 'reachable_time': 27372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231400, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.084 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:28:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:49.084 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[67d735ca-b5d2-4e58-8fff-bd007ee71c2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:49 compute-1 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.513 2 INFO nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance shutdown successfully after 3 seconds.
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.521 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance destroyed successfully.
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.523 2 DEBUG nova.virt.libvirt.vif [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:65:e3:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.523 2 DEBUG nova.network.os_vif_util [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:65:e3:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.525 2 DEBUG nova.network.os_vif_util [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.526 2 DEBUG os_vif [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.529 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap242fb53f-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.542 2 INFO os_vif [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.548 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.607 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.609 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.668 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.670 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_resize/disk /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.692 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "cp -r /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_resize/disk /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.694 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_resize/disk.config /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.733 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "cp -r /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_resize/disk.config /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.735 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_resize/disk.info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:49 compute-1 nova_compute[192795]: 2025-09-30 21:28:49.760 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "cp -r /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_resize/disk.info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.info" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:50 compute-1 nova_compute[192795]: 2025-09-30 21:28:50.204 2 DEBUG nova.network.neutron [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Port 242fb53f-7c71-48ef-a180-00bad1488d61 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Sep 30 21:28:50 compute-1 podman[231411]: 2025-09-30 21:28:50.279235178 +0000 UTC m=+0.103695110 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:28:50 compute-1 podman[231412]: 2025-09-30 21:28:50.311617233 +0000 UTC m=+0.105571261 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 21:28:50 compute-1 podman[231410]: 2025-09-30 21:28:50.320049458 +0000 UTC m=+0.124404414 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-type=git)
Sep 30 21:28:50 compute-1 nova_compute[192795]: 2025-09-30 21:28:50.373 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:50 compute-1 nova_compute[192795]: 2025-09-30 21:28:50.374 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:50 compute-1 nova_compute[192795]: 2025-09-30 21:28:50.374 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:50 compute-1 nova_compute[192795]: 2025-09-30 21:28:50.579 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:50 compute-1 nova_compute[192795]: 2025-09-30 21:28:50.580 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:50 compute-1 nova_compute[192795]: 2025-09-30 21:28:50.580 2 DEBUG nova.network.neutron [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.503 2 DEBUG nova.compute.manager [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.505 2 DEBUG oslo_concurrency.lockutils [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.505 2 DEBUG oslo_concurrency.lockutils [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.506 2 DEBUG oslo_concurrency.lockutils [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.506 2 DEBUG nova.compute.manager [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.507 2 WARNING nova.compute.manager [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.507 2 DEBUG nova.compute.manager [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.508 2 DEBUG oslo_concurrency.lockutils [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.508 2 DEBUG oslo_concurrency.lockutils [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.509 2 DEBUG oslo_concurrency.lockutils [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.510 2 DEBUG nova.compute.manager [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.510 2 WARNING nova.compute.manager [req-6e2e4ac6-623e-4164-8141-d1ae338cd88e req-41a1a8bd-ad3b-4fd2-ae56-865001f71c45 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.898 2 DEBUG nova.network.neutron [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.916 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.920 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.920 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:28:51 compute-1 nova_compute[192795]: 2025-09-30 21:28:51.920 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.049 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.050 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.050 2 INFO nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Creating image(s)
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.052 2 DEBUG nova.objects.instance [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.069 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.139 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.141 2 DEBUG nova.virt.disk.api [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.141 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.205 2 DEBUG oslo_concurrency.processutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.206 2 DEBUG nova.virt.disk.api [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.221 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.222 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Ensure instance console log exists: /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.222 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.223 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.223 2 DEBUG oslo_concurrency.lockutils [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.225 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start _get_guest_xml network_info=[{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:65:e3:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.231 2 WARNING nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.235 2 DEBUG nova.virt.libvirt.host [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.235 2 DEBUG nova.virt.libvirt.host [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.239 2 DEBUG nova.virt.libvirt.host [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.240 2 DEBUG nova.virt.libvirt.host [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.241 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.241 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.242 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.242 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.242 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.242 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.242 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.243 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.243 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.243 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.243 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.243 2 DEBUG nova.virt.hardware [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.244 2 DEBUG nova.objects.instance [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.260 2 DEBUG nova.virt.libvirt.vif [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:65:e3:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.261 2 DEBUG nova.network.os_vif_util [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:65:e3:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.261 2 DEBUG nova.network.os_vif_util [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.265 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <uuid>128bd4be-4a76-4dbb-aef6-65acd9c11cbd</uuid>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <name>instance-00000038</name>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestJSON-server-394601736</nova:name>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:28:52</nova:creationTime>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         <nova:port uuid="242fb53f-7c71-48ef-a180-00bad1488d61">
Sep 30 21:28:52 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <system>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <entry name="serial">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <entry name="uuid">128bd4be-4a76-4dbb-aef6-65acd9c11cbd</entry>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </system>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <os>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   </os>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <features>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   </features>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/disk.config"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:65:e3:f2"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <target dev="tap242fb53f-7c"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd/console.log" append="off"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <video>
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </video>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:28:52 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:28:52 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:28:52 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:28:52 compute-1 nova_compute[192795]: </domain>
Sep 30 21:28:52 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.266 2 DEBUG nova.virt.libvirt.vif [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:65:e3:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.266 2 DEBUG nova.network.os_vif_util [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:65:e3:f2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.267 2 DEBUG nova.network.os_vif_util [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.267 2 DEBUG os_vif [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.269 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.271 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap242fb53f-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap242fb53f-7c, col_values=(('external_ids', {'iface-id': '242fb53f-7c71-48ef-a180-00bad1488d61', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:e3:f2', 'vm-uuid': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 NetworkManager[51724]: <info>  [1759267732.2751] manager: (tap242fb53f-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.281 2 INFO os_vif [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.341 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.342 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.342 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No VIF found with MAC fa:16:3e:65:e3:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.343 2 INFO nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Using config drive
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 kernel: tap242fb53f-7c: entered promiscuous mode
Sep 30 21:28:52 compute-1 NetworkManager[51724]: <info>  [1759267732.4020] manager: (tap242fb53f-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Sep 30 21:28:52 compute-1 ovn_controller[94902]: 2025-09-30T21:28:52Z|00279|binding|INFO|Claiming lport 242fb53f-7c71-48ef-a180-00bad1488d61 for this chassis.
Sep 30 21:28:52 compute-1 ovn_controller[94902]: 2025-09-30T21:28:52Z|00280|binding|INFO|242fb53f-7c71-48ef-a180-00bad1488d61: Claiming fa:16:3e:65:e3:f2 10.100.0.5
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.412 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '11', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.413 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.415 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:52 compute-1 ovn_controller[94902]: 2025-09-30T21:28:52Z|00281|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 ovn-installed in OVS
Sep 30 21:28:52 compute-1 ovn_controller[94902]: 2025-09-30T21:28:52Z|00282|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 up in Southbound
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.429 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6289070a-28bc-4774-93e0-ec0059190956]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.430 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 systemd-udevd[231492]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.434 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.434 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b31e44d9-30c5-4d8f-8f48-eaf2b30a8002]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.436 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a13d5508-b043-4514-b92b-a3353763cc65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 NetworkManager[51724]: <info>  [1759267732.4465] device (tap242fb53f-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:28:52 compute-1 NetworkManager[51724]: <info>  [1759267732.4473] device (tap242fb53f-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.449 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[69ac47cb-73df-4191-8308-c012ad72fb7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 systemd-machined[152783]: New machine qemu-35-instance-00000038.
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.467 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0c73ef-0964-40c0-b394-6fd5aec7e93f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 systemd[1]: Started Virtual Machine qemu-35-instance-00000038.
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.503 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe49331-2c63-451d-aec2-d0b1320dbaff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 NetworkManager[51724]: <info>  [1759267732.5093] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/139)
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.510 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d167c11a-1502-4dee-b338-b24a04850890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 systemd-udevd[231498]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.550 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e6469e5c-d9ba-44dc-899c-e02914db0cd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.555 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[22c2b60c-21f6-4a57-8ecb-2d86ea2d9a33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 NetworkManager[51724]: <info>  [1759267732.5844] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.599 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[439905eb-392a-40c7-8884-9c14111d5b0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.626 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ec1197a0-1df0-4d2d-ab64-710e8d0511b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442825, 'reachable_time': 40579, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231527, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.654 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bca53202-1024-43d8-89bb-10490aa357cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442825, 'tstamp': 442825}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231528, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.683 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f106eeeb-8762-4671-94b5-6ea7ea11ad7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442825, 'reachable_time': 40579, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231529, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.730 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cc48199b-516a-4a11-98cc-51c7b9adb35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.804 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c78866eb-3a6c-4fa3-ac50-f29eb7154927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.806 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.806 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.807 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:52 compute-1 NetworkManager[51724]: <info>  [1759267732.8104] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Sep 30 21:28:52 compute-1 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.812 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:52 compute-1 ovn_controller[94902]: 2025-09-30T21:28:52Z|00283|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 nova_compute[192795]: 2025-09-30 21:28:52.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.879 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.880 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9c443fbb-442e-41e2-b2dc-54d06f940ae2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.881 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:28:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:28:52.883 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:28:53 compute-1 podman[231564]: 2025-09-30 21:28:53.335536263 +0000 UTC m=+0.054484306 container create 813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:28:53 compute-1 systemd[1]: Started libpod-conmon-813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2.scope.
Sep 30 21:28:53 compute-1 podman[231564]: 2025-09-30 21:28:53.310667879 +0000 UTC m=+0.029615942 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:28:53 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:28:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bc0bb6a47bf0b7173b1ac37c8bbfa3babab2eeb4150787c14c55041a65b1f71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:28:53 compute-1 podman[231564]: 2025-09-30 21:28:53.442358056 +0000 UTC m=+0.161306099 container init 813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:53 compute-1 podman[231564]: 2025-09-30 21:28:53.448183681 +0000 UTC m=+0.167131724 container start 813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:53 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [NOTICE]   (231585) : New worker (231587) forked
Sep 30 21:28:53 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [NOTICE]   (231585) : Loading success.
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.652 2 DEBUG nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.652 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.653 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.653 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.653 2 DEBUG nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.653 2 WARNING nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state resize_finish.
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.654 2 DEBUG nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.654 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.654 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.654 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.654 2 DEBUG nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.655 2 WARNING nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state active and task_state resize_finish.
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.832 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 128bd4be-4a76-4dbb-aef6-65acd9c11cbd due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.833 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267733.8314722, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.833 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Resumed (Lifecycle Event)
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.836 2 DEBUG nova.compute.manager [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.838 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.843 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance running successfully.
Sep 30 21:28:53 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.848 2 DEBUG nova.virt.libvirt.guest [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.849 2 DEBUG nova.virt.libvirt.driver [None req-24e30424-830e-4bc2-bb92-119f7d4e0614 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.877 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.878 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.878 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.879 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.885 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.932 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.933 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267733.833381, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.933 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Started (Lifecycle Event)
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.954 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:53 compute-1 nova_compute[192795]: 2025-09-30 21:28:53.958 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:56 compute-1 nova_compute[192795]: 2025-09-30 21:28:56.366 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:56 compute-1 nova_compute[192795]: 2025-09-30 21:28:56.366 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:56 compute-1 nova_compute[192795]: 2025-09-30 21:28:56.367 2 DEBUG nova.compute.manager [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Sep 30 21:28:56 compute-1 nova_compute[192795]: 2025-09-30 21:28:56.419 2 DEBUG nova.objects.instance [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'info_cache' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:56 compute-1 nova_compute[192795]: 2025-09-30 21:28:56.775 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:56 compute-1 nova_compute[192795]: 2025-09-30 21:28:56.776 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:56 compute-1 nova_compute[192795]: 2025-09-30 21:28:56.776 2 DEBUG nova.network.neutron [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:28:57 compute-1 nova_compute[192795]: 2025-09-30 21:28:57.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-1 nova_compute[192795]: 2025-09-30 21:28:57.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.273 2 DEBUG nova.network.neutron [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [{"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.294 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-128bd4be-4a76-4dbb-aef6-65acd9c11cbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.295 2 DEBUG nova.objects.instance [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.308 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.309 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.412 2 DEBUG nova.compute.provider_tree [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.427 2 DEBUG nova.scheduler.client.report [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.479 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.612 2 INFO nova.scheduler.client.report [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Deleted allocation for migration eab66b74-3841-4dad-bf4d-d9d24a213f7f
Sep 30 21:28:58 compute-1 nova_compute[192795]: 2025-09-30 21:28:58.682 2 DEBUG oslo_concurrency.lockutils [None req-ead02578-3361-4b3a-b524-843d162c6140 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:59 compute-1 podman[231598]: 2025-09-30 21:28:59.240131948 +0000 UTC m=+0.069531529 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.829 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.830 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.830 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.830 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.831 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.842 2 INFO nova.compute.manager [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Terminating instance
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.854 2 DEBUG nova.compute.manager [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:29:01 compute-1 kernel: tap242fb53f-7c (unregistering): left promiscuous mode
Sep 30 21:29:01 compute-1 NetworkManager[51724]: <info>  [1759267741.8736] device (tap242fb53f-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:01 compute-1 ovn_controller[94902]: 2025-09-30T21:29:01Z|00284|binding|INFO|Releasing lport 242fb53f-7c71-48ef-a180-00bad1488d61 from this chassis (sb_readonly=0)
Sep 30 21:29:01 compute-1 ovn_controller[94902]: 2025-09-30T21:29:01Z|00285|binding|INFO|Setting lport 242fb53f-7c71-48ef-a180-00bad1488d61 down in Southbound
Sep 30 21:29:01 compute-1 ovn_controller[94902]: 2025-09-30T21:29:01Z|00286|binding|INFO|Removing iface tap242fb53f-7c ovn-installed in OVS
Sep 30 21:29:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:01.892 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:e3:f2 10.100.0.5'], port_security=['fa:16:3e:65:e3:f2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '128bd4be-4a76-4dbb-aef6-65acd9c11cbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '12', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=242fb53f-7c71-48ef-a180-00bad1488d61) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:01.895 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 242fb53f-7c71-48ef-a180-00bad1488d61 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:29:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:01.896 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:29:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:01.897 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0b41ab-9c57-44f7-a2f4-907d42b4a004]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:01.898 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:29:01 compute-1 nova_compute[192795]: 2025-09-30 21:29:01.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:01 compute-1 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000038.scope: Deactivated successfully.
Sep 30 21:29:01 compute-1 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000038.scope: Consumed 9.448s CPU time.
Sep 30 21:29:01 compute-1 systemd-machined[152783]: Machine qemu-35-instance-00000038 terminated.
Sep 30 21:29:02 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [NOTICE]   (231585) : haproxy version is 2.8.14-c23fe91
Sep 30 21:29:02 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [NOTICE]   (231585) : path to executable is /usr/sbin/haproxy
Sep 30 21:29:02 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [WARNING]  (231585) : Exiting Master process...
Sep 30 21:29:02 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [WARNING]  (231585) : Exiting Master process...
Sep 30 21:29:02 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [ALERT]    (231585) : Current worker (231587) exited with code 143 (Terminated)
Sep 30 21:29:02 compute-1 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231581]: [WARNING]  (231585) : All workers exited. Exiting... (0)
Sep 30 21:29:02 compute-1 systemd[1]: libpod-813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2.scope: Deactivated successfully.
Sep 30 21:29:02 compute-1 podman[231643]: 2025-09-30 21:29:02.079980532 +0000 UTC m=+0.083283476 container died 813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:29:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-7bc0bb6a47bf0b7173b1ac37c8bbfa3babab2eeb4150787c14c55041a65b1f71-merged.mount: Deactivated successfully.
Sep 30 21:29:02 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2-userdata-shm.mount: Deactivated successfully.
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.152 2 INFO nova.virt.libvirt.driver [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Instance destroyed successfully.
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.152 2 DEBUG nova.objects.instance [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 128bd4be-4a76-4dbb-aef6-65acd9c11cbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:02 compute-1 podman[231643]: 2025-09-30 21:29:02.168416433 +0000 UTC m=+0.171719327 container cleanup 813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.171 2 DEBUG nova.virt.libvirt.vif [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-394601736',display_name='tempest-ServerActionsTestJSON-server-394601736',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-394601736',id=56,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-39izocna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=128bd4be-4a76-4dbb-aef6-65acd9c11cbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.171 2 DEBUG nova.network.os_vif_util [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "242fb53f-7c71-48ef-a180-00bad1488d61", "address": "fa:16:3e:65:e3:f2", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap242fb53f-7c", "ovs_interfaceid": "242fb53f-7c71-48ef-a180-00bad1488d61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.172 2 DEBUG nova.network.os_vif_util [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.173 2 DEBUG os_vif [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.175 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap242fb53f-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.181 2 INFO os_vif [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:e3:f2,bridge_name='br-int',has_traffic_filtering=True,id=242fb53f-7c71-48ef-a180-00bad1488d61,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap242fb53f-7c')
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.181 2 INFO nova.virt.libvirt.driver [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Deleting instance files /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_del
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.182 2 INFO nova.virt.libvirt.driver [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Deletion of /var/lib/nova/instances/128bd4be-4a76-4dbb-aef6-65acd9c11cbd_del complete
Sep 30 21:29:02 compute-1 systemd[1]: libpod-conmon-813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2.scope: Deactivated successfully.
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.281 2 INFO nova.compute.manager [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.282 2 DEBUG oslo.service.loopingcall [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.283 2 DEBUG nova.compute.manager [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.283 2 DEBUG nova.network.neutron [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:29:02 compute-1 podman[231691]: 2025-09-30 21:29:02.286681684 +0000 UTC m=+0.090585672 container remove 813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.292 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[82ea5a5a-0935-44b1-902c-9e0df5b46207]: (4, ('Tue Sep 30 09:29:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2)\n813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2\nTue Sep 30 09:29:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2)\n813ed31d15a93fe0e3f065b401afac1c31e20f24b5926d10d8bd243aa35b67f2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.295 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f33d18af-f496-4813-a713-6808bc66a552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.296 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:02 compute-1 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.313 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b7552000-2e33-467c-a652-bd43d5eac0e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.343 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[95d07839-8b8c-4a03-8907-fd3c7151643a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.345 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[93dc5f77-a4eb-4100-b614-aed89776c45b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.363 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b72fd87b-fd67-44db-a077-9825c1865c8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442817, 'reachable_time': 21207, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231707, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:02 compute-1 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.372 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:29:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:02.373 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ae4a72-5d31-43b9-80bd-cf5acb644202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.424 2 DEBUG nova.compute.manager [req-d5444461-a1f2-4daa-9597-a7b56b81d843 req-69921f8f-a283-4e33-a4cb-a67fcb78dbaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.425 2 DEBUG oslo_concurrency.lockutils [req-d5444461-a1f2-4daa-9597-a7b56b81d843 req-69921f8f-a283-4e33-a4cb-a67fcb78dbaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.425 2 DEBUG oslo_concurrency.lockutils [req-d5444461-a1f2-4daa-9597-a7b56b81d843 req-69921f8f-a283-4e33-a4cb-a67fcb78dbaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.425 2 DEBUG oslo_concurrency.lockutils [req-d5444461-a1f2-4daa-9597-a7b56b81d843 req-69921f8f-a283-4e33-a4cb-a67fcb78dbaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.426 2 DEBUG nova.compute.manager [req-d5444461-a1f2-4daa-9597-a7b56b81d843 req-69921f8f-a283-4e33-a4cb-a67fcb78dbaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:02 compute-1 nova_compute[192795]: 2025-09-30 21:29:02.426 2 DEBUG nova.compute.manager [req-d5444461-a1f2-4daa-9597-a7b56b81d843 req-69921f8f-a283-4e33-a4cb-a67fcb78dbaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-unplugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.378 2 DEBUG nova.network.neutron [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.407 2 INFO nova.compute.manager [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Took 1.12 seconds to deallocate network for instance.
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.501 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.501 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.510 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.534 2 INFO nova.scheduler.client.report [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Deleted allocations for instance 128bd4be-4a76-4dbb-aef6-65acd9c11cbd
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.647 2 DEBUG oslo_concurrency.lockutils [None req-a21bc58e-3994-440f-96f2-12b57ac51af4 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:03 compute-1 nova_compute[192795]: 2025-09-30 21:29:03.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:04 compute-1 nova_compute[192795]: 2025-09-30 21:29:04.525 2 DEBUG nova.compute.manager [req-2887e4fc-f67e-47fe-9742-447745e40fae req-b4a08860-d8c4-4d63-ac08-bdbc1009c756 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:04 compute-1 nova_compute[192795]: 2025-09-30 21:29:04.526 2 DEBUG oslo_concurrency.lockutils [req-2887e4fc-f67e-47fe-9742-447745e40fae req-b4a08860-d8c4-4d63-ac08-bdbc1009c756 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:04 compute-1 nova_compute[192795]: 2025-09-30 21:29:04.526 2 DEBUG oslo_concurrency.lockutils [req-2887e4fc-f67e-47fe-9742-447745e40fae req-b4a08860-d8c4-4d63-ac08-bdbc1009c756 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:04 compute-1 nova_compute[192795]: 2025-09-30 21:29:04.526 2 DEBUG oslo_concurrency.lockutils [req-2887e4fc-f67e-47fe-9742-447745e40fae req-b4a08860-d8c4-4d63-ac08-bdbc1009c756 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128bd4be-4a76-4dbb-aef6-65acd9c11cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:04 compute-1 nova_compute[192795]: 2025-09-30 21:29:04.527 2 DEBUG nova.compute.manager [req-2887e4fc-f67e-47fe-9742-447745e40fae req-b4a08860-d8c4-4d63-ac08-bdbc1009c756 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] No waiting events found dispatching network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:04 compute-1 nova_compute[192795]: 2025-09-30 21:29:04.527 2 WARNING nova.compute.manager [req-2887e4fc-f67e-47fe-9742-447745e40fae req-b4a08860-d8c4-4d63-ac08-bdbc1009c756 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received unexpected event network-vif-plugged-242fb53f-7c71-48ef-a180-00bad1488d61 for instance with vm_state deleted and task_state None.
Sep 30 21:29:05 compute-1 nova_compute[192795]: 2025-09-30 21:29:05.046 2 DEBUG nova.compute.manager [req-f98b21d8-092e-4c26-a144-4ce4284147c8 req-0d7edf6e-ad21-4403-8621-f8b0857c0cdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Received event network-vif-deleted-242fb53f-7c71-48ef-a180-00bad1488d61 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:07 compute-1 nova_compute[192795]: 2025-09-30 21:29:07.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:07 compute-1 nova_compute[192795]: 2025-09-30 21:29:07.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:09 compute-1 podman[231711]: 2025-09-30 21:29:09.233216291 +0000 UTC m=+0.063878894 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:29:09 compute-1 podman[231709]: 2025-09-30 21:29:09.266225535 +0000 UTC m=+0.099979556 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:29:09 compute-1 podman[231710]: 2025-09-30 21:29:09.287653574 +0000 UTC m=+0.111915843 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:29:12 compute-1 nova_compute[192795]: 2025-09-30 21:29:12.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:12 compute-1 podman[231776]: 2025-09-30 21:29:12.242563961 +0000 UTC m=+0.073140437 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true)
Sep 30 21:29:12 compute-1 nova_compute[192795]: 2025-09-30 21:29:12.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-1 nova_compute[192795]: 2025-09-30 21:29:17.149 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267742.1489756, 128bd4be-4a76-4dbb-aef6-65acd9c11cbd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:17 compute-1 nova_compute[192795]: 2025-09-30 21:29:17.150 2 INFO nova.compute.manager [-] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] VM Stopped (Lifecycle Event)
Sep 30 21:29:17 compute-1 nova_compute[192795]: 2025-09-30 21:29:17.169 2 DEBUG nova.compute.manager [None req-26460358-589f-4c9b-8f0b-c52f04c5a5b8 - - - - - -] [instance: 128bd4be-4a76-4dbb-aef6-65acd9c11cbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:17 compute-1 nova_compute[192795]: 2025-09-30 21:29:17.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-1 nova_compute[192795]: 2025-09-30 21:29:17.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:21 compute-1 podman[231796]: 2025-09-30 21:29:21.231760175 +0000 UTC m=+0.069555362 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Sep 30 21:29:21 compute-1 podman[231797]: 2025-09-30 21:29:21.253261837 +0000 UTC m=+0.083373485 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:29:21 compute-1 podman[231798]: 2025-09-30 21:29:21.280999146 +0000 UTC m=+0.099669135 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:29:22 compute-1 nova_compute[192795]: 2025-09-30 21:29:22.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:22 compute-1 nova_compute[192795]: 2025-09-30 21:29:22.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:27 compute-1 nova_compute[192795]: 2025-09-30 21:29:27.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:27 compute-1 nova_compute[192795]: 2025-09-30 21:29:27.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:30 compute-1 podman[231856]: 2025-09-30 21:29:30.213406186 +0000 UTC m=+0.057799450 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 21:29:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:30.754 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:30 compute-1 nova_compute[192795]: 2025-09-30 21:29:30.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:30.755 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:29:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:30.756 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:32 compute-1 nova_compute[192795]: 2025-09-30 21:29:32.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:32 compute-1 nova_compute[192795]: 2025-09-30 21:29:32.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:37 compute-1 nova_compute[192795]: 2025-09-30 21:29:37.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:37 compute-1 nova_compute[192795]: 2025-09-30 21:29:37.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:38.690 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:38.690 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:38.690 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:40 compute-1 podman[231877]: 2025-09-30 21:29:40.213615709 +0000 UTC m=+0.058543918 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:29:40 compute-1 podman[231879]: 2025-09-30 21:29:40.224206659 +0000 UTC m=+0.059301464 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:29:40 compute-1 podman[231878]: 2025-09-30 21:29:40.348141911 +0000 UTC m=+0.175789235 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Sep 30 21:29:41 compute-1 nova_compute[192795]: 2025-09-30 21:29:41.927 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:41 compute-1 nova_compute[192795]: 2025-09-30 21:29:41.928 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:41 compute-1 nova_compute[192795]: 2025-09-30 21:29:41.946 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.208 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.208 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.215 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.215 2 INFO nova.compute.claims [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.345 2 DEBUG nova.compute.provider_tree [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.375 2 DEBUG nova.scheduler.client.report [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.407 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.408 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.473 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.474 2 DEBUG nova.network.neutron [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.499 2 INFO nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.525 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.657 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.658 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.659 2 INFO nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Creating image(s)
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.659 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "/var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.660 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "/var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.660 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "/var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.676 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.738 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.739 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.740 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.750 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.793 2 DEBUG nova.policy [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'beaafecda25944bc8ae62afa508a5722', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3241d41d3774429b8e779e08d0fea5ab', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.809 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.810 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.849 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.850 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.851 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.930 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.933 2 DEBUG nova.virt.disk.api [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Checking if we can resize image /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:29:42 compute-1 nova_compute[192795]: 2025-09-30 21:29:42.934 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.032 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.034 2 DEBUG nova.virt.disk.api [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Cannot resize image /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.035 2 DEBUG nova.objects.instance [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lazy-loading 'migration_context' on Instance uuid 925ae136-46c0-4920-93f9-36b983cd4f52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.051 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.051 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Ensure instance console log exists: /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.052 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.053 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.053 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:43 compute-1 podman[231963]: 2025-09-30 21:29:43.240006037 +0000 UTC m=+0.080402423 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:29:43 compute-1 nova_compute[192795]: 2025-09-30 21:29:43.968 2 DEBUG nova.network.neutron [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Successfully created port: 3398e381-8092-498a-bc78-691e3883aa7b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.714 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.714 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.715 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.715 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.896 2 DEBUG nova.network.neutron [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Successfully updated port: 3398e381-8092-498a-bc78-691e3883aa7b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.915 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.915 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquired lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:44 compute-1 nova_compute[192795]: 2025-09-30 21:29:44.916 2 DEBUG nova.network.neutron [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.014 2 DEBUG nova.compute.manager [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received event network-changed-3398e381-8092-498a-bc78-691e3883aa7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.014 2 DEBUG nova.compute.manager [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Refreshing instance network info cache due to event network-changed-3398e381-8092-498a-bc78-691e3883aa7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.014 2 DEBUG oslo_concurrency.lockutils [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.021 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.022 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5712MB free_disk=73.38565444946289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.022 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.023 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.096 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 925ae136-46c0-4920-93f9-36b983cd4f52 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.097 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.098 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.131 2 DEBUG nova.network.neutron [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.149 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.173 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.202 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.203 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.203 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.204 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.368 2 DEBUG nova.compute.manager [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.486 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.487 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.512 2 DEBUG nova.objects.instance [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'pci_requests' on Instance uuid 4f678545-d6f3-45f7-8bac-260f6079e85f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.532 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.532 2 INFO nova.compute.claims [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.533 2 DEBUG nova.objects.instance [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'resources' on Instance uuid 4f678545-d6f3-45f7-8bac-260f6079e85f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.552 2 DEBUG nova.objects.instance [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'pci_devices' on Instance uuid 4f678545-d6f3-45f7-8bac-260f6079e85f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.612 2 INFO nova.compute.resource_tracker [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Updating resource usage from migration fd6382ef-859a-4875-b295-74bf13d7f9bc
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.613 2 DEBUG nova.compute.resource_tracker [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Starting to track incoming migration fd6382ef-859a-4875-b295-74bf13d7f9bc with flavor c9779bca-1eb6-4567-a36c-b452abeafc70 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.731 2 DEBUG nova.compute.provider_tree [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.762 2 DEBUG nova.scheduler.client.report [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.791 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:45 compute-1 nova_compute[192795]: 2025-09-30 21:29:45.791 2 INFO nova.compute.manager [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Migrating
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.068 2 DEBUG nova.network.neutron [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Updating instance_info_cache with network_info: [{"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.095 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Releasing lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.096 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Instance network_info: |[{"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.097 2 DEBUG oslo_concurrency.lockutils [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.097 2 DEBUG nova.network.neutron [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Refreshing network info cache for port 3398e381-8092-498a-bc78-691e3883aa7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.100 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Start _get_guest_xml network_info=[{"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.104 2 WARNING nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.108 2 DEBUG nova.virt.libvirt.host [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.109 2 DEBUG nova.virt.libvirt.host [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.113 2 DEBUG nova.virt.libvirt.host [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.113 2 DEBUG nova.virt.libvirt.host [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.114 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.114 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.115 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.115 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.115 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.116 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.116 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.116 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.116 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.117 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.117 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.117 2 DEBUG nova.virt.hardware [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.121 2 DEBUG nova.virt.libvirt.vif [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:29:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-476572418',display_name='tempest-ImagesOneServerNegativeTestJSON-server-476572418',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-476572418',id=77,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3241d41d3774429b8e779e08d0fea5ab',ramdisk_id='',reservation_id='r-w2s0ujkz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1411663101',owner_user_name
='tempest-ImagesOneServerNegativeTestJSON-1411663101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:29:42Z,user_data=None,user_id='beaafecda25944bc8ae62afa508a5722',uuid=925ae136-46c0-4920-93f9-36b983cd4f52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.121 2 DEBUG nova.network.os_vif_util [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Converting VIF {"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.122 2 DEBUG nova.network.os_vif_util [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:12:84,bridge_name='br-int',has_traffic_filtering=True,id=3398e381-8092-498a-bc78-691e3883aa7b,network=Network(11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3398e381-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.123 2 DEBUG nova.objects.instance [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lazy-loading 'pci_devices' on Instance uuid 925ae136-46c0-4920-93f9-36b983cd4f52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.146 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <uuid>925ae136-46c0-4920-93f9-36b983cd4f52</uuid>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <name>instance-0000004d</name>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-476572418</nova:name>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:29:46</nova:creationTime>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:user uuid="beaafecda25944bc8ae62afa508a5722">tempest-ImagesOneServerNegativeTestJSON-1411663101-project-member</nova:user>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:project uuid="3241d41d3774429b8e779e08d0fea5ab">tempest-ImagesOneServerNegativeTestJSON-1411663101</nova:project>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         <nova:port uuid="3398e381-8092-498a-bc78-691e3883aa7b">
Sep 30 21:29:46 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <system>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <entry name="serial">925ae136-46c0-4920-93f9-36b983cd4f52</entry>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <entry name="uuid">925ae136-46c0-4920-93f9-36b983cd4f52</entry>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </system>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <os>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   </os>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <features>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   </features>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk.config"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:85:12:84"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <target dev="tap3398e381-80"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/console.log" append="off"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <video>
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </video>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:29:46 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:29:46 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:29:46 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:29:46 compute-1 nova_compute[192795]: </domain>
Sep 30 21:29:46 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.148 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Preparing to wait for external event network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.148 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.149 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.149 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.150 2 DEBUG nova.virt.libvirt.vif [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:29:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-476572418',display_name='tempest-ImagesOneServerNegativeTestJSON-server-476572418',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-476572418',id=77,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3241d41d3774429b8e779e08d0fea5ab',ramdisk_id='',reservation_id='r-w2s0ujkz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1411663101',owner
_user_name='tempest-ImagesOneServerNegativeTestJSON-1411663101-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:29:42Z,user_data=None,user_id='beaafecda25944bc8ae62afa508a5722',uuid=925ae136-46c0-4920-93f9-36b983cd4f52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.150 2 DEBUG nova.network.os_vif_util [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Converting VIF {"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.151 2 DEBUG nova.network.os_vif_util [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:12:84,bridge_name='br-int',has_traffic_filtering=True,id=3398e381-8092-498a-bc78-691e3883aa7b,network=Network(11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3398e381-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.151 2 DEBUG os_vif [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:12:84,bridge_name='br-int',has_traffic_filtering=True,id=3398e381-8092-498a-bc78-691e3883aa7b,network=Network(11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3398e381-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.152 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.152 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.155 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3398e381-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.156 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3398e381-80, col_values=(('external_ids', {'iface-id': '3398e381-8092-498a-bc78-691e3883aa7b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:12:84', 'vm-uuid': '925ae136-46c0-4920-93f9-36b983cd4f52'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 NetworkManager[51724]: <info>  [1759267786.1593] manager: (tap3398e381-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.166 2 INFO os_vif [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:12:84,bridge_name='br-int',has_traffic_filtering=True,id=3398e381-8092-498a-bc78-691e3883aa7b,network=Network(11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3398e381-80')
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.208 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.210 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.210 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] No VIF found with MAC fa:16:3e:85:12:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.211 2 INFO nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Using config drive
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.215 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.215 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.572 2 INFO nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Creating config drive at /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk.config
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.577 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsh9s4jz0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.721 2 DEBUG oslo_concurrency.processutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsh9s4jz0" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:46 compute-1 kernel: tap3398e381-80: entered promiscuous mode
Sep 30 21:29:46 compute-1 ovn_controller[94902]: 2025-09-30T21:29:46Z|00287|binding|INFO|Claiming lport 3398e381-8092-498a-bc78-691e3883aa7b for this chassis.
Sep 30 21:29:46 compute-1 ovn_controller[94902]: 2025-09-30T21:29:46Z|00288|binding|INFO|3398e381-8092-498a-bc78-691e3883aa7b: Claiming fa:16:3e:85:12:84 10.100.0.8
Sep 30 21:29:46 compute-1 NetworkManager[51724]: <info>  [1759267786.8307] manager: (tap3398e381-80): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.842 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:12:84 10.100.0.8'], port_security=['fa:16:3e:85:12:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '925ae136-46c0-4920-93f9-36b983cd4f52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3241d41d3774429b8e779e08d0fea5ab', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b25a45af-8444-4037-b0ab-0c1fe1275aaa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b782aabf-3878-470c-b18e-f62a4ce65158, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=3398e381-8092-498a-bc78-691e3883aa7b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.843 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 3398e381-8092-498a-bc78-691e3883aa7b in datapath 11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16 bound to our chassis
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.845 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.865 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[96f9bfb3-6202-4784-9533-2b2ad9161f16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.866 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap11e0b1a8-c1 in ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.868 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap11e0b1a8-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.868 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[35eb0eb8-6b9b-4a4f-acce-298e4db37701]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.869 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[61bd58ae-f94e-4865-8f6c-f0cec3910376]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-1 systemd-udevd[232006]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.883 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b26014-9c3d-4423-8d24-2b7a62c5493b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-1 systemd-machined[152783]: New machine qemu-36-instance-0000004d.
Sep 30 21:29:46 compute-1 NetworkManager[51724]: <info>  [1759267786.8989] device (tap3398e381-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:29:46 compute-1 NetworkManager[51724]: <info>  [1759267786.9005] device (tap3398e381-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 ovn_controller[94902]: 2025-09-30T21:29:46Z|00289|binding|INFO|Setting lport 3398e381-8092-498a-bc78-691e3883aa7b ovn-installed in OVS
Sep 30 21:29:46 compute-1 ovn_controller[94902]: 2025-09-30T21:29:46Z|00290|binding|INFO|Setting lport 3398e381-8092-498a-bc78-691e3883aa7b up in Southbound
Sep 30 21:29:46 compute-1 systemd[1]: Started Virtual Machine qemu-36-instance-0000004d.
Sep 30 21:29:46 compute-1 nova_compute[192795]: 2025-09-30 21:29:46.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.915 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d5268d-cb8d-41f0-942c-9d208daf5218]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.963 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2034e297-b6c5-486a-ba17-06db8ad6d9c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:46.969 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[467cd0b9-305e-490c-90f6-b73db1d7b1cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-1 systemd-udevd[232010]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:29:46 compute-1 NetworkManager[51724]: <info>  [1759267786.9709] manager: (tap11e0b1a8-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/143)
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.008 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a181a1de-218b-47ed-914e-8256c363e0f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.013 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[742a3a10-ec10-4a9e-9809-493b97434b52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 NetworkManager[51724]: <info>  [1759267787.0409] device (tap11e0b1a8-c0): carrier: link connected
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.046 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b39acbe9-6c10-4dfd-aae0-9e262b59e3f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.062 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb59219-be77-4a3e-a2af-09014ff7742b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11e0b1a8-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:c3:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448271, 'reachable_time': 16670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232038, 'error': None, 'target': 'ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.095 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[987019ce-79b6-49d0-ac19-5df2d0d7eb12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:c3dd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448271, 'tstamp': 448271}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232039, 'error': None, 'target': 'ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.109 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4af8c070-b997-4623-8515-5f9796066b1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11e0b1a8-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:c3:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448271, 'reachable_time': 16670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232040, 'error': None, 'target': 'ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.141 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e2d73b-c8cd-49af-9615-52a43013d031]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.193 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[024a65f2-2ed5-42a3-99c9-d217f542c3d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.195 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11e0b1a8-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.195 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.195 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11e0b1a8-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-1 kernel: tap11e0b1a8-c0: entered promiscuous mode
Sep 30 21:29:47 compute-1 NetworkManager[51724]: <info>  [1759267787.1981] manager: (tap11e0b1a8-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.203 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap11e0b1a8-c0, col_values=(('external_ids', {'iface-id': '55706479-29c7-4c2b-8a51-1dee48d60691'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-1 ovn_controller[94902]: 2025-09-30T21:29:47Z|00291|binding|INFO|Releasing lport 55706479-29c7-4c2b-8a51-1dee48d60691 from this chassis (sb_readonly=0)
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.217 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.218 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[655b51dc-5af2-4d55-8a90-868a7ffab8f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.219 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16.pid.haproxy
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:29:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:47.219 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'env', 'PROCESS_TAG=haproxy-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.262 2 DEBUG nova.compute.manager [req-0fac20cc-06df-4a51-85e3-0fe0b4d8b6d6 req-8045aa3c-6999-4b86-966f-753f2ca29afb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received event network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.263 2 DEBUG oslo_concurrency.lockutils [req-0fac20cc-06df-4a51-85e3-0fe0b4d8b6d6 req-8045aa3c-6999-4b86-966f-753f2ca29afb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.263 2 DEBUG oslo_concurrency.lockutils [req-0fac20cc-06df-4a51-85e3-0fe0b4d8b6d6 req-8045aa3c-6999-4b86-966f-753f2ca29afb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.263 2 DEBUG oslo_concurrency.lockutils [req-0fac20cc-06df-4a51-85e3-0fe0b4d8b6d6 req-8045aa3c-6999-4b86-966f-753f2ca29afb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.264 2 DEBUG nova.compute.manager [req-0fac20cc-06df-4a51-85e3-0fe0b4d8b6d6 req-8045aa3c-6999-4b86-966f-753f2ca29afb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Processing event network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-1 podman[232072]: 2025-09-30 21:29:47.608126292 +0000 UTC m=+0.057227621 container create 3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:29:47 compute-1 systemd[1]: Started libpod-conmon-3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa.scope.
Sep 30 21:29:47 compute-1 podman[232072]: 2025-09-30 21:29:47.575078787 +0000 UTC m=+0.024180136 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:29:47 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:29:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8c51e0380baad82aadae0200efee482af52b050c51cb06847269e4a3b0ea7a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:29:47 compute-1 podman[232072]: 2025-09-30 21:29:47.688991479 +0000 UTC m=+0.138092838 container init 3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:29:47 compute-1 podman[232072]: 2025-09-30 21:29:47.695065002 +0000 UTC m=+0.144166331 container start 3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:29:47 compute-1 sshd-session[232086]: Accepted publickey for nova from 192.168.122.102 port 58608 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:29:47 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:29:47 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [NOTICE]   (232095) : New worker (232098) forked
Sep 30 21:29:47 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [NOTICE]   (232095) : Loading success.
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.720 2 DEBUG nova.network.neutron [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Updated VIF entry in instance network info cache for port 3398e381-8092-498a-bc78-691e3883aa7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.721 2 DEBUG nova.network.neutron [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Updating instance_info_cache with network_info: [{"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:47 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:29:47 compute-1 systemd-logind[793]: New session 49 of user nova.
Sep 30 21:29:47 compute-1 nova_compute[192795]: 2025-09-30 21:29:47.741 2 DEBUG oslo_concurrency.lockutils [req-1a20a246-1dd0-4b76-a84d-e2d63a2eb449 req-38d2c0dc-2bd1-4b29-aad1-844057110c93 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:47 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:29:47 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:29:47 compute-1 systemd[232113]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:29:47 compute-1 systemd[232113]: Queued start job for default target Main User Target.
Sep 30 21:29:47 compute-1 systemd[232113]: Created slice User Application Slice.
Sep 30 21:29:47 compute-1 systemd[232113]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:29:47 compute-1 systemd[232113]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:29:47 compute-1 systemd[232113]: Reached target Paths.
Sep 30 21:29:47 compute-1 systemd[232113]: Reached target Timers.
Sep 30 21:29:47 compute-1 systemd[232113]: Starting D-Bus User Message Bus Socket...
Sep 30 21:29:47 compute-1 systemd[232113]: Starting Create User's Volatile Files and Directories...
Sep 30 21:29:47 compute-1 systemd[232113]: Finished Create User's Volatile Files and Directories.
Sep 30 21:29:47 compute-1 systemd[232113]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:29:47 compute-1 systemd[232113]: Reached target Sockets.
Sep 30 21:29:47 compute-1 systemd[232113]: Reached target Basic System.
Sep 30 21:29:47 compute-1 systemd[232113]: Reached target Main User Target.
Sep 30 21:29:47 compute-1 systemd[232113]: Startup finished in 165ms.
Sep 30 21:29:47 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:29:47 compute-1 systemd[1]: Started Session 49 of User nova.
Sep 30 21:29:47 compute-1 sshd-session[232086]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:29:48 compute-1 sshd-session[232130]: Received disconnect from 192.168.122.102 port 58608:11: disconnected by user
Sep 30 21:29:48 compute-1 sshd-session[232130]: Disconnected from user nova 192.168.122.102 port 58608
Sep 30 21:29:48 compute-1 sshd-session[232086]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:29:48 compute-1 systemd-logind[793]: Session 49 logged out. Waiting for processes to exit.
Sep 30 21:29:48 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Sep 30 21:29:48 compute-1 systemd-logind[793]: Removed session 49.
Sep 30 21:29:48 compute-1 sshd-session[232132]: Accepted publickey for nova from 192.168.122.102 port 58624 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:29:48 compute-1 systemd-logind[793]: New session 51 of user nova.
Sep 30 21:29:48 compute-1 systemd[1]: Started Session 51 of User nova.
Sep 30 21:29:48 compute-1 sshd-session[232132]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.205 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267788.2055116, 925ae136-46c0-4920-93f9-36b983cd4f52 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.206 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] VM Started (Lifecycle Event)
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.208 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.215 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.219 2 INFO nova.virt.libvirt.driver [-] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Instance spawned successfully.
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.219 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.239 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.244 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.252 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.253 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.253 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.253 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.254 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.254 2 DEBUG nova.virt.libvirt.driver [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:48 compute-1 sshd-session[232135]: Received disconnect from 192.168.122.102 port 58624:11: disconnected by user
Sep 30 21:29:48 compute-1 sshd-session[232135]: Disconnected from user nova 192.168.122.102 port 58624
Sep 30 21:29:48 compute-1 sshd-session[232132]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:29:48 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Sep 30 21:29:48 compute-1 systemd-logind[793]: Session 51 logged out. Waiting for processes to exit.
Sep 30 21:29:48 compute-1 systemd-logind[793]: Removed session 51.
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.287 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.288 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267788.2064424, 925ae136-46c0-4920-93f9-36b983cd4f52 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.288 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] VM Paused (Lifecycle Event)
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.423 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.428 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267788.213717, 925ae136-46c0-4920-93f9-36b983cd4f52 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.428 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] VM Resumed (Lifecycle Event)
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.451 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.462 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.479 2 INFO nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Took 5.82 seconds to spawn the instance on the hypervisor.
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.479 2 DEBUG nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.509 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.570 2 INFO nova.compute.manager [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Took 6.56 seconds to build instance.
Sep 30 21:29:48 compute-1 nova_compute[192795]: 2025-09-30 21:29:48.586 2 DEBUG oslo_concurrency.lockutils [None req-7d4bc3a0-33ac-4e72-98fb-4e6a805cc3da beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:49 compute-1 nova_compute[192795]: 2025-09-30 21:29:49.343 2 DEBUG nova.compute.manager [req-4829c003-84cf-4b57-a7e0-93089b6b449c req-1ffd3d33-b59f-4b4f-a581-335b9bf082f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received event network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:49 compute-1 nova_compute[192795]: 2025-09-30 21:29:49.344 2 DEBUG oslo_concurrency.lockutils [req-4829c003-84cf-4b57-a7e0-93089b6b449c req-1ffd3d33-b59f-4b4f-a581-335b9bf082f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:49 compute-1 nova_compute[192795]: 2025-09-30 21:29:49.344 2 DEBUG oslo_concurrency.lockutils [req-4829c003-84cf-4b57-a7e0-93089b6b449c req-1ffd3d33-b59f-4b4f-a581-335b9bf082f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:49 compute-1 nova_compute[192795]: 2025-09-30 21:29:49.344 2 DEBUG oslo_concurrency.lockutils [req-4829c003-84cf-4b57-a7e0-93089b6b449c req-1ffd3d33-b59f-4b4f-a581-335b9bf082f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:49 compute-1 nova_compute[192795]: 2025-09-30 21:29:49.345 2 DEBUG nova.compute.manager [req-4829c003-84cf-4b57-a7e0-93089b6b449c req-1ffd3d33-b59f-4b4f-a581-335b9bf082f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] No waiting events found dispatching network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:49 compute-1 nova_compute[192795]: 2025-09-30 21:29:49.345 2 WARNING nova.compute.manager [req-4829c003-84cf-4b57-a7e0-93089b6b449c req-1ffd3d33-b59f-4b4f-a581-335b9bf082f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received unexpected event network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b for instance with vm_state active and task_state None.
Sep 30 21:29:49 compute-1 nova_compute[192795]: 2025-09-30 21:29:49.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.882 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.882 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.883 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.883 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.883 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.894 2 INFO nova.compute.manager [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Terminating instance
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.908 2 DEBUG nova.compute.manager [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:29:50 compute-1 kernel: tap3398e381-80 (unregistering): left promiscuous mode
Sep 30 21:29:50 compute-1 NetworkManager[51724]: <info>  [1759267790.9323] device (tap3398e381-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:29:50 compute-1 ovn_controller[94902]: 2025-09-30T21:29:50Z|00292|binding|INFO|Releasing lport 3398e381-8092-498a-bc78-691e3883aa7b from this chassis (sb_readonly=0)
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:50 compute-1 ovn_controller[94902]: 2025-09-30T21:29:50Z|00293|binding|INFO|Setting lport 3398e381-8092-498a-bc78-691e3883aa7b down in Southbound
Sep 30 21:29:50 compute-1 ovn_controller[94902]: 2025-09-30T21:29:50Z|00294|binding|INFO|Removing iface tap3398e381-80 ovn-installed in OVS
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:50.962 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:12:84 10.100.0.8'], port_security=['fa:16:3e:85:12:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '925ae136-46c0-4920-93f9-36b983cd4f52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3241d41d3774429b8e779e08d0fea5ab', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b25a45af-8444-4037-b0ab-0c1fe1275aaa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b782aabf-3878-470c-b18e-f62a4ce65158, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=3398e381-8092-498a-bc78-691e3883aa7b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:50.964 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 3398e381-8092-498a-bc78-691e3883aa7b in datapath 11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16 unbound from our chassis
Sep 30 21:29:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:50.966 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:29:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:50.968 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbab0e2-d951-4aba-976e-cf0988279efe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:50.969 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16 namespace which is not needed anymore
Sep 30 21:29:50 compute-1 nova_compute[192795]: 2025-09-30 21:29:50.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:50 compute-1 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Sep 30 21:29:50 compute-1 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004d.scope: Consumed 4.045s CPU time.
Sep 30 21:29:51 compute-1 systemd-machined[152783]: Machine qemu-36-instance-0000004d terminated.
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [NOTICE]   (232095) : haproxy version is 2.8.14-c23fe91
Sep 30 21:29:51 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [NOTICE]   (232095) : path to executable is /usr/sbin/haproxy
Sep 30 21:29:51 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [WARNING]  (232095) : Exiting Master process...
Sep 30 21:29:51 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [WARNING]  (232095) : Exiting Master process...
Sep 30 21:29:51 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [ALERT]    (232095) : Current worker (232098) exited with code 143 (Terminated)
Sep 30 21:29:51 compute-1 neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16[232090]: [WARNING]  (232095) : All workers exited. Exiting... (0)
Sep 30 21:29:51 compute-1 systemd[1]: libpod-3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa.scope: Deactivated successfully.
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 podman[232164]: 2025-09-30 21:29:51.163166228 +0000 UTC m=+0.068750865 container died 3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.280 2 INFO nova.virt.libvirt.driver [-] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Instance destroyed successfully.
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.281 2 DEBUG nova.objects.instance [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lazy-loading 'resources' on Instance uuid 925ae136-46c0-4920-93f9-36b983cd4f52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-e8c51e0380baad82aadae0200efee482af52b050c51cb06847269e4a3b0ea7a5-merged.mount: Deactivated successfully.
Sep 30 21:29:51 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa-userdata-shm.mount: Deactivated successfully.
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.296 2 DEBUG nova.virt.libvirt.vif [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:29:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-476572418',display_name='tempest-ImagesOneServerNegativeTestJSON-server-476572418',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-476572418',id=77,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:29:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3241d41d3774429b8e779e08d0fea5ab',ramdisk_id='',reservation_id='r-w2s0ujkz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1411663101',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1411663101-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:29:48Z,user_data=None,user_id='beaafecda25944bc8ae62afa508a5722',uuid=925ae136-46c0-4920-93f9-36b983cd4f52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.297 2 DEBUG nova.network.os_vif_util [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Converting VIF {"id": "3398e381-8092-498a-bc78-691e3883aa7b", "address": "fa:16:3e:85:12:84", "network": {"id": "11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-752664792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3241d41d3774429b8e779e08d0fea5ab", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3398e381-80", "ovs_interfaceid": "3398e381-8092-498a-bc78-691e3883aa7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.298 2 DEBUG nova.network.os_vif_util [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:12:84,bridge_name='br-int',has_traffic_filtering=True,id=3398e381-8092-498a-bc78-691e3883aa7b,network=Network(11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3398e381-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.298 2 DEBUG os_vif [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:12:84,bridge_name='br-int',has_traffic_filtering=True,id=3398e381-8092-498a-bc78-691e3883aa7b,network=Network(11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3398e381-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3398e381-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:51 compute-1 podman[232164]: 2025-09-30 21:29:51.30598706 +0000 UTC m=+0.211571667 container cleanup 3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.311 2 INFO os_vif [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:12:84,bridge_name='br-int',has_traffic_filtering=True,id=3398e381-8092-498a-bc78-691e3883aa7b,network=Network(11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3398e381-80')
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.312 2 INFO nova.virt.libvirt.driver [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Deleting instance files /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52_del
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.313 2 INFO nova.virt.libvirt.driver [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Deletion of /var/lib/nova/instances/925ae136-46c0-4920-93f9-36b983cd4f52_del complete
Sep 30 21:29:51 compute-1 systemd[1]: libpod-conmon-3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa.scope: Deactivated successfully.
Sep 30 21:29:51 compute-1 podman[232205]: 2025-09-30 21:29:51.364827017 +0000 UTC m=+0.064248887 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:29:51 compute-1 podman[232209]: 2025-09-30 21:29:51.36609311 +0000 UTC m=+0.065139327 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:29:51 compute-1 podman[232232]: 2025-09-30 21:29:51.378397131 +0000 UTC m=+0.047122948 container remove 3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.384 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e9247dc8-02de-4e7c-b0c5-8ddfd3bea28e]: (4, ('Tue Sep 30 09:29:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16 (3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa)\n3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa\nTue Sep 30 09:29:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16 (3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa)\n3cda8854bbe59dc70b67bdd77dd2910a23d53bcbffdfb8519fa77f35e03faefa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.385 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cbac9a94-02c9-4c9c-94a2-d3ced211d380]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.386 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11e0b1a8-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 podman[232210]: 2025-09-30 21:29:51.392579646 +0000 UTC m=+0.082969961 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:29:51 compute-1 kernel: tap11e0b1a8-c0: left promiscuous mode
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.400 2 INFO nova.compute.manager [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Took 0.49 seconds to destroy the instance on the hypervisor.
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.400 2 DEBUG oslo.service.loopingcall [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.401 2 DEBUG nova.compute.manager [-] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.401 2 DEBUG nova.network.neutron [-] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.403 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0165de26-98d4-4b47-b303-9c7c1498cff6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.430 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea78ba7-92e6-457a-b367-2f78be687ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.431 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[77045000-d1d2-4afe-bd22-d76701f57113]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.449 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5be67e6b-d78f-461a-991a-d892acfba557]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448262, 'reachable_time': 23272, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232287, 'error': None, 'target': 'ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:51 compute-1 systemd[1]: run-netns-ovnmeta\x2d11e0b1a8\x2dc3cf\x2d401a\x2dbf27\x2d9f90a4a8cf16.mount: Deactivated successfully.
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.454 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-11e0b1a8-c3cf-401a-bf27-9f90a4a8cf16 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:29:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:29:51.454 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8eda50-16db-4164-b612-69bc7fc7e039]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.634 2 DEBUG nova.compute.manager [req-71e98c3a-4e0c-49bb-858f-f0b46e02d180 req-e55c8f13-b8f0-4e94-a6b0-9278e0ab0a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received event network-vif-unplugged-3398e381-8092-498a-bc78-691e3883aa7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.634 2 DEBUG oslo_concurrency.lockutils [req-71e98c3a-4e0c-49bb-858f-f0b46e02d180 req-e55c8f13-b8f0-4e94-a6b0-9278e0ab0a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.635 2 DEBUG oslo_concurrency.lockutils [req-71e98c3a-4e0c-49bb-858f-f0b46e02d180 req-e55c8f13-b8f0-4e94-a6b0-9278e0ab0a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.635 2 DEBUG oslo_concurrency.lockutils [req-71e98c3a-4e0c-49bb-858f-f0b46e02d180 req-e55c8f13-b8f0-4e94-a6b0-9278e0ab0a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.636 2 DEBUG nova.compute.manager [req-71e98c3a-4e0c-49bb-858f-f0b46e02d180 req-e55c8f13-b8f0-4e94-a6b0-9278e0ab0a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] No waiting events found dispatching network-vif-unplugged-3398e381-8092-498a-bc78-691e3883aa7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:51 compute-1 nova_compute[192795]: 2025-09-30 21:29:51.636 2 DEBUG nova.compute.manager [req-71e98c3a-4e0c-49bb-858f-f0b46e02d180 req-e55c8f13-b8f0-4e94-a6b0-9278e0ab0a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received event network-vif-unplugged-3398e381-8092-498a-bc78-691e3883aa7b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.549 2 DEBUG nova.network.neutron [-] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.566 2 INFO nova.compute.manager [-] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Took 1.16 seconds to deallocate network for instance.
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.651 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.653 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.848 2 DEBUG nova.compute.provider_tree [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.865 2 DEBUG nova.scheduler.client.report [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.895 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.923 2 INFO nova.scheduler.client.report [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Deleted allocations for instance 925ae136-46c0-4920-93f9-36b983cd4f52
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.938 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.939 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.939 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:29:52 compute-1 nova_compute[192795]: 2025-09-30 21:29:52.940 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 925ae136-46c0-4920-93f9-36b983cd4f52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.036 2 DEBUG oslo_concurrency.lockutils [None req-cf5ab45e-17fe-43d1-a611-5c02b209254b beaafecda25944bc8ae62afa508a5722 3241d41d3774429b8e779e08d0fea5ab - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.143 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.515 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.564 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-925ae136-46c0-4920-93f9-36b983cd4f52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.565 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.720 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.823 2 DEBUG nova.compute.manager [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received event network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.824 2 DEBUG oslo_concurrency.lockutils [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.825 2 DEBUG oslo_concurrency.lockutils [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.825 2 DEBUG oslo_concurrency.lockutils [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "925ae136-46c0-4920-93f9-36b983cd4f52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.827 2 DEBUG nova.compute.manager [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] No waiting events found dispatching network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.827 2 WARNING nova.compute.manager [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received unexpected event network-vif-plugged-3398e381-8092-498a-bc78-691e3883aa7b for instance with vm_state deleted and task_state None.
Sep 30 21:29:53 compute-1 nova_compute[192795]: 2025-09-30 21:29:53.828 2 DEBUG nova.compute.manager [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Received event network-vif-deleted-3398e381-8092-498a-bc78-691e3883aa7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:56 compute-1 nova_compute[192795]: 2025-09-30 21:29:56.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:57 compute-1 nova_compute[192795]: 2025-09-30 21:29:57.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:58 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:29:58 compute-1 systemd[232113]: Activating special unit Exit the Session...
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped target Main User Target.
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped target Basic System.
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped target Paths.
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped target Sockets.
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped target Timers.
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:29:58 compute-1 systemd[232113]: Closed D-Bus User Message Bus Socket.
Sep 30 21:29:58 compute-1 systemd[232113]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:29:58 compute-1 systemd[232113]: Removed slice User Application Slice.
Sep 30 21:29:58 compute-1 systemd[232113]: Reached target Shutdown.
Sep 30 21:29:58 compute-1 systemd[232113]: Finished Exit the Session.
Sep 30 21:29:58 compute-1 systemd[232113]: Reached target Exit the Session.
Sep 30 21:29:58 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:29:58 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:29:58 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:29:58 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:29:58 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:29:58 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:29:58 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:29:59 compute-1 nova_compute[192795]: 2025-09-30 21:29:59.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:00 compute-1 nova_compute[192795]: 2025-09-30 21:30:00.705 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:00 compute-1 nova_compute[192795]: 2025-09-30 21:30:00.958 2 DEBUG nova.compute.manager [req-c5e2d1e7-7205-4257-8df6-18059a72cd36 req-6e225406-f918-4185-a0e5-645ffe6e5cc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received event network-vif-unplugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:00 compute-1 nova_compute[192795]: 2025-09-30 21:30:00.959 2 DEBUG oslo_concurrency.lockutils [req-c5e2d1e7-7205-4257-8df6-18059a72cd36 req-6e225406-f918-4185-a0e5-645ffe6e5cc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:00 compute-1 nova_compute[192795]: 2025-09-30 21:30:00.959 2 DEBUG oslo_concurrency.lockutils [req-c5e2d1e7-7205-4257-8df6-18059a72cd36 req-6e225406-f918-4185-a0e5-645ffe6e5cc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:00 compute-1 nova_compute[192795]: 2025-09-30 21:30:00.960 2 DEBUG oslo_concurrency.lockutils [req-c5e2d1e7-7205-4257-8df6-18059a72cd36 req-6e225406-f918-4185-a0e5-645ffe6e5cc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:00 compute-1 nova_compute[192795]: 2025-09-30 21:30:00.960 2 DEBUG nova.compute.manager [req-c5e2d1e7-7205-4257-8df6-18059a72cd36 req-6e225406-f918-4185-a0e5-645ffe6e5cc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] No waiting events found dispatching network-vif-unplugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:00 compute-1 nova_compute[192795]: 2025-09-30 21:30:00.960 2 WARNING nova.compute.manager [req-c5e2d1e7-7205-4257-8df6-18059a72cd36 req-6e225406-f918-4185-a0e5-645ffe6e5cc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received unexpected event network-vif-unplugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:30:01 compute-1 podman[232290]: 2025-09-30 21:30:01.265557023 +0000 UTC m=+0.093470388 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:30:01 compute-1 nova_compute[192795]: 2025-09-30 21:30:01.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:01 compute-1 nova_compute[192795]: 2025-09-30 21:30:01.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:01 compute-1 sshd-session[232311]: Accepted publickey for nova from 192.168.122.102 port 46562 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:30:01 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:30:01 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:30:01 compute-1 systemd-logind[793]: New session 52 of user nova.
Sep 30 21:30:01 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:30:01 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:30:01 compute-1 systemd[232315]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:30:01 compute-1 systemd[232315]: Queued start job for default target Main User Target.
Sep 30 21:30:02 compute-1 systemd[232315]: Created slice User Application Slice.
Sep 30 21:30:02 compute-1 systemd[232315]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:30:02 compute-1 systemd[232315]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:30:02 compute-1 systemd[232315]: Reached target Paths.
Sep 30 21:30:02 compute-1 systemd[232315]: Reached target Timers.
Sep 30 21:30:02 compute-1 systemd[232315]: Starting D-Bus User Message Bus Socket...
Sep 30 21:30:02 compute-1 systemd[232315]: Starting Create User's Volatile Files and Directories...
Sep 30 21:30:02 compute-1 systemd[232315]: Finished Create User's Volatile Files and Directories.
Sep 30 21:30:02 compute-1 systemd[232315]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:30:02 compute-1 systemd[232315]: Reached target Sockets.
Sep 30 21:30:02 compute-1 systemd[232315]: Reached target Basic System.
Sep 30 21:30:02 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:30:02 compute-1 systemd[232315]: Reached target Main User Target.
Sep 30 21:30:02 compute-1 systemd[232315]: Startup finished in 180ms.
Sep 30 21:30:02 compute-1 systemd[1]: Started Session 52 of User nova.
Sep 30 21:30:02 compute-1 sshd-session[232311]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:30:02 compute-1 nova_compute[192795]: 2025-09-30 21:30:02.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:02 compute-1 sshd-session[232331]: Received disconnect from 192.168.122.102 port 46562:11: disconnected by user
Sep 30 21:30:02 compute-1 sshd-session[232331]: Disconnected from user nova 192.168.122.102 port 46562
Sep 30 21:30:02 compute-1 sshd-session[232311]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:30:02 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Sep 30 21:30:02 compute-1 systemd-logind[793]: Session 52 logged out. Waiting for processes to exit.
Sep 30 21:30:02 compute-1 systemd-logind[793]: Removed session 52.
Sep 30 21:30:02 compute-1 sshd-session[232333]: Accepted publickey for nova from 192.168.122.102 port 46568 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:30:02 compute-1 systemd-logind[793]: New session 54 of user nova.
Sep 30 21:30:02 compute-1 systemd[1]: Started Session 54 of User nova.
Sep 30 21:30:02 compute-1 sshd-session[232333]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:30:02 compute-1 sshd-session[232336]: Received disconnect from 192.168.122.102 port 46568:11: disconnected by user
Sep 30 21:30:02 compute-1 sshd-session[232336]: Disconnected from user nova 192.168.122.102 port 46568
Sep 30 21:30:02 compute-1 sshd-session[232333]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:30:02 compute-1 systemd[1]: session-54.scope: Deactivated successfully.
Sep 30 21:30:02 compute-1 systemd-logind[793]: Session 54 logged out. Waiting for processes to exit.
Sep 30 21:30:02 compute-1 systemd-logind[793]: Removed session 54.
Sep 30 21:30:03 compute-1 sshd-session[232338]: Accepted publickey for nova from 192.168.122.102 port 46582 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:30:03 compute-1 systemd-logind[793]: New session 55 of user nova.
Sep 30 21:30:03 compute-1 systemd[1]: Started Session 55 of User nova.
Sep 30 21:30:03 compute-1 sshd-session[232338]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:30:03 compute-1 sshd-session[232341]: Received disconnect from 192.168.122.102 port 46582:11: disconnected by user
Sep 30 21:30:03 compute-1 sshd-session[232341]: Disconnected from user nova 192.168.122.102 port 46582
Sep 30 21:30:03 compute-1 sshd-session[232338]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:30:03 compute-1 systemd[1]: session-55.scope: Deactivated successfully.
Sep 30 21:30:03 compute-1 systemd-logind[793]: Session 55 logged out. Waiting for processes to exit.
Sep 30 21:30:03 compute-1 systemd-logind[793]: Removed session 55.
Sep 30 21:30:03 compute-1 nova_compute[192795]: 2025-09-30 21:30:03.265 2 DEBUG nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received event network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:03 compute-1 nova_compute[192795]: 2025-09-30 21:30:03.265 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:03 compute-1 nova_compute[192795]: 2025-09-30 21:30:03.266 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:03 compute-1 nova_compute[192795]: 2025-09-30 21:30:03.266 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:03 compute-1 nova_compute[192795]: 2025-09-30 21:30:03.266 2 DEBUG nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] No waiting events found dispatching network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:03 compute-1 nova_compute[192795]: 2025-09-30 21:30:03.266 2 WARNING nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received unexpected event network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:30:03 compute-1 nova_compute[192795]: 2025-09-30 21:30:03.899 2 INFO nova.network.neutron [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Updating port b40318a3-ceae-4ea1-8530-6e274dd81ed1 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:30:05 compute-1 nova_compute[192795]: 2025-09-30 21:30:05.023 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "refresh_cache-4f678545-d6f3-45f7-8bac-260f6079e85f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:05 compute-1 nova_compute[192795]: 2025-09-30 21:30:05.023 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquired lock "refresh_cache-4f678545-d6f3-45f7-8bac-260f6079e85f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:05 compute-1 nova_compute[192795]: 2025-09-30 21:30:05.024 2 DEBUG nova.network.neutron [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:30:05 compute-1 nova_compute[192795]: 2025-09-30 21:30:05.252 2 DEBUG nova.compute.manager [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received event network-changed-b40318a3-ceae-4ea1-8530-6e274dd81ed1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:05 compute-1 nova_compute[192795]: 2025-09-30 21:30:05.253 2 DEBUG nova.compute.manager [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Refreshing instance network info cache due to event network-changed-b40318a3-ceae-4ea1-8530-6e274dd81ed1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:30:05 compute-1 nova_compute[192795]: 2025-09-30 21:30:05.253 2 DEBUG oslo_concurrency.lockutils [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-4f678545-d6f3-45f7-8bac-260f6079e85f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.278 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267791.2764754, 925ae136-46c0-4920-93f9-36b983cd4f52 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.279 2 INFO nova.compute.manager [-] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] VM Stopped (Lifecycle Event)
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.299 2 DEBUG nova.compute.manager [None req-dcf325db-d3a6-44bb-88a1-f300ab4597e7 - - - - - -] [instance: 925ae136-46c0-4920-93f9-36b983cd4f52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.904 2 DEBUG nova.network.neutron [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Updating instance_info_cache with network_info: [{"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.931 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Releasing lock "refresh_cache-4f678545-d6f3-45f7-8bac-260f6079e85f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.938 2 DEBUG oslo_concurrency.lockutils [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-4f678545-d6f3-45f7-8bac-260f6079e85f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:06 compute-1 nova_compute[192795]: 2025-09-30 21:30:06.939 2 DEBUG nova.network.neutron [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Refreshing network info cache for port b40318a3-ceae-4ea1-8530-6e274dd81ed1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.096 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.098 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.099 2 INFO nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Creating image(s)
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.101 2 DEBUG nova.objects.instance [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4f678545-d6f3-45f7-8bac-260f6079e85f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.122 2 DEBUG oslo_concurrency.processutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.190 2 DEBUG oslo_concurrency.processutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.192 2 DEBUG nova.virt.disk.api [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Checking if we can resize image /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.193 2 DEBUG oslo_concurrency.processutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.256 2 DEBUG oslo_concurrency.processutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.257 2 DEBUG nova.virt.disk.api [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Cannot resize image /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.272 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.273 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Ensure instance console log exists: /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.274 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.274 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.275 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.280 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Start _get_guest_xml network_info=[{"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:e3:58:1c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.288 2 WARNING nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.297 2 DEBUG nova.virt.libvirt.host [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.298 2 DEBUG nova.virt.libvirt.host [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.302 2 DEBUG nova.virt.libvirt.host [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.303 2 DEBUG nova.virt.libvirt.host [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.305 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.305 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.306 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.306 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.307 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.307 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.307 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.308 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.308 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.308 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.309 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.309 2 DEBUG nova.virt.hardware [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.309 2 DEBUG nova.objects.instance [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4f678545-d6f3-45f7-8bac-260f6079e85f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.333 2 DEBUG oslo_concurrency.processutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.405 2 DEBUG oslo_concurrency.processutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk.config --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.406 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "/var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.407 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "/var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.408 2 DEBUG oslo_concurrency.lockutils [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "/var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.409 2 DEBUG nova.virt.libvirt.vif [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:29:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1448324042',display_name='tempest-ServerDiskConfigTestJSON-server-1448324042',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1448324042',id=76,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:29:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-an5by0yd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:03Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=4f678545-d6f3-45f7-8bac-260f6079e85f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:e3:58:1c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.410 2 DEBUG nova.network.os_vif_util [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:e3:58:1c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.411 2 DEBUG nova.network.os_vif_util [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:58:1c,bridge_name='br-int',has_traffic_filtering=True,id=b40318a3-ceae-4ea1-8530-6e274dd81ed1,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40318a3-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.414 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <uuid>4f678545-d6f3-45f7-8bac-260f6079e85f</uuid>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <name>instance-0000004c</name>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1448324042</nova:name>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:30:07</nova:creationTime>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:user uuid="648f7bb37eeb4003825636f9a7c1f92a">tempest-ServerDiskConfigTestJSON-1133643549-project-member</nova:user>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:project uuid="72559935caa44fd9b779b6770f00199f">tempest-ServerDiskConfigTestJSON-1133643549</nova:project>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         <nova:port uuid="b40318a3-ceae-4ea1-8530-6e274dd81ed1">
Sep 30 21:30:07 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <system>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <entry name="serial">4f678545-d6f3-45f7-8bac-260f6079e85f</entry>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <entry name="uuid">4f678545-d6f3-45f7-8bac-260f6079e85f</entry>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </system>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <os>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   </os>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <features>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   </features>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/disk.config"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:e3:58:1c"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <target dev="tapb40318a3-ce"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f/console.log" append="off"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <video>
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </video>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:30:07 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:30:07 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:30:07 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:30:07 compute-1 nova_compute[192795]: </domain>
Sep 30 21:30:07 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.415 2 DEBUG nova.virt.libvirt.vif [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:29:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1448324042',display_name='tempest-ServerDiskConfigTestJSON-server-1448324042',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1448324042',id=76,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:29:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-an5by0yd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:03Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=4f678545-d6f3-45f7-8bac-260f6079e85f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:e3:58:1c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.415 2 DEBUG nova.network.os_vif_util [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:e3:58:1c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.416 2 DEBUG nova.network.os_vif_util [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:58:1c,bridge_name='br-int',has_traffic_filtering=True,id=b40318a3-ceae-4ea1-8530-6e274dd81ed1,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40318a3-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.416 2 DEBUG os_vif [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:58:1c,bridge_name='br-int',has_traffic_filtering=True,id=b40318a3-ceae-4ea1-8530-6e274dd81ed1,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40318a3-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.419 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.419 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.423 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb40318a3-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.424 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb40318a3-ce, col_values=(('external_ids', {'iface-id': 'b40318a3-ceae-4ea1-8530-6e274dd81ed1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:58:1c', 'vm-uuid': '4f678545-d6f3-45f7-8bac-260f6079e85f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 NetworkManager[51724]: <info>  [1759267807.4269] manager: (tapb40318a3-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.433 2 INFO os_vif [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:58:1c,bridge_name='br-int',has_traffic_filtering=True,id=b40318a3-ceae-4ea1-8530-6e274dd81ed1,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40318a3-ce')
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.494 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.496 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.496 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No VIF found with MAC fa:16:3e:e3:58:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.497 2 INFO nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Using config drive
Sep 30 21:30:07 compute-1 kernel: tapb40318a3-ce: entered promiscuous mode
Sep 30 21:30:07 compute-1 NetworkManager[51724]: <info>  [1759267807.5907] manager: (tapb40318a3-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 ovn_controller[94902]: 2025-09-30T21:30:07Z|00295|binding|INFO|Claiming lport b40318a3-ceae-4ea1-8530-6e274dd81ed1 for this chassis.
Sep 30 21:30:07 compute-1 ovn_controller[94902]: 2025-09-30T21:30:07Z|00296|binding|INFO|b40318a3-ceae-4ea1-8530-6e274dd81ed1: Claiming fa:16:3e:e3:58:1c 10.100.0.5
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.614 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:58:1c 10.100.0.5'], port_security=['fa:16:3e:e3:58:1c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4f678545-d6f3-45f7-8bac-260f6079e85f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=b40318a3-ceae-4ea1-8530-6e274dd81ed1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.616 103861 INFO neutron.agent.ovn.metadata.agent [-] Port b40318a3-ceae-4ea1-8530-6e274dd81ed1 in datapath a145b225-510f-43a7-8cc6-fccae3ed647e bound to our chassis
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.620 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.636 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d6fd51-e65e-4e4e-9e9e-0bea257247e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.637 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa145b225-51 in ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.639 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa145b225-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.640 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f015137a-b23b-4f5c-8280-7623cdf9cc0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.641 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[983394ac-5f1b-4031-baf4-5d62c5bfa1c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 systemd-udevd[232369]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.662 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[64f1188b-2ef2-4099-87e4-018dfda12d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 NetworkManager[51724]: <info>  [1759267807.6692] device (tapb40318a3-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:30:07 compute-1 systemd-machined[152783]: New machine qemu-37-instance-0000004c.
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 NetworkManager[51724]: <info>  [1759267807.6737] device (tapb40318a3-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:30:07 compute-1 ovn_controller[94902]: 2025-09-30T21:30:07Z|00297|binding|INFO|Setting lport b40318a3-ceae-4ea1-8530-6e274dd81ed1 ovn-installed in OVS
Sep 30 21:30:07 compute-1 ovn_controller[94902]: 2025-09-30T21:30:07Z|00298|binding|INFO|Setting lport b40318a3-ceae-4ea1-8530-6e274dd81ed1 up in Southbound
Sep 30 21:30:07 compute-1 nova_compute[192795]: 2025-09-30 21:30:07.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.685 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5bffa443-11bc-4849-a552-c2df001a17d0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 systemd[1]: Started Virtual Machine qemu-37-instance-0000004c.
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.722 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f74a1642-9ae9-4152-ae15-b5a9d28284ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.727 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5ba658-dba6-4fcb-a38e-a8dfc765ed56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 systemd-udevd[232374]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:07 compute-1 NetworkManager[51724]: <info>  [1759267807.7300] manager: (tapa145b225-50): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.764 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[32aabcdd-2b46-45fe-9bd3-501eed90d486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.768 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec758a6-2e47-4415-8dcd-b42c2786be38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 NetworkManager[51724]: <info>  [1759267807.7970] device (tapa145b225-50): carrier: link connected
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.803 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[bc44196b-20c6-4916-b02b-b720f8894b3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.825 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f96847-8ddb-483e-ad9c-589351225482]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450347, 'reachable_time': 29318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232402, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.851 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4d807e12-fcf4-4384-8d84-ee2ee9e244b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:43a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450347, 'tstamp': 450347}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232403, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.878 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[770cf3ab-5f9f-4f1c-a0b2-92fa94fbf22b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450347, 'reachable_time': 29318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232406, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:07.923 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[095c5267-a9eb-40db-89eb-d4fb86387f47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.001 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2455cdf7-3e9d-4b41-93f8-3a17e854e0d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.002 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.003 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.003 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa145b225-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:08 compute-1 NetworkManager[51724]: <info>  [1759267808.0059] manager: (tapa145b225-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Sep 30 21:30:08 compute-1 kernel: tapa145b225-50: entered promiscuous mode
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.010 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa145b225-50, col_values=(('external_ids', {'iface-id': '0f20179e-1a66-48c7-97c7-a3ccb2b25749'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:08 compute-1 ovn_controller[94902]: 2025-09-30T21:30:08Z|00299|binding|INFO|Releasing lport 0f20179e-1a66-48c7-97c7-a3ccb2b25749 from this chassis (sb_readonly=0)
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.014 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.015 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f13ecf69-eaf1-48b0-b912-7813211c4e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.016 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:30:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:08.016 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'env', 'PROCESS_TAG=haproxy-a145b225-510f-43a7-8cc6-fccae3ed647e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a145b225-510f-43a7-8cc6-fccae3ed647e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.379 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267808.3789659, 4f678545-d6f3-45f7-8bac-260f6079e85f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.380 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] VM Resumed (Lifecycle Event)
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.382 2 DEBUG nova.compute.manager [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.386 2 INFO nova.virt.libvirt.driver [-] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Instance running successfully.
Sep 30 21:30:08 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.389 2 DEBUG nova.virt.libvirt.guest [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.389 2 DEBUG nova.virt.libvirt.driver [None req-bb5b3a28-6c99-4fca-9a72-ffddfed68a1b 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:30:08 compute-1 podman[232444]: 2025-09-30 21:30:08.446919975 +0000 UTC m=+0.063302584 container create 21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:30:08 compute-1 systemd[1]: Started libpod-conmon-21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48.scope.
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.487 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.492 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:08 compute-1 podman[232444]: 2025-09-30 21:30:08.41043246 +0000 UTC m=+0.026815089 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:30:08 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:30:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fd350a9d87dbdff467217c68dffa959749f94606639cd6cd66dbe7d0168f027/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:30:08 compute-1 podman[232444]: 2025-09-30 21:30:08.55922742 +0000 UTC m=+0.175610109 container init 21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:30:08 compute-1 podman[232444]: 2025-09-30 21:30:08.5669657 +0000 UTC m=+0.183348349 container start 21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:30:08 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232459]: [NOTICE]   (232463) : New worker (232465) forked
Sep 30 21:30:08 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232459]: [NOTICE]   (232463) : Loading success.
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.625 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.626 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267808.3817782, 4f678545-d6f3-45f7-8bac-260f6079e85f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.627 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] VM Started (Lifecycle Event)
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.659 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.664 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.697 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.910 2 DEBUG nova.network.neutron [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Updated VIF entry in instance network info cache for port b40318a3-ceae-4ea1-8530-6e274dd81ed1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.911 2 DEBUG nova.network.neutron [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Updating instance_info_cache with network_info: [{"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:08 compute-1 nova_compute[192795]: 2025-09-30 21:30:08.934 2 DEBUG oslo_concurrency.lockutils [req-cb67b3a4-752f-4d09-986d-b01a8052c29e req-50123cc6-9c0e-499a-8d78-8df465bbca69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-4f678545-d6f3-45f7-8bac-260f6079e85f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.335 2 DEBUG nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received event network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.336 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.336 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.337 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.337 2 DEBUG nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] No waiting events found dispatching network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.337 2 WARNING nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received unexpected event network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 for instance with vm_state resized and task_state None.
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.338 2 DEBUG nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received event network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.338 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.338 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.338 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.339 2 DEBUG nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] No waiting events found dispatching network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:09 compute-1 nova_compute[192795]: 2025-09-30 21:30:09.339 2 WARNING nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received unexpected event network-vif-plugged-b40318a3-ceae-4ea1-8530-6e274dd81ed1 for instance with vm_state resized and task_state None.
Sep 30 21:30:11 compute-1 podman[232475]: 2025-09-30 21:30:11.236454382 +0000 UTC m=+0.067613594 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, container_name=multipathd)
Sep 30 21:30:11 compute-1 podman[232477]: 2025-09-30 21:30:11.251482707 +0000 UTC m=+0.079926604 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:30:11 compute-1 podman[232476]: 2025-09-30 21:30:11.29448674 +0000 UTC m=+0.127016330 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Sep 30 21:30:12 compute-1 nova_compute[192795]: 2025-09-30 21:30:12.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:13 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:30:13 compute-1 systemd[232315]: Activating special unit Exit the Session...
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped target Main User Target.
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped target Basic System.
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped target Paths.
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped target Sockets.
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped target Timers.
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:30:13 compute-1 systemd[232315]: Closed D-Bus User Message Bus Socket.
Sep 30 21:30:13 compute-1 systemd[232315]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:30:13 compute-1 systemd[232315]: Removed slice User Application Slice.
Sep 30 21:30:13 compute-1 systemd[232315]: Reached target Shutdown.
Sep 30 21:30:13 compute-1 systemd[232315]: Finished Exit the Session.
Sep 30 21:30:13 compute-1 systemd[232315]: Reached target Exit the Session.
Sep 30 21:30:13 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:30:13 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:30:13 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:30:13 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:30:13 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:30:13 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:30:13 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:30:14 compute-1 podman[232545]: 2025-09-30 21:30:14.251752434 +0000 UTC m=+0.082682023 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, container_name=ceilometer_agent_compute)
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.308 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.309 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.338 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.509 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.510 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.517 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.518 2 INFO nova.compute.claims [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.604 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.632 2 WARNING nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.632 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid 4f678545-d6f3-45f7-8bac-260f6079e85f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.633 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid f54ded8f-9992-46a9-af52-0cfa1b80a50a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.633 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "4f678545-d6f3-45f7-8bac-260f6079e85f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.633 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.634 2 INFO nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] During sync_power_state the instance has a pending task (deleting). Skip.
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.634 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.634 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.677 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "4f678545-d6f3-45f7-8bac-260f6079e85f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.678 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.678 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.680 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.680 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.690 2 DEBUG nova.compute.provider_tree [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.712 2 INFO nova.compute.manager [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Terminating instance
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.726 2 DEBUG nova.scheduler.client.report [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.748 2 DEBUG nova.compute.manager [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:30:17 compute-1 kernel: tapb40318a3-ce (unregistering): left promiscuous mode
Sep 30 21:30:17 compute-1 NetworkManager[51724]: <info>  [1759267817.7813] device (tapb40318a3-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.784 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.787 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:17 compute-1 ovn_controller[94902]: 2025-09-30T21:30:17Z|00300|binding|INFO|Releasing lport b40318a3-ceae-4ea1-8530-6e274dd81ed1 from this chassis (sb_readonly=0)
Sep 30 21:30:17 compute-1 ovn_controller[94902]: 2025-09-30T21:30:17Z|00301|binding|INFO|Setting lport b40318a3-ceae-4ea1-8530-6e274dd81ed1 down in Southbound
Sep 30 21:30:17 compute-1 ovn_controller[94902]: 2025-09-30T21:30:17Z|00302|binding|INFO|Removing iface tapb40318a3-ce ovn-installed in OVS
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:17.812 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:58:1c 10.100.0.5'], port_security=['fa:16:3e:e3:58:1c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4f678545-d6f3-45f7-8bac-260f6079e85f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=b40318a3-ceae-4ea1-8530-6e274dd81ed1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:17.813 103861 INFO neutron.agent.ovn.metadata.agent [-] Port b40318a3-ceae-4ea1-8530-6e274dd81ed1 in datapath a145b225-510f-43a7-8cc6-fccae3ed647e unbound from our chassis
Sep 30 21:30:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:17.814 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a145b225-510f-43a7-8cc6-fccae3ed647e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:30:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:17.816 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[78dbedb7-5cd4-4d74-a788-6f7663610c66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:17.817 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace which is not needed anymore
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:17 compute-1 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Sep 30 21:30:17 compute-1 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Consumed 10.166s CPU time.
Sep 30 21:30:17 compute-1 systemd-machined[152783]: Machine qemu-37-instance-0000004c terminated.
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.879 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.881 2 DEBUG nova.network.neutron [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.924 2 INFO nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.957 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232459]: [NOTICE]   (232463) : haproxy version is 2.8.14-c23fe91
Sep 30 21:30:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232459]: [NOTICE]   (232463) : path to executable is /usr/sbin/haproxy
Sep 30 21:30:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232459]: [WARNING]  (232463) : Exiting Master process...
Sep 30 21:30:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232459]: [ALERT]    (232463) : Current worker (232465) exited with code 143 (Terminated)
Sep 30 21:30:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232459]: [WARNING]  (232463) : All workers exited. Exiting... (0)
Sep 30 21:30:17 compute-1 nova_compute[192795]: 2025-09-30 21:30:17.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:17 compute-1 systemd[1]: libpod-21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48.scope: Deactivated successfully.
Sep 30 21:30:17 compute-1 conmon[232459]: conmon 21e6237b5d7ca6adfbc9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48.scope/container/memory.events
Sep 30 21:30:17 compute-1 podman[232590]: 2025-09-30 21:30:17.992848673 +0000 UTC m=+0.060013148 container died 21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:30:18 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48-userdata-shm.mount: Deactivated successfully.
Sep 30 21:30:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-2fd350a9d87dbdff467217c68dffa959749f94606639cd6cd66dbe7d0168f027-merged.mount: Deactivated successfully.
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.044 2 INFO nova.virt.libvirt.driver [-] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Instance destroyed successfully.
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.044 2 DEBUG nova.objects.instance [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'resources' on Instance uuid 4f678545-d6f3-45f7-8bac-260f6079e85f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:18 compute-1 podman[232590]: 2025-09-30 21:30:18.061536995 +0000 UTC m=+0.128701380 container cleanup 21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:30:18 compute-1 systemd[1]: libpod-conmon-21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48.scope: Deactivated successfully.
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.076 2 DEBUG nova.virt.libvirt.vif [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:29:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1448324042',display_name='tempest-ServerDiskConfigTestJSON-server-1448324042',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1448324042',id=76,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-an5by0yd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:12Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=4f678545-d6f3-45f7-8bac-260f6079e85f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.077 2 DEBUG nova.network.os_vif_util [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "address": "fa:16:3e:e3:58:1c", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40318a3-ce", "ovs_interfaceid": "b40318a3-ceae-4ea1-8530-6e274dd81ed1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.078 2 DEBUG nova.network.os_vif_util [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:58:1c,bridge_name='br-int',has_traffic_filtering=True,id=b40318a3-ceae-4ea1-8530-6e274dd81ed1,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40318a3-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.078 2 DEBUG os_vif [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:58:1c,bridge_name='br-int',has_traffic_filtering=True,id=b40318a3-ceae-4ea1-8530-6e274dd81ed1,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40318a3-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.081 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb40318a3-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.087 2 INFO os_vif [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:58:1c,bridge_name='br-int',has_traffic_filtering=True,id=b40318a3-ceae-4ea1-8530-6e274dd81ed1,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40318a3-ce')
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.088 2 INFO nova.virt.libvirt.driver [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Deleting instance files /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f_del
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.089 2 INFO nova.virt.libvirt.driver [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Deletion of /var/lib/nova/instances/4f678545-d6f3-45f7-8bac-260f6079e85f_del complete
Sep 30 21:30:18 compute-1 podman[232636]: 2025-09-30 21:30:18.13409374 +0000 UTC m=+0.046074862 container remove 21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.136 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.138 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.138 2 INFO nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Creating image(s)
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.138 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "/var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.139 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "/var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.139 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "/var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.143 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6e18faec-bd41-4e4f-b687-1c2852322e91]: (4, ('Tue Sep 30 09:30:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48)\n21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48\nTue Sep 30 09:30:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48)\n21e6237b5d7ca6adfbc9fdbd9a5c946dfb4ddb17bf4709bcc5ae51c97f499a48\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.145 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[69aa08ed-254d-4704-9ec4-4dcf6afc781c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.146 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:18 compute-1 kernel: tapa145b225-50: left promiscuous mode
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.156 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.164 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b67c48aa-a665-4e77-a81c-d9915da2620e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.207 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8f27e61d-33f2-466b-9d72-67ed4b583aba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.208 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e2f9d9-da3f-4a38-8f96-52ca95fe60be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.212 2 INFO nova.compute.manager [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Took 0.46 seconds to destroy the instance on the hypervisor.
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.213 2 DEBUG oslo.service.loopingcall [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.213 2 DEBUG nova.compute.manager [-] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.214 2 DEBUG nova.network.neutron [-] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.219 2 DEBUG nova.network.neutron [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.219 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.223 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.223 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.224 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.226 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd23ddf-0c58-4185-b1cf-dffcb8fcb801]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450338, 'reachable_time': 29351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232652, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.230 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:30:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:18.230 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[034c64db-9c63-4d66-a7d4-b8913e7cf759]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:18 compute-1 systemd[1]: run-netns-ovnmeta\x2da145b225\x2d510f\x2d43a7\x2d8cc6\x2dfccae3ed647e.mount: Deactivated successfully.
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.235 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.296 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.297 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.340 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.341 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.342 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.403 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.406 2 DEBUG nova.virt.disk.api [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Checking if we can resize image /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.407 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.464 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.465 2 DEBUG nova.virt.disk.api [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Cannot resize image /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.465 2 DEBUG nova.objects.instance [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lazy-loading 'migration_context' on Instance uuid f54ded8f-9992-46a9-af52-0cfa1b80a50a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.483 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.484 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Ensure instance console log exists: /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.485 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.485 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.485 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.487 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.493 2 WARNING nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.501 2 DEBUG nova.virt.libvirt.host [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.502 2 DEBUG nova.virt.libvirt.host [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.505 2 DEBUG nova.virt.libvirt.host [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.505 2 DEBUG nova.virt.libvirt.host [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.507 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.507 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.507 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.508 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.508 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.508 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.508 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.509 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.509 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.509 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.509 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.510 2 DEBUG nova.virt.hardware [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.513 2 DEBUG nova.objects.instance [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lazy-loading 'pci_devices' on Instance uuid f54ded8f-9992-46a9-af52-0cfa1b80a50a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.555 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <uuid>f54ded8f-9992-46a9-af52-0cfa1b80a50a</uuid>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <name>instance-0000004f</name>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <nova:name>tempest-ListImageFiltersTestJSON-server-777079542</nova:name>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:30:18</nova:creationTime>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:30:18 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:30:18 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:30:18 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:30:18 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:30:18 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:30:18 compute-1 nova_compute[192795]:         <nova:user uuid="5a7a790edd5341f99e0eea28163b2823">tempest-ListImageFiltersTestJSON-1719875855-project-member</nova:user>
Sep 30 21:30:18 compute-1 nova_compute[192795]:         <nova:project uuid="a29c3539646b4519bc06a42b7e32df43">tempest-ListImageFiltersTestJSON-1719875855</nova:project>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <nova:ports/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <system>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <entry name="serial">f54ded8f-9992-46a9-af52-0cfa1b80a50a</entry>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <entry name="uuid">f54ded8f-9992-46a9-af52-0cfa1b80a50a</entry>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </system>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <os>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   </os>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <features>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   </features>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.config"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/console.log" append="off"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <video>
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </video>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:30:18 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:30:18 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:30:18 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:30:18 compute-1 nova_compute[192795]: </domain>
Sep 30 21:30:18 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.623 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.624 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.624 2 INFO nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Using config drive
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.869 2 INFO nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Creating config drive at /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.config
Sep 30 21:30:18 compute-1 nova_compute[192795]: 2025-09-30 21:30:18.876 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeifpgyax execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.020 2 DEBUG oslo_concurrency.processutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeifpgyax" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:19 compute-1 systemd-machined[152783]: New machine qemu-38-instance-0000004f.
Sep 30 21:30:19 compute-1 systemd[1]: Started Virtual Machine qemu-38-instance-0000004f.
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.501 2 DEBUG nova.network.neutron [-] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.560 2 INFO nova.compute.manager [-] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Took 1.35 seconds to deallocate network for instance.
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.794 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.795 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.800 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.836 2 INFO nova.scheduler.client.report [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Deleted allocations for instance 4f678545-d6f3-45f7-8bac-260f6079e85f
Sep 30 21:30:19 compute-1 nova_compute[192795]: 2025-09-30 21:30:19.978 2 DEBUG oslo_concurrency.lockutils [None req-23e4b6d7-036d-4d6e-bc0b-3d48cc7879f3 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "4f678545-d6f3-45f7-8bac-260f6079e85f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.214 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267820.213982, f54ded8f-9992-46a9-af52-0cfa1b80a50a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.215 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] VM Resumed (Lifecycle Event)
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.217 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.218 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.221 2 INFO nova.virt.libvirt.driver [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Instance spawned successfully.
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.222 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.238 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.242 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.260 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.260 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.261 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.261 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.261 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.262 2 DEBUG nova.virt.libvirt.driver [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.322 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.323 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267820.2168198, f54ded8f-9992-46a9-af52-0cfa1b80a50a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.323 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] VM Started (Lifecycle Event)
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.382 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.387 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.428 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.459 2 INFO nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Took 2.32 seconds to spawn the instance on the hypervisor.
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.460 2 DEBUG nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.610 2 INFO nova.compute.manager [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Took 3.18 seconds to build instance.
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.642 2 DEBUG oslo_concurrency.lockutils [None req-c33ae30d-03ab-4a77-8074-5c5654b54471 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.643 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:20 compute-1 nova_compute[192795]: 2025-09-30 21:30:20.675 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:21 compute-1 nova_compute[192795]: 2025-09-30 21:30:21.255 2 DEBUG nova.compute.manager [req-164a66bb-d1bd-419f-a3b4-f92378802bc4 req-efdd8528-2c80-4483-a0cb-f853dca322da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Received event network-vif-deleted-b40318a3-ceae-4ea1-8530-6e274dd81ed1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:22 compute-1 podman[232695]: 2025-09-30 21:30:22.233203464 +0000 UTC m=+0.063543362 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:30:22 compute-1 podman[232696]: 2025-09-30 21:30:22.233510844 +0000 UTC m=+0.063736518 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Sep 30 21:30:22 compute-1 podman[232694]: 2025-09-30 21:30:22.257375888 +0000 UTC m=+0.092847436 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 21:30:22 compute-1 nova_compute[192795]: 2025-09-30 21:30:22.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:23 compute-1 nova_compute[192795]: 2025-09-30 21:30:23.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:27 compute-1 nova_compute[192795]: 2025-09-30 21:30:27.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:28 compute-1 nova_compute[192795]: 2025-09-30 21:30:28.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:32 compute-1 podman[232772]: 2025-09-30 21:30:32.222422212 +0000 UTC m=+0.061635881 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 21:30:32 compute-1 nova_compute[192795]: 2025-09-30 21:30:32.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:32 compute-1 nova_compute[192795]: 2025-09-30 21:30:32.501 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:32 compute-1 nova_compute[192795]: 2025-09-30 21:30:32.501 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:32 compute-1 nova_compute[192795]: 2025-09-30 21:30:32.525 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.036 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267818.034559, 4f678545-d6f3-45f7-8bac-260f6079e85f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.036 2 INFO nova.compute.manager [-] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] VM Stopped (Lifecycle Event)
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.655 2 DEBUG nova.compute.manager [None req-9573bfb6-b77c-4c9e-9ea8-f2c5f39d088d - - - - - -] [instance: 4f678545-d6f3-45f7-8bac-260f6079e85f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.788 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.789 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.799 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.800 2 INFO nova.compute.claims [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.925 2 DEBUG nova.compute.manager [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:33 compute-1 nova_compute[192795]: 2025-09-30 21:30:33.988 2 INFO nova.compute.manager [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] instance snapshotting
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.021 2 DEBUG nova.compute.provider_tree [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.037 2 DEBUG nova.scheduler.client.report [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.066 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.067 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.147 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.148 2 DEBUG nova.network.neutron [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.175 2 INFO nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.222 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.321 2 INFO nova.virt.libvirt.driver [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Beginning live snapshot process
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.342 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.344 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.344 2 INFO nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Creating image(s)
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.345 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "/var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.345 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "/var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.346 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "/var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.357 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.420 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.421 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.422 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.432 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.494 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.496 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.552 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.554 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.555 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.601 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.620 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.621 2 DEBUG nova.virt.disk.api [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Checking if we can resize image /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.621 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.663 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.664 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.689 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.691 2 DEBUG nova.virt.disk.api [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Cannot resize image /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.691 2 DEBUG nova.objects.instance [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lazy-loading 'migration_context' on Instance uuid b2a2285f-4f58-421c-a234-d42cebc7e645 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.715 2 DEBUG nova.policy [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b78230ec3561401bac41d6a12631e379', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '635d1a8848cd4101b935e658c05f9037', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.719 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.719 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Ensure instance console log exists: /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.719 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.720 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.720 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.737 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.748 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.821 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.822 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp7rfm58wx/1c363206be854675a6326ecd0bfc3136.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.857 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp7rfm58wx/1c363206be854675a6326ecd0bfc3136.delta 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.858 2 INFO nova.virt.libvirt.driver [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:30:34 compute-1 nova_compute[192795]: 2025-09-30 21:30:34.918 2 DEBUG nova.virt.libvirt.guest [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:30:35 compute-1 nova_compute[192795]: 2025-09-30 21:30:35.422 2 DEBUG nova.virt.libvirt.guest [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:30:35 compute-1 nova_compute[192795]: 2025-09-30 21:30:35.425 2 INFO nova.virt.libvirt.driver [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:30:35 compute-1 nova_compute[192795]: 2025-09-30 21:30:35.462 2 DEBUG nova.privsep.utils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:30:35 compute-1 nova_compute[192795]: 2025-09-30 21:30:35.463 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp7rfm58wx/1c363206be854675a6326ecd0bfc3136.delta /var/lib/nova/instances/snapshots/tmp7rfm58wx/1c363206be854675a6326ecd0bfc3136 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:35 compute-1 nova_compute[192795]: 2025-09-30 21:30:35.854 2 DEBUG oslo_concurrency.processutils [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp7rfm58wx/1c363206be854675a6326ecd0bfc3136.delta /var/lib/nova/instances/snapshots/tmp7rfm58wx/1c363206be854675a6326ecd0bfc3136" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:35 compute-1 nova_compute[192795]: 2025-09-30 21:30:35.861 2 INFO nova.virt.libvirt.driver [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Snapshot extracted, beginning image upload
Sep 30 21:30:37 compute-1 nova_compute[192795]: 2025-09-30 21:30:37.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:37 compute-1 nova_compute[192795]: 2025-09-30 21:30:37.649 2 DEBUG nova.network.neutron [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Successfully created port: 2881fce3-7ad4-48e8-9a8d-db239d0022f7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:30:38 compute-1 nova_compute[192795]: 2025-09-30 21:30:38.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:38.690 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:38.693 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:38.694 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.101 2 INFO nova.virt.libvirt.driver [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Snapshot image upload complete
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.102 2 INFO nova.compute.manager [None req-181d4cde-bf14-4eed-9497-a3a47612dc9b 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Took 6.10 seconds to snapshot the instance on the hypervisor.
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.763 2 DEBUG nova.network.neutron [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Successfully updated port: 2881fce3-7ad4-48e8-9a8d-db239d0022f7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.794 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.794 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquired lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.794 2 DEBUG nova.network.neutron [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.922 2 DEBUG nova.compute.manager [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-changed-2881fce3-7ad4-48e8-9a8d-db239d0022f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.923 2 DEBUG nova.compute.manager [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Refreshing instance network info cache due to event network-changed-2881fce3-7ad4-48e8-9a8d-db239d0022f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:30:40 compute-1 nova_compute[192795]: 2025-09-30 21:30:40.924 2 DEBUG oslo_concurrency.lockutils [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:41 compute-1 nova_compute[192795]: 2025-09-30 21:30:41.706 2 DEBUG nova.network.neutron [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:30:42 compute-1 podman[232840]: 2025-09-30 21:30:42.235798873 +0000 UTC m=+0.064931751 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:30:42 compute-1 podman[232838]: 2025-09-30 21:30:42.25188822 +0000 UTC m=+0.089897069 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:30:42 compute-1 podman[232839]: 2025-09-30 21:30:42.297021553 +0000 UTC m=+0.120418855 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:30:42 compute-1 nova_compute[192795]: 2025-09-30 21:30:42.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.762 2 DEBUG nova.network.neutron [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Updating instance_info_cache with network_info: [{"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.783 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Releasing lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.783 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Instance network_info: |[{"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.784 2 DEBUG oslo_concurrency.lockutils [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.784 2 DEBUG nova.network.neutron [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Refreshing network info cache for port 2881fce3-7ad4-48e8-9a8d-db239d0022f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.787 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Start _get_guest_xml network_info=[{"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.793 2 WARNING nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.802 2 DEBUG nova.virt.libvirt.host [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.803 2 DEBUG nova.virt.libvirt.host [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.808 2 DEBUG nova.virt.libvirt.host [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.809 2 DEBUG nova.virt.libvirt.host [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.810 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.810 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.810 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.810 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.810 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.811 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.811 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.811 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.811 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.811 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.811 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.811 2 DEBUG nova.virt.hardware [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.815 2 DEBUG nova.virt.libvirt.vif [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=82,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFN7KIh9qkgYEW6aThUuoTAsfcOLNa55Wul/dSiKIZ6So0n1pI8fHMF8UmXfjLdVKB6SeKF645kYs+A/DgulmwQczcwL+uJu4BEcndBwtEwJpe973HTi196O56mrV1E+Ow==',key_name='tempest-keypair-873952531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='635d1a8848cd4101b935e658c05f9037',ramdisk_id='',reservation_id='r-7g2on38v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-467131096',owner_user_name='tempest-ServersV294TestFqdnHostnames-467131096-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b78230ec3561401bac41d6a12631e379',uuid=b2a2285f-4f58-421c-a234-d42cebc7e645,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.815 2 DEBUG nova.network.os_vif_util [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Converting VIF {"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.816 2 DEBUG nova.network.os_vif_util [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c6:1f,bridge_name='br-int',has_traffic_filtering=True,id=2881fce3-7ad4-48e8-9a8d-db239d0022f7,network=Network(196d7954-47b3-470a-991c-3398bf1a7372),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2881fce3-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.817 2 DEBUG nova.objects.instance [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lazy-loading 'pci_devices' on Instance uuid b2a2285f-4f58-421c-a234-d42cebc7e645 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.836 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <uuid>b2a2285f-4f58-421c-a234-d42cebc7e645</uuid>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <name>instance-00000052</name>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <nova:name>guest-instance-1</nova:name>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:30:43</nova:creationTime>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:user uuid="b78230ec3561401bac41d6a12631e379">tempest-ServersV294TestFqdnHostnames-467131096-project-member</nova:user>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:project uuid="635d1a8848cd4101b935e658c05f9037">tempest-ServersV294TestFqdnHostnames-467131096</nova:project>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         <nova:port uuid="2881fce3-7ad4-48e8-9a8d-db239d0022f7">
Sep 30 21:30:43 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <system>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <entry name="serial">b2a2285f-4f58-421c-a234-d42cebc7e645</entry>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <entry name="uuid">b2a2285f-4f58-421c-a234-d42cebc7e645</entry>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </system>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <os>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   </os>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <features>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   </features>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.config"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:61:c6:1f"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <target dev="tap2881fce3-7a"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/console.log" append="off"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <video>
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </video>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:30:43 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:30:43 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:30:43 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:30:43 compute-1 nova_compute[192795]: </domain>
Sep 30 21:30:43 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.836 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Preparing to wait for external event network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.836 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.837 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.837 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.838 2 DEBUG nova.virt.libvirt.vif [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=82,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFN7KIh9qkgYEW6aThUuoTAsfcOLNa55Wul/dSiKIZ6So0n1pI8fHMF8UmXfjLdVKB6SeKF645kYs+A/DgulmwQczcwL+uJu4BEcndBwtEwJpe973HTi196O56mrV1E+Ow==',key_name='tempest-keypair-873952531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='635d1a8848cd4101b935e658c05f9037',ramdisk_id='',reservation_id='r-7g2on38v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-467131096',owner_user_name='tempest-ServersV294TestFqdnHostnames-467131096-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b78230ec3561401bac41d6a12631e379',uuid=b2a2285f-4f58-421c-a234-d42cebc7e645,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.838 2 DEBUG nova.network.os_vif_util [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Converting VIF {"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.839 2 DEBUG nova.network.os_vif_util [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c6:1f,bridge_name='br-int',has_traffic_filtering=True,id=2881fce3-7ad4-48e8-9a8d-db239d0022f7,network=Network(196d7954-47b3-470a-991c-3398bf1a7372),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2881fce3-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.839 2 DEBUG os_vif [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c6:1f,bridge_name='br-int',has_traffic_filtering=True,id=2881fce3-7ad4-48e8-9a8d-db239d0022f7,network=Network(196d7954-47b3-470a-991c-3398bf1a7372),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2881fce3-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.846 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2881fce3-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.847 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2881fce3-7a, col_values=(('external_ids', {'iface-id': '2881fce3-7ad4-48e8-9a8d-db239d0022f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:c6:1f', 'vm-uuid': 'b2a2285f-4f58-421c-a234-d42cebc7e645'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-1 NetworkManager[51724]: <info>  [1759267843.8497] manager: (tap2881fce3-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.857 2 INFO os_vif [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c6:1f,bridge_name='br-int',has_traffic_filtering=True,id=2881fce3-7ad4-48e8-9a8d-db239d0022f7,network=Network(196d7954-47b3-470a-991c-3398bf1a7372),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2881fce3-7a')
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.944 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.944 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.945 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] No VIF found with MAC fa:16:3e:61:c6:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:30:43 compute-1 nova_compute[192795]: 2025-09-30 21:30:43.945 2 INFO nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Using config drive
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.019 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a29c3539646b4519bc06a42b7e32df43', 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'hostId': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.022 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b2a2285f-4f58-421c-a234-d42cebc7e645', 'name': 'guest-instance-1', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000052', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '635d1a8848cd4101b935e658c05f9037', 'user_id': 'b78230ec3561401bac41d6a12631e379', 'hostId': '0b1ecff6bc899aeee3829d37d0cf07d1c22f8f3cd10ade2eb0568a5c', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.022 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.025 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.025 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.044 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.read.bytes volume: 30493184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.044 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.046 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39133067-5ecf-4fa6-a71e-ab79ecd2c303', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30493184, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.025695', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8986f60-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': 'eefb60355098b5f1029cb862449e138ea5df4b796fb294fe5ed5c850f801dad8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 
'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.025695', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8987b72-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '01a7e0690823d415b94b732385f718d8692d4747996274ab769688520d584e5a'}]}, 'timestamp': '2025-09-30 21:30:44.046282', '_unique_id': 'd17365938b9e451ca806b793a2260345'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.049 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.051 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.051 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.051 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.052 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.068 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/memory.usage volume: 41.0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.069 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17170d38-e05d-4832-9f55-5ed119bb4edc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 41.0, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'timestamp': '2025-09-30T21:30:44.052121', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b89c0fda-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.800843996, 'message_signature': 'fbc15b38c506a5c116ab586ad86e1da05752bc704e11eca18bb3abd914cf5fc3'}]}, 'timestamp': '2025-09-30 21:30:44.069621', '_unique_id': '183b7b10221e431b9cfe613328a12b0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.070 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.071 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.write.latency volume: 3495536284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.071 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.072 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e7dfba4-4a1e-4331-b645-2ec2af3f9230', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3495536284, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.071407', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b89c849c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '32823330a5b4160a79aa336d591b327a2a85337d615df21c335c418ebf8630f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 
'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.071407', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b89c8dac-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '8b831a379b7a71319b5126b3067e4150c9b45ab328f6a93c92a2c7b4eda235ae'}]}, 'timestamp': '2025-09-30 21:30:44.072518', '_unique_id': 'e3d63b7c17db49bcaffd8400c7defb7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>]
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.074 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.074 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>]
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.074 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.083 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.083 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.084 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de4c210f-a817-43ac-b7b2-4db74652199c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.074383', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b89e4e8a-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.807131106, 'message_signature': '09116470d6375288321f92b86aa1731de9e8f5982f16743f098d9ce7e56ae52c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 
'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.074383', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b89e5966-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.807131106, 'message_signature': 'bea13fe7f183291015af0eb7cfd8e14a705f770ede7c4ac6f2652c712b87fcb3'}]}, 'timestamp': '2025-09-30 21:30:44.084217', '_unique_id': '6898f6adf5ea4346adf291940f95d409'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.086 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.086 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.086 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.write.requests volume: 304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.087 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.087 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10ac78a9-18dc-466a-8aeb-92719dc3d795', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 304, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.086879', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b89ee07a-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '17145711d65a09306f9a07a0bf873122d38ebbe549f2cafb5aea39fa31d2ae1d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': 
None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.086879', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b89ee868-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '225e2826a9c3a7414956f96bcd62abdeeba5ccbca181a97a0bae055843d19fc5'}]}, 'timestamp': '2025-09-30 21:30:44.087901', '_unique_id': '1c7cea6939384a9c8caaea6415d22033'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.088 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.089 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.089 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>]
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.089 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.write.bytes volume: 72835072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.089 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.090 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e601a58-4304-473d-99d4-cd7c7fd217e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72835072, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.089526', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b89f47c2-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '0e0ce3dc6fae53fd859385660612588b2871f920e7e4cf0722f660db44eb17dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 
'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.089526', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b89f4fc4-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': 'd44c07e249bd3edc36de7d0363d9eb3b7a3aa868b8147739d89cb06426bc661a'}]}, 'timestamp': '2025-09-30 21:30:44.090553', '_unique_id': '72683bed5b4c460299f12f2bfb3e6363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.092 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.092 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.092 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/cpu volume: 11620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff1efd11-faef-4e6e-b75f-c2bf9bc40d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11620000000, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'timestamp': '2025-09-30T21:30:44.092450', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b89fbb62-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.800843996, 'message_signature': '63bb06ead6e2a610743cc3d023f753dc37c3e55c758aa07075928423e7b3be38'}]}, 'timestamp': '2025-09-30 21:30:44.093245', '_unique_id': '17aaa11f7fbf43beae2b771d61d2aec6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.093 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.094 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.read.latency volume: 586023818 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.094 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.read.latency volume: 94321131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.095 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '552828a1-2cfe-404a-bcf9-d1616fc551d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 586023818, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.094453', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8a0093c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '0ee1a9c55874da2103e6c70649e44cd2c8bb5ac117741abdc06564c92d608312'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 94321131, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.094453', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8a01396-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '1bf4e4afe1822ae95b897eac12abbdc8ed1beb6eafbc571ebd81bfa5024871db'}]}, 'timestamp': '2025-09-30 21:30:44.095552', '_unique_id': '0e8aca4c888f4b91a2a5ebce2d10677c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.097 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.097 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.098 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-777079542>, <NovaLikeServer: guest-instance-1>]
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.098 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.098 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb7d07d9-aa27-4ef5-95bf-e236fb7f2278', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.098257', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8a09f8c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '9f665bc6630d3b2073a4142df522fef9c711cb08029226e8dc5809a320214ff3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': 
None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.098257', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8a0a8ce-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.758443036, 'message_signature': '92d3c1891cdeca914e4f1e14ea5b3aea44968f32fb2a99d3ce5eb23c69184c9a'}]}, 'timestamp': '2025-09-30 21:30:44.099321', '_unique_id': '28a85cddd67e49b782155d2d32204ceb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.099 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.100 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.100 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f038f1d-6b82-4e26-8eaf-2b26b3904a46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.100499', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8a0f52c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.807131106, 'message_signature': '8afd1083c67b6b3b1e7e2d6ab635582e9ad9ea8eda7cc8ec925d75c9edd9b1ea'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 
'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.100499', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8a0ff0e-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.807131106, 'message_signature': '2bdf180486dbce1ebac08a0292f3c7f60a6eb0d3c919870ff7ed51d3dc869fca'}]}, 'timestamp': '2025-09-30 21:30:44.101495', '_unique_id': '7a78b32042e347b5a084ca03b4554a99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.103 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.103 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.103 12 DEBUG ceilometer.compute.pollsters [-] f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '775096b7-85f0-44d0-a0e0-1a7f5e6cc8ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-vda', 'timestamp': '2025-09-30T21:30:44.103453', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8a168fe-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.807131106, 'message_signature': 'a58d32ca7cab62d5356a16ee7d63f1dd075bd938d88b4deede027e154969ad3b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5a7a790edd5341f99e0eea28163b2823', 'user_name': None, 'project_id': 'a29c3539646b4519bc06a42b7e32df43', 'project_name': None, 'resource_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a-sda', 'timestamp': '2025-09-30T21:30:44.103453', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-777079542', 'name': 'instance-0000004f', 'instance_id': 'f54ded8f-9992-46a9-af52-0cfa1b80a50a', 'instance_type': 'm1.nano', 'host': '4ab20a8903fdd8bd1485275ef143b14af81d816f263098de8e0f6c62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8a1718c-9e44-11f0-984a-fa163e8033fc', 'monotonic_time': 4539.807131106, 'message_signature': 'd509043e8df8f95a621f9de47d05a8c5471d1c43c02c7ecea095ceb70764c23b'}]}, 'timestamp': '2025-09-30 21:30:44.104453', '_unique_id': 'b01284336f7f4829ac75bf4d8f67951e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:30:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:30:44.106 12 DEBUG ceilometer.compute.pollsters [-] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000052, id=b2a2285f-4f58-421c-a234-d42cebc7e645>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.715 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.715 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.716 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.716 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.790 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.847 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.848 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.912 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:44 compute-1 nova_compute[192795]: 2025-09-30 21:30:44.918 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.013 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.015 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.076 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.078 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000052, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.config'
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.161 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.163 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.215 2 INFO nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Creating config drive at /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.config
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.221 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqwf30_ro execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:45 compute-1 podman[232921]: 2025-09-30 21:30:45.245638196 +0000 UTC m=+0.079846775 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.284 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.285 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5538MB free_disk=73.3569564819336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.286 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.286 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.353 2 DEBUG oslo_concurrency.processutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqwf30_ro" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.413 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance f54ded8f-9992-46a9-af52-0cfa1b80a50a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.415 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance b2a2285f-4f58-421c-a234-d42cebc7e645 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.415 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.415 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:30:45 compute-1 kernel: tap2881fce3-7a: entered promiscuous mode
Sep 30 21:30:45 compute-1 NetworkManager[51724]: <info>  [1759267845.4381] manager: (tap2881fce3-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 ovn_controller[94902]: 2025-09-30T21:30:45Z|00303|binding|INFO|Claiming lport 2881fce3-7ad4-48e8-9a8d-db239d0022f7 for this chassis.
Sep 30 21:30:45 compute-1 ovn_controller[94902]: 2025-09-30T21:30:45Z|00304|binding|INFO|2881fce3-7ad4-48e8-9a8d-db239d0022f7: Claiming fa:16:3e:61:c6:1f 10.100.0.5
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 systemd-udevd[232958]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.476 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:c6:1f 10.100.0.5'], port_security=['fa:16:3e:61:c6:1f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b2a2285f-4f58-421c-a234-d42cebc7e645', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-196d7954-47b3-470a-991c-3398bf1a7372', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '635d1a8848cd4101b935e658c05f9037', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53e22dca-c5b6-4e3b-a687-e4ccb8d37523', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5a90e13-3205-4d6c-9861-d027fc810892, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2881fce3-7ad4-48e8-9a8d-db239d0022f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.477 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2881fce3-7ad4-48e8-9a8d-db239d0022f7 in datapath 196d7954-47b3-470a-991c-3398bf1a7372 bound to our chassis
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.479 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 196d7954-47b3-470a-991c-3398bf1a7372
Sep 30 21:30:45 compute-1 NetworkManager[51724]: <info>  [1759267845.4843] device (tap2881fce3-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:30:45 compute-1 NetworkManager[51724]: <info>  [1759267845.4854] device (tap2881fce3-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:30:45 compute-1 systemd-machined[152783]: New machine qemu-39-instance-00000052.
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.492 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1c41aafb-5f44-4b44-ba5c-0ec3beacf2d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.493 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap196d7954-41 in ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.496 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap196d7954-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.496 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[05233439-e760-42e6-a163-c2d01aa09a3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.497 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cfed4176-9106-4920-9354-e7b1f4ad2b33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_controller[94902]: 2025-09-30T21:30:45Z|00305|binding|INFO|Setting lport 2881fce3-7ad4-48e8-9a8d-db239d0022f7 ovn-installed in OVS
Sep 30 21:30:45 compute-1 ovn_controller[94902]: 2025-09-30T21:30:45Z|00306|binding|INFO|Setting lport 2881fce3-7ad4-48e8-9a8d-db239d0022f7 up in Southbound
Sep 30 21:30:45 compute-1 systemd[1]: Started Virtual Machine qemu-39-instance-00000052.
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.509 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[35d286fa-05bb-419a-9a56-b40b7ec924a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.535 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe2be43-fb87-4bfa-80a1-9b30703b14cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.563 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1300c4-aabf-44c3-8d57-8ea8323a5b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 NetworkManager[51724]: <info>  [1759267845.5722] manager: (tap196d7954-40): new Veth device (/org/freedesktop/NetworkManager/Devices/151)
Sep 30 21:30:45 compute-1 systemd-udevd[232963]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.569 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[37f9d013-f184-44a0-926e-771459832296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.574 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.604 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.610 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[253cea1a-2657-461b-a920-96be1146813e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.614 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[1f96dd88-1ac2-4c59-a2a1-d20e32b4dbb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.634 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.635 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:45 compute-1 NetworkManager[51724]: <info>  [1759267845.6499] device (tap196d7954-40): carrier: link connected
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.657 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0faa8214-1544-463d-b0e9-7d19082d54d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.676 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1eb88f-b1ad-4b39-8e79-81d0f1524793]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap196d7954-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:fc:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454132, 'reachable_time': 35170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232993, 'error': None, 'target': 'ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.698 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8697a0ef-7244-4015-bdf2-15c5d2885eb7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe50:fcaa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454132, 'tstamp': 454132}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232994, 'error': None, 'target': 'ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.724 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa8c9b9-2481-430b-a558-168b142f173a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap196d7954-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:fc:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454132, 'reachable_time': 35170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232995, 'error': None, 'target': 'ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.763 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[664167d2-0d61-483a-8d63-f4f75ad4b312]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.819 2 DEBUG nova.compute.manager [req-d09d3995-cece-4fc1-a28c-62cd1d25e694 req-0b017d8a-7b60-4485-a35e-eb7b629ec357 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.819 2 DEBUG oslo_concurrency.lockutils [req-d09d3995-cece-4fc1-a28c-62cd1d25e694 req-0b017d8a-7b60-4485-a35e-eb7b629ec357 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.820 2 DEBUG oslo_concurrency.lockutils [req-d09d3995-cece-4fc1-a28c-62cd1d25e694 req-0b017d8a-7b60-4485-a35e-eb7b629ec357 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.820 2 DEBUG oslo_concurrency.lockutils [req-d09d3995-cece-4fc1-a28c-62cd1d25e694 req-0b017d8a-7b60-4485-a35e-eb7b629ec357 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.820 2 DEBUG nova.compute.manager [req-d09d3995-cece-4fc1-a28c-62cd1d25e694 req-0b017d8a-7b60-4485-a35e-eb7b629ec357 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Processing event network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.837 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3d5905-b1dc-4eda-89c3-8c3a3498f68c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.839 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap196d7954-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.839 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.840 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap196d7954-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 NetworkManager[51724]: <info>  [1759267845.8430] manager: (tap196d7954-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Sep 30 21:30:45 compute-1 kernel: tap196d7954-40: entered promiscuous mode
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.846 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap196d7954-40, col_values=(('external_ids', {'iface-id': 'ace1fb58-6a09-4b1b-b023-5354df3264e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:45 compute-1 ovn_controller[94902]: 2025-09-30T21:30:45Z|00307|binding|INFO|Releasing lport ace1fb58-6a09-4b1b-b023-5354df3264e1 from this chassis (sb_readonly=0)
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.855 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/196d7954-47b3-470a-991c-3398bf1a7372.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/196d7954-47b3-470a-991c-3398bf1a7372.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.857 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1d791547-d12f-44e0-973a-10e7cd157857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.858 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-196d7954-47b3-470a-991c-3398bf1a7372
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/196d7954-47b3-470a-991c-3398bf1a7372.pid.haproxy
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 196d7954-47b3-470a-991c-3398bf1a7372
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:30:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:45.858 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372', 'env', 'PROCESS_TAG=haproxy-196d7954-47b3-470a-991c-3398bf1a7372', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/196d7954-47b3-470a-991c-3398bf1a7372.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:30:45 compute-1 nova_compute[192795]: 2025-09-30 21:30:45.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:46 compute-1 podman[233034]: 2025-09-30 21:30:46.275163001 +0000 UTC m=+0.062403103 container create 20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.298 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267846.297548, b2a2285f-4f58-421c-a234-d42cebc7e645 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.298 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] VM Started (Lifecycle Event)
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.304 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.308 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.312 2 INFO nova.virt.libvirt.driver [-] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Instance spawned successfully.
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.313 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:30:46 compute-1 systemd[1]: Started libpod-conmon-20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8.scope.
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.328 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:46 compute-1 podman[233034]: 2025-09-30 21:30:46.242776533 +0000 UTC m=+0.030016655 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.337 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.342 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.343 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.343 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.343 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.344 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.344 2 DEBUG nova.virt.libvirt.driver [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:46 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:30:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82f427e7e14bf0f59a312dd73dc68ebac850c8a9a44c1f7bd512b69d752f51bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:30:46 compute-1 podman[233034]: 2025-09-30 21:30:46.372540941 +0000 UTC m=+0.159781083 container init 20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.378 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.379 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267846.3014684, b2a2285f-4f58-421c-a234-d42cebc7e645 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.379 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] VM Paused (Lifecycle Event)
Sep 30 21:30:46 compute-1 podman[233034]: 2025-09-30 21:30:46.3802659 +0000 UTC m=+0.167506012 container start 20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:30:46 compute-1 neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372[233050]: [NOTICE]   (233054) : New worker (233056) forked
Sep 30 21:30:46 compute-1 neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372[233050]: [NOTICE]   (233054) : Loading success.
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.432 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.438 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267846.307457, b2a2285f-4f58-421c-a234-d42cebc7e645 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.438 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] VM Resumed (Lifecycle Event)
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.461 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.466 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.476 2 INFO nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Took 12.13 seconds to spawn the instance on the hypervisor.
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.476 2 DEBUG nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.533 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.572 2 DEBUG nova.network.neutron [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Updated VIF entry in instance network info cache for port 2881fce3-7ad4-48e8-9a8d-db239d0022f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:30:46 compute-1 nova_compute[192795]: 2025-09-30 21:30:46.573 2 DEBUG nova.network.neutron [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Updating instance_info_cache with network_info: [{"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:30:47.166 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:47 compute-1 nova_compute[192795]: 2025-09-30 21:30:47.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:47 compute-1 nova_compute[192795]: 2025-09-30 21:30:47.635 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:47 compute-1 nova_compute[192795]: 2025-09-30 21:30:47.636 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:47 compute-1 nova_compute[192795]: 2025-09-30 21:30:47.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:47 compute-1 nova_compute[192795]: 2025-09-30 21:30:47.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.861 2 DEBUG oslo_concurrency.lockutils [req-9bee4ecc-1504-47a4-ac9a-b7b5acebfbf4 req-2aea34c0-d41b-4c16-9978-4c7cf27208b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.868 2 DEBUG nova.compute.manager [req-ee12443a-f801-4153-8b0a-f85ed37850b4 req-fb6ccf03-95eb-42b7-a196-8fa17e8f5150 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.868 2 DEBUG oslo_concurrency.lockutils [req-ee12443a-f801-4153-8b0a-f85ed37850b4 req-fb6ccf03-95eb-42b7-a196-8fa17e8f5150 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.869 2 DEBUG oslo_concurrency.lockutils [req-ee12443a-f801-4153-8b0a-f85ed37850b4 req-fb6ccf03-95eb-42b7-a196-8fa17e8f5150 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.870 2 DEBUG oslo_concurrency.lockutils [req-ee12443a-f801-4153-8b0a-f85ed37850b4 req-fb6ccf03-95eb-42b7-a196-8fa17e8f5150 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.870 2 DEBUG nova.compute.manager [req-ee12443a-f801-4153-8b0a-f85ed37850b4 req-fb6ccf03-95eb-42b7-a196-8fa17e8f5150 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] No waiting events found dispatching network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.871 2 WARNING nova.compute.manager [req-ee12443a-f801-4153-8b0a-f85ed37850b4 req-fb6ccf03-95eb-42b7-a196-8fa17e8f5150 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received unexpected event network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 for instance with vm_state active and task_state None.
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.919 2 INFO nova.compute.manager [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Took 15.20 seconds to build instance.
Sep 30 21:30:48 compute-1 nova_compute[192795]: 2025-09-30 21:30:48.972 2 DEBUG oslo_concurrency.lockutils [None req-8265db11-92c7-49f4-8113-738d7a8a43c9 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.317 2 DEBUG nova.compute.manager [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.487 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.488 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.530 2 DEBUG nova.objects.instance [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'pci_requests' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.561 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.561 2 INFO nova.compute.claims [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.562 2 DEBUG nova.objects.instance [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'resources' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.586 2 DEBUG nova.objects.instance [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'pci_devices' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.652 2 INFO nova.compute.resource_tracker [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating resource usage from migration 4741a102-4774-4853-b0df-0b90788b2a9d
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.653 2 DEBUG nova.compute.resource_tracker [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Starting to track incoming migration 4741a102-4774-4853-b0df-0b90788b2a9d with flavor c9779bca-1eb6-4567-a36c-b452abeafc70 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.805 2 DEBUG nova.compute.provider_tree [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.827 2 DEBUG nova.scheduler.client.report [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.851 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:49 compute-1 nova_compute[192795]: 2025-09-30 21:30:49.851 2 INFO nova.compute.manager [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Migrating
Sep 30 21:30:51 compute-1 nova_compute[192795]: 2025-09-30 21:30:51.422 2 DEBUG nova.compute.manager [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:51 compute-1 nova_compute[192795]: 2025-09-30 21:30:51.507 2 INFO nova.compute.manager [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] instance snapshotting
Sep 30 21:30:51 compute-1 nova_compute[192795]: 2025-09-30 21:30:51.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:51 compute-1 nova_compute[192795]: 2025-09-30 21:30:51.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:51 compute-1 NetworkManager[51724]: <info>  [1759267851.8519] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Sep 30 21:30:51 compute-1 NetworkManager[51724]: <info>  [1759267851.8530] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Sep 30 21:30:51 compute-1 nova_compute[192795]: 2025-09-30 21:30:51.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:51 compute-1 ovn_controller[94902]: 2025-09-30T21:30:51Z|00308|binding|INFO|Releasing lport ace1fb58-6a09-4b1b-b023-5354df3264e1 from this chassis (sb_readonly=0)
Sep 30 21:30:51 compute-1 ovn_controller[94902]: 2025-09-30T21:30:51Z|00309|binding|INFO|Releasing lport ace1fb58-6a09-4b1b-b023-5354df3264e1 from this chassis (sb_readonly=0)
Sep 30 21:30:51 compute-1 nova_compute[192795]: 2025-09-30 21:30:51.917 2 INFO nova.virt.libvirt.driver [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Beginning live snapshot process
Sep 30 21:30:51 compute-1 nova_compute[192795]: 2025-09-30 21:30:51.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:52 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.761 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.826 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.827 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.888 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.901 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.957 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:52 compute-1 nova_compute[192795]: 2025-09-30 21:30:52.959 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpt3da7lve/e0540e1949bd47198a9ef9aa3544c301.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.001 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpt3da7lve/e0540e1949bd47198a9ef9aa3544c301.delta 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.002 2 INFO nova.virt.libvirt.driver [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.096 2 DEBUG nova.virt.libvirt.guest [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] COPY block job progress, current cursor: 0 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:30:53 compute-1 podman[233081]: 2025-09-30 21:30:53.10958561 +0000 UTC m=+0.059488053 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent)
Sep 30 21:30:53 compute-1 podman[233080]: 2025-09-30 21:30:53.121244176 +0000 UTC m=+0.077597625 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:30:53 compute-1 podman[233078]: 2025-09-30 21:30:53.152412971 +0000 UTC m=+0.108959935 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9)
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.602 2 DEBUG nova.virt.libvirt.guest [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] COPY block job progress, current cursor: 73662464 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.841 2 DEBUG nova.compute.manager [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-changed-2881fce3-7ad4-48e8-9a8d-db239d0022f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.842 2 DEBUG nova.compute.manager [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Refreshing instance network info cache due to event network-changed-2881fce3-7ad4-48e8-9a8d-db239d0022f7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.843 2 DEBUG oslo_concurrency.lockutils [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.844 2 DEBUG oslo_concurrency.lockutils [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.844 2 DEBUG nova.network.neutron [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Refreshing network info cache for port 2881fce3-7ad4-48e8-9a8d-db239d0022f7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:30:53 compute-1 nova_compute[192795]: 2025-09-30 21:30:53.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.108 2 DEBUG nova.virt.libvirt.guest [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] COPY block job progress, current cursor: 75366400 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.112 2 INFO nova.virt.libvirt.driver [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.178 2 DEBUG nova.privsep.utils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.180 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpt3da7lve/e0540e1949bd47198a9ef9aa3544c301.delta /var/lib/nova/instances/snapshots/tmpt3da7lve/e0540e1949bd47198a9ef9aa3544c301 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.722 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-f54ded8f-9992-46a9-af52-0cfa1b80a50a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.722 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-f54ded8f-9992-46a9-af52-0cfa1b80a50a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.723 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:30:54 compute-1 nova_compute[192795]: 2025-09-30 21:30:54.723 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f54ded8f-9992-46a9-af52-0cfa1b80a50a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:55 compute-1 nova_compute[192795]: 2025-09-30 21:30:55.006 2 DEBUG oslo_concurrency.processutils [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpt3da7lve/e0540e1949bd47198a9ef9aa3544c301.delta /var/lib/nova/instances/snapshots/tmpt3da7lve/e0540e1949bd47198a9ef9aa3544c301" returned: 0 in 0.826s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:55 compute-1 nova_compute[192795]: 2025-09-30 21:30:55.020 2 INFO nova.virt.libvirt.driver [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Snapshot extracted, beginning image upload
Sep 30 21:30:56 compute-1 nova_compute[192795]: 2025-09-30 21:30:56.336 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:30:56 compute-1 sshd-session[233162]: Accepted publickey for nova from 192.168.122.100 port 47280 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:30:56 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:30:56 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:30:56 compute-1 systemd-logind[793]: New session 56 of user nova.
Sep 30 21:30:56 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:30:56 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:30:56 compute-1 systemd[233166]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:30:56 compute-1 systemd[233166]: Queued start job for default target Main User Target.
Sep 30 21:30:56 compute-1 systemd[233166]: Created slice User Application Slice.
Sep 30 21:30:56 compute-1 systemd[233166]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:30:56 compute-1 systemd[233166]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:30:56 compute-1 systemd[233166]: Reached target Paths.
Sep 30 21:30:56 compute-1 systemd[233166]: Reached target Timers.
Sep 30 21:30:56 compute-1 systemd[233166]: Starting D-Bus User Message Bus Socket...
Sep 30 21:30:56 compute-1 systemd[233166]: Starting Create User's Volatile Files and Directories...
Sep 30 21:30:56 compute-1 systemd[233166]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:30:56 compute-1 systemd[233166]: Reached target Sockets.
Sep 30 21:30:56 compute-1 systemd[233166]: Finished Create User's Volatile Files and Directories.
Sep 30 21:30:56 compute-1 systemd[233166]: Reached target Basic System.
Sep 30 21:30:56 compute-1 systemd[233166]: Reached target Main User Target.
Sep 30 21:30:56 compute-1 systemd[233166]: Startup finished in 176ms.
Sep 30 21:30:56 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:30:56 compute-1 systemd[1]: Started Session 56 of User nova.
Sep 30 21:30:56 compute-1 sshd-session[233162]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:30:56 compute-1 sshd-session[233181]: Received disconnect from 192.168.122.100 port 47280:11: disconnected by user
Sep 30 21:30:56 compute-1 sshd-session[233181]: Disconnected from user nova 192.168.122.100 port 47280
Sep 30 21:30:56 compute-1 sshd-session[233162]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:30:56 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Sep 30 21:30:56 compute-1 systemd-logind[793]: Session 56 logged out. Waiting for processes to exit.
Sep 30 21:30:56 compute-1 systemd-logind[793]: Removed session 56.
Sep 30 21:30:56 compute-1 nova_compute[192795]: 2025-09-30 21:30:56.928 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:56 compute-1 sshd-session[233183]: Accepted publickey for nova from 192.168.122.100 port 47284 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:30:56 compute-1 nova_compute[192795]: 2025-09-30 21:30:56.950 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-f54ded8f-9992-46a9-af52-0cfa1b80a50a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:56 compute-1 nova_compute[192795]: 2025-09-30 21:30:56.951 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:30:56 compute-1 nova_compute[192795]: 2025-09-30 21:30:56.952 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:56 compute-1 systemd-logind[793]: New session 58 of user nova.
Sep 30 21:30:56 compute-1 systemd[1]: Started Session 58 of User nova.
Sep 30 21:30:56 compute-1 sshd-session[233183]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:30:57 compute-1 sshd-session[233186]: Received disconnect from 192.168.122.100 port 47284:11: disconnected by user
Sep 30 21:30:57 compute-1 sshd-session[233186]: Disconnected from user nova 192.168.122.100 port 47284
Sep 30 21:30:57 compute-1 sshd-session[233183]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:30:57 compute-1 systemd[1]: session-58.scope: Deactivated successfully.
Sep 30 21:30:57 compute-1 systemd-logind[793]: Session 58 logged out. Waiting for processes to exit.
Sep 30 21:30:57 compute-1 systemd-logind[793]: Removed session 58.
Sep 30 21:30:57 compute-1 nova_compute[192795]: 2025-09-30 21:30:57.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:57 compute-1 nova_compute[192795]: 2025-09-30 21:30:57.760 2 DEBUG nova.network.neutron [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Updated VIF entry in instance network info cache for port 2881fce3-7ad4-48e8-9a8d-db239d0022f7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:30:57 compute-1 nova_compute[192795]: 2025-09-30 21:30:57.761 2 DEBUG nova.network.neutron [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Updating instance_info_cache with network_info: [{"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:57 compute-1 nova_compute[192795]: 2025-09-30 21:30:57.802 2 DEBUG oslo_concurrency.lockutils [req-c3cdb518-c202-444d-9e78-dc5a4e84f54d req-2e989d88-cc7c-4e4c-9744-ed8e7432d825 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b2a2285f-4f58-421c-a234-d42cebc7e645" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:58 compute-1 nova_compute[192795]: 2025-09-30 21:30:58.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-1 ovn_controller[94902]: 2025-09-30T21:30:59Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:c6:1f 10.100.0.5
Sep 30 21:30:59 compute-1 ovn_controller[94902]: 2025-09-30T21:30:59Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:c6:1f 10.100.0.5
Sep 30 21:31:00 compute-1 sshd-session[233202]: Accepted publickey for nova from 192.168.122.100 port 57542 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:31:00 compute-1 systemd-logind[793]: New session 59 of user nova.
Sep 30 21:31:00 compute-1 systemd[1]: Started Session 59 of User nova.
Sep 30 21:31:00 compute-1 sshd-session[233202]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:31:00 compute-1 nova_compute[192795]: 2025-09-30 21:31:00.528 2 INFO nova.virt.libvirt.driver [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Snapshot image upload complete
Sep 30 21:31:00 compute-1 nova_compute[192795]: 2025-09-30 21:31:00.535 2 INFO nova.compute.manager [None req-7a0b4b50-bdd8-43b2-ad51-97e6a23a34b9 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Took 9.01 seconds to snapshot the instance on the hypervisor.
Sep 30 21:31:00 compute-1 sshd-session[233205]: Received disconnect from 192.168.122.100 port 57542:11: disconnected by user
Sep 30 21:31:00 compute-1 sshd-session[233205]: Disconnected from user nova 192.168.122.100 port 57542
Sep 30 21:31:00 compute-1 sshd-session[233202]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:31:00 compute-1 systemd[1]: session-59.scope: Deactivated successfully.
Sep 30 21:31:00 compute-1 systemd-logind[793]: Session 59 logged out. Waiting for processes to exit.
Sep 30 21:31:00 compute-1 systemd-logind[793]: Removed session 59.
Sep 30 21:31:00 compute-1 sshd-session[233207]: Accepted publickey for nova from 192.168.122.100 port 57548 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:31:00 compute-1 systemd-logind[793]: New session 60 of user nova.
Sep 30 21:31:00 compute-1 systemd[1]: Started Session 60 of User nova.
Sep 30 21:31:00 compute-1 sshd-session[233207]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:31:01 compute-1 sshd-session[233210]: Received disconnect from 192.168.122.100 port 57548:11: disconnected by user
Sep 30 21:31:01 compute-1 sshd-session[233210]: Disconnected from user nova 192.168.122.100 port 57548
Sep 30 21:31:01 compute-1 sshd-session[233207]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:31:01 compute-1 systemd[1]: session-60.scope: Deactivated successfully.
Sep 30 21:31:01 compute-1 systemd-logind[793]: Session 60 logged out. Waiting for processes to exit.
Sep 30 21:31:01 compute-1 systemd-logind[793]: Removed session 60.
Sep 30 21:31:01 compute-1 sshd-session[233212]: Accepted publickey for nova from 192.168.122.100 port 57554 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:31:01 compute-1 systemd-logind[793]: New session 61 of user nova.
Sep 30 21:31:01 compute-1 systemd[1]: Started Session 61 of User nova.
Sep 30 21:31:01 compute-1 sshd-session[233212]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:31:01 compute-1 sshd-session[233215]: Received disconnect from 192.168.122.100 port 57554:11: disconnected by user
Sep 30 21:31:01 compute-1 sshd-session[233215]: Disconnected from user nova 192.168.122.100 port 57554
Sep 30 21:31:01 compute-1 sshd-session[233212]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:31:01 compute-1 systemd[1]: session-61.scope: Deactivated successfully.
Sep 30 21:31:01 compute-1 systemd-logind[793]: Session 61 logged out. Waiting for processes to exit.
Sep 30 21:31:01 compute-1 systemd-logind[793]: Removed session 61.
Sep 30 21:31:01 compute-1 nova_compute[192795]: 2025-09-30 21:31:01.523 2 DEBUG nova.compute.manager [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:01 compute-1 nova_compute[192795]: 2025-09-30 21:31:01.524 2 DEBUG oslo_concurrency.lockutils [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:01 compute-1 nova_compute[192795]: 2025-09-30 21:31:01.525 2 DEBUG oslo_concurrency.lockutils [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:01 compute-1 nova_compute[192795]: 2025-09-30 21:31:01.525 2 DEBUG oslo_concurrency.lockutils [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:01 compute-1 nova_compute[192795]: 2025-09-30 21:31:01.525 2 DEBUG nova.compute.manager [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:01 compute-1 nova_compute[192795]: 2025-09-30 21:31:01.526 2 WARNING nova.compute.manager [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_migrating.
Sep 30 21:31:02 compute-1 nova_compute[192795]: 2025-09-30 21:31:02.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-1 nova_compute[192795]: 2025-09-30 21:31:02.712 2 INFO nova.network.neutron [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:31:03 compute-1 podman[233217]: 2025-09-30 21:31:03.281532311 +0000 UTC m=+0.099555460 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 21:31:03 compute-1 nova_compute[192795]: 2025-09-30 21:31:03.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:04 compute-1 nova_compute[192795]: 2025-09-30 21:31:04.412 2 DEBUG nova.compute.manager [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:04 compute-1 nova_compute[192795]: 2025-09-30 21:31:04.412 2 DEBUG oslo_concurrency.lockutils [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:04 compute-1 nova_compute[192795]: 2025-09-30 21:31:04.412 2 DEBUG oslo_concurrency.lockutils [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:04 compute-1 nova_compute[192795]: 2025-09-30 21:31:04.412 2 DEBUG oslo_concurrency.lockutils [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:04 compute-1 nova_compute[192795]: 2025-09-30 21:31:04.413 2 DEBUG nova.compute.manager [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:04 compute-1 nova_compute[192795]: 2025-09-30 21:31:04.413 2 WARNING nova.compute.manager [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_migrated.
Sep 30 21:31:05 compute-1 nova_compute[192795]: 2025-09-30 21:31:05.029 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:05 compute-1 nova_compute[192795]: 2025-09-30 21:31:05.030 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquired lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:05 compute-1 nova_compute[192795]: 2025-09-30 21:31:05.030 2 DEBUG nova.network.neutron [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:31:06 compute-1 nova_compute[192795]: 2025-09-30 21:31:06.546 2 DEBUG nova.compute.manager [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-changed-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:06 compute-1 nova_compute[192795]: 2025-09-30 21:31:06.547 2 DEBUG nova.compute.manager [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Refreshing instance network info cache due to event network-changed-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:31:06 compute-1 nova_compute[192795]: 2025-09-30 21:31:06.547 2 DEBUG oslo_concurrency.lockutils [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.066 2 DEBUG nova.network.neutron [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.100 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Releasing lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.103 2 DEBUG oslo_concurrency.lockutils [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.103 2 DEBUG nova.network.neutron [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Refreshing network info cache for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.299 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.301 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.301 2 INFO nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Creating image(s)
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.301 2 DEBUG nova.objects.instance [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'trusted_certs' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.384 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.455 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.456 2 DEBUG nova.virt.disk.api [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Checking if we can resize image /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.456 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.521 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.522 2 DEBUG nova.virt.disk.api [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Cannot resize image /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.543 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.543 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Ensure instance console log exists: /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.544 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.545 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.545 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.547 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Start _get_guest_xml network_info=[{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:8a:8e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.555 2 WARNING nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.565 2 DEBUG nova.virt.libvirt.host [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.565 2 DEBUG nova.virt.libvirt.host [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.570 2 DEBUG nova.virt.libvirt.host [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.571 2 DEBUG nova.virt.libvirt.host [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.573 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.574 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.574 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.574 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.574 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.575 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.575 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.575 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.575 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.576 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.576 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.576 2 DEBUG nova.virt.hardware [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.576 2 DEBUG nova.objects.instance [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'vcpu_model' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.603 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.672 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.675 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.676 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.678 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.680 2 DEBUG nova.virt.libvirt.vif [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-543993129',display_name='tempest-ServerDiskConfigTestJSON-server-543993129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-543993129',id=81,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-z9ljve0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:31:01Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=c3cd73be-ae82-4c19-8ab7-ec9b06134032,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:8a:8e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.680 2 DEBUG nova.network.os_vif_util [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:8a:8e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.681 2 DEBUG nova.network.os_vif_util [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.684 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <uuid>c3cd73be-ae82-4c19-8ab7-ec9b06134032</uuid>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <name>instance-00000051</name>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-543993129</nova:name>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:31:07</nova:creationTime>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:user uuid="648f7bb37eeb4003825636f9a7c1f92a">tempest-ServerDiskConfigTestJSON-1133643549-project-member</nova:user>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:project uuid="72559935caa44fd9b779b6770f00199f">tempest-ServerDiskConfigTestJSON-1133643549</nova:project>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         <nova:port uuid="c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad">
Sep 30 21:31:07 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <system>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <entry name="serial">c3cd73be-ae82-4c19-8ab7-ec9b06134032</entry>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <entry name="uuid">c3cd73be-ae82-4c19-8ab7-ec9b06134032</entry>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </system>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <os>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   </os>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <features>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   </features>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:8a:8e:e0"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <target dev="tapc2d3f3fa-4e"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/console.log" append="off"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <video>
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </video>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:31:07 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:31:07 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:31:07 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:31:07 compute-1 nova_compute[192795]: </domain>
Sep 30 21:31:07 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.685 2 DEBUG nova.virt.libvirt.vif [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-543993129',display_name='tempest-ServerDiskConfigTestJSON-server-543993129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-543993129',id=81,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-z9ljve0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:31:01Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=c3cd73be-ae82-4c19-8ab7-ec9b06134032,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:8a:8e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.685 2 DEBUG nova.network.os_vif_util [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:8a:8e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.686 2 DEBUG nova.network.os_vif_util [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.686 2 DEBUG os_vif [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.688 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.688 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2d3f3fa-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc2d3f3fa-4e, col_values=(('external_ids', {'iface-id': 'c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:8e:e0', 'vm-uuid': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-1 NetworkManager[51724]: <info>  [1759267867.6970] manager: (tapc2d3f3fa-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.706 2 INFO os_vif [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e')
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.783 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.784 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.784 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No VIF found with MAC fa:16:3e:8a:8e:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.786 2 INFO nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Using config drive
Sep 30 21:31:07 compute-1 kernel: tapc2d3f3fa-4e: entered promiscuous mode
Sep 30 21:31:07 compute-1 NetworkManager[51724]: <info>  [1759267867.8715] manager: (tapc2d3f3fa-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-1 ovn_controller[94902]: 2025-09-30T21:31:07Z|00310|binding|INFO|Claiming lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for this chassis.
Sep 30 21:31:07 compute-1 ovn_controller[94902]: 2025-09-30T21:31:07Z|00311|binding|INFO|c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad: Claiming fa:16:3e:8a:8e:e0 10.100.0.3
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.888 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:8e:e0 10.100.0.3'], port_security=['fa:16:3e:8a:8e:e0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.890 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad in datapath a145b225-510f-43a7-8cc6-fccae3ed647e bound to our chassis
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.892 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:31:07 compute-1 ovn_controller[94902]: 2025-09-30T21:31:07Z|00312|binding|INFO|Setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad ovn-installed in OVS
Sep 30 21:31:07 compute-1 ovn_controller[94902]: 2025-09-30T21:31:07Z|00313|binding|INFO|Setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad up in Southbound
Sep 30 21:31:07 compute-1 systemd-udevd[233263]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:31:07 compute-1 nova_compute[192795]: 2025-09-30 21:31:07.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.909 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ccbf7cf4-dd88-41d2-9557-77a9cb066806]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.910 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa145b225-51 in ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.919 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa145b225-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.919 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[671664ab-d5c6-4c07-977f-268f6e613519]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.920 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[96b3e624-816a-4603-84ce-d1c59e4eb8a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-1 NetworkManager[51724]: <info>  [1759267867.9233] device (tapc2d3f3fa-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:31:07 compute-1 NetworkManager[51724]: <info>  [1759267867.9243] device (tapc2d3f3fa-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:31:07 compute-1 systemd-machined[152783]: New machine qemu-40-instance-00000051.
Sep 30 21:31:07 compute-1 systemd[1]: Started Virtual Machine qemu-40-instance-00000051.
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.938 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[133adef8-e0a7-4a4c-b4a1-6c564e3fff05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:07.969 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[df9cb410-ffc4-491d-ad67-fa2aec293dd1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.007 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2e57619c-1f8d-400f-a4a6-9efad0592562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 NetworkManager[51724]: <info>  [1759267868.0141] manager: (tapa145b225-50): new Veth device (/org/freedesktop/NetworkManager/Devices/157)
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.013 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[38b9ec60-2eb1-4e4d-9bda-8000264dd6cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.054 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0463f8bf-1c02-47c6-bed2-fe87f33a23a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.060 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[93351fba-0b10-4b9d-8833-a28f48484b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 NetworkManager[51724]: <info>  [1759267868.0935] device (tapa145b225-50): carrier: link connected
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.102 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb011f3-140c-4ce4-bcd6-38206d990419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.128 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e61e8f55-956e-4528-9465-bb383575e299]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456376, 'reachable_time': 42627, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233296, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.151 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7c8e247f-c827-40b7-881a-2af115eb7825]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:43a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456376, 'tstamp': 456376}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233297, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.178 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8a69e733-02aa-4f3b-9dd3-976fff8cc5ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456376, 'reachable_time': 42627, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233298, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.231 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[32f4eaf2-8d0f-4a45-909b-14aae5935470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.319 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5777aa4d-340a-4c9f-a051-aeda0382804e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.320 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.321 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.321 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa145b225-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:08 compute-1 kernel: tapa145b225-50: entered promiscuous mode
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-1 NetworkManager[51724]: <info>  [1759267868.3705] manager: (tapa145b225-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.374 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa145b225-50, col_values=(('external_ids', {'iface-id': '0f20179e-1a66-48c7-97c7-a3ccb2b25749'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-1 ovn_controller[94902]: 2025-09-30T21:31:08Z|00314|binding|INFO|Releasing lport 0f20179e-1a66-48c7-97c7-a3ccb2b25749 from this chassis (sb_readonly=0)
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.402 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.403 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1fcc05-e081-4e72-87d9-c0c2a6d76772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.404 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:31:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:08.405 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'env', 'PROCESS_TAG=haproxy-a145b225-510f-43a7-8cc6-fccae3ed647e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a145b225-510f-43a7-8cc6-fccae3ed647e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.708 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.709 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.709 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.709 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.709 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.709 2 WARNING nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_finish.
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.709 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.710 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.710 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.710 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.710 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.710 2 WARNING nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_finish.
Sep 30 21:31:08 compute-1 podman[233337]: 2025-09-30 21:31:08.822783477 +0000 UTC m=+0.059032450 container create c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:31:08 compute-1 systemd[1]: Started libpod-conmon-c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891.scope.
Sep 30 21:31:08 compute-1 podman[233337]: 2025-09-30 21:31:08.793326259 +0000 UTC m=+0.029575252 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:31:08 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:31:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77e5c31fc814cb4dabb538ea7ad8620b7dd3454f7eddc812572f49c59accfbc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:31:08 compute-1 podman[233337]: 2025-09-30 21:31:08.938393301 +0000 UTC m=+0.174642304 container init c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:31:08 compute-1 podman[233337]: 2025-09-30 21:31:08.944355993 +0000 UTC m=+0.180604996 container start c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.949 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267868.948237, c3cd73be-ae82-4c19-8ab7-ec9b06134032 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.950 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] VM Resumed (Lifecycle Event)
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.952 2 DEBUG nova.compute.manager [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.957 2 INFO nova.virt.libvirt.driver [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance running successfully.
Sep 30 21:31:08 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.960 2 DEBUG nova.virt.libvirt.guest [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.961 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:31:08 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [NOTICE]   (233356) : New worker (233358) forked
Sep 30 21:31:08 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [NOTICE]   (233356) : Loading success.
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.981 2 DEBUG nova.network.neutron [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updated VIF entry in instance network info cache for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.981 2 DEBUG nova.network.neutron [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:08 compute-1 nova_compute[192795]: 2025-09-30 21:31:08.996 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:09 compute-1 nova_compute[192795]: 2025-09-30 21:31:09.001 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:31:09 compute-1 nova_compute[192795]: 2025-09-30 21:31:09.013 2 DEBUG oslo_concurrency.lockutils [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:09 compute-1 nova_compute[192795]: 2025-09-30 21:31:09.048 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:31:09 compute-1 nova_compute[192795]: 2025-09-30 21:31:09.049 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267868.949769, c3cd73be-ae82-4c19-8ab7-ec9b06134032 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:09 compute-1 nova_compute[192795]: 2025-09-30 21:31:09.049 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] VM Started (Lifecycle Event)
Sep 30 21:31:09 compute-1 nova_compute[192795]: 2025-09-30 21:31:09.085 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:09 compute-1 nova_compute[192795]: 2025-09-30 21:31:09.091 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:10.339 103970 DEBUG eventlet.wsgi.server [-] (103970) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:10.341 103970 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: Accept: */*
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: Connection: close
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: Content-Type: text/plain
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: Host: 169.254.169.254
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: User-Agent: curl/7.84.0
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: X-Forwarded-For: 10.100.0.5
Sep 30 21:31:10 compute-1 ovn_metadata_agent[103856]: X-Ovn-Network-Id: 196d7954-47b3-470a-991c-3398bf1a7372 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Sep 30 21:31:11 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:31:11 compute-1 systemd[233166]: Activating special unit Exit the Session...
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped target Main User Target.
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped target Basic System.
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped target Paths.
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped target Sockets.
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped target Timers.
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:31:11 compute-1 systemd[233166]: Closed D-Bus User Message Bus Socket.
Sep 30 21:31:11 compute-1 systemd[233166]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:31:11 compute-1 systemd[233166]: Removed slice User Application Slice.
Sep 30 21:31:11 compute-1 systemd[233166]: Reached target Shutdown.
Sep 30 21:31:11 compute-1 systemd[233166]: Finished Exit the Session.
Sep 30 21:31:11 compute-1 systemd[233166]: Reached target Exit the Session.
Sep 30 21:31:11 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:31:11 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:31:11 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:31:11 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:31:11 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:31:11 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:31:11 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:31:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:11.783 103970 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Sep 30 21:31:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:11.785 103970 INFO eventlet.wsgi.server [-] 10.100.0.5,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1671 time: 1.4437270
Sep 30 21:31:11 compute-1 haproxy-metadata-proxy-196d7954-47b3-470a-991c-3398bf1a7372[233056]: 10.100.0.5:56574 [30/Sep/2025:21:31:10.338] listener listener/metadata 0/0/0/1447/1447 200 1655 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.317 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.318 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.318 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.318 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.319 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.338 2 INFO nova.compute.manager [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Terminating instance
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.358 2 DEBUG nova.compute.manager [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:31:12 compute-1 kernel: tap2881fce3-7a (unregistering): left promiscuous mode
Sep 30 21:31:12 compute-1 NetworkManager[51724]: <info>  [1759267872.4033] device (tap2881fce3-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-1 ovn_controller[94902]: 2025-09-30T21:31:12Z|00315|binding|INFO|Releasing lport 2881fce3-7ad4-48e8-9a8d-db239d0022f7 from this chassis (sb_readonly=0)
Sep 30 21:31:12 compute-1 ovn_controller[94902]: 2025-09-30T21:31:12Z|00316|binding|INFO|Setting lport 2881fce3-7ad4-48e8-9a8d-db239d0022f7 down in Southbound
Sep 30 21:31:12 compute-1 ovn_controller[94902]: 2025-09-30T21:31:12Z|00317|binding|INFO|Removing iface tap2881fce3-7a ovn-installed in OVS
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.439 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:c6:1f 10.100.0.5'], port_security=['fa:16:3e:61:c6:1f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b2a2285f-4f58-421c-a234-d42cebc7e645', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-196d7954-47b3-470a-991c-3398bf1a7372', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '635d1a8848cd4101b935e658c05f9037', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53e22dca-c5b6-4e3b-a687-e4ccb8d37523', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.174'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5a90e13-3205-4d6c-9861-d027fc810892, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2881fce3-7ad4-48e8-9a8d-db239d0022f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.440 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2881fce3-7ad4-48e8-9a8d-db239d0022f7 in datapath 196d7954-47b3-470a-991c-3398bf1a7372 unbound from our chassis
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.442 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 196d7954-47b3-470a-991c-3398bf1a7372, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.443 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[13f7c254-5194-4017-84b0-93e72cb72238]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.444 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372 namespace which is not needed anymore
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-1 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000052.scope: Deactivated successfully.
Sep 30 21:31:12 compute-1 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000052.scope: Consumed 14.350s CPU time.
Sep 30 21:31:12 compute-1 systemd-machined[152783]: Machine qemu-39-instance-00000052 terminated.
Sep 30 21:31:12 compute-1 podman[233375]: 2025-09-30 21:31:12.517944684 +0000 UTC m=+0.082776904 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:31:12 compute-1 podman[233371]: 2025-09-30 21:31:12.536585999 +0000 UTC m=+0.101517462 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:12 compute-1 podman[233374]: 2025-09-30 21:31:12.557851686 +0000 UTC m=+0.131057283 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:31:12 compute-1 neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372[233050]: [NOTICE]   (233054) : haproxy version is 2.8.14-c23fe91
Sep 30 21:31:12 compute-1 neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372[233050]: [NOTICE]   (233054) : path to executable is /usr/sbin/haproxy
Sep 30 21:31:12 compute-1 neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372[233050]: [WARNING]  (233054) : Exiting Master process...
Sep 30 21:31:12 compute-1 neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372[233050]: [ALERT]    (233054) : Current worker (233056) exited with code 143 (Terminated)
Sep 30 21:31:12 compute-1 neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372[233050]: [WARNING]  (233054) : All workers exited. Exiting... (0)
Sep 30 21:31:12 compute-1 systemd[1]: libpod-20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8.scope: Deactivated successfully.
Sep 30 21:31:12 compute-1 podman[233460]: 2025-09-30 21:31:12.622750825 +0000 UTC m=+0.055681480 container died 20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.640 2 INFO nova.virt.libvirt.driver [-] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Instance destroyed successfully.
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.640 2 DEBUG nova.objects.instance [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lazy-loading 'resources' on Instance uuid b2a2285f-4f58-421c-a234-d42cebc7e645 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8-userdata-shm.mount: Deactivated successfully.
Sep 30 21:31:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-82f427e7e14bf0f59a312dd73dc68ebac850c8a9a44c1f7bd512b69d752f51bf-merged.mount: Deactivated successfully.
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.667 2 DEBUG nova.virt.libvirt.vif [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=82,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFN7KIh9qkgYEW6aThUuoTAsfcOLNa55Wul/dSiKIZ6So0n1pI8fHMF8UmXfjLdVKB6SeKF645kYs+A/DgulmwQczcwL+uJu4BEcndBwtEwJpe973HTi196O56mrV1E+Ow==',key_name='tempest-keypair-873952531',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='635d1a8848cd4101b935e658c05f9037',ramdisk_id='',reservation_id='r-7g2on38v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-467131096',owner_user_name='tempest-ServersV294TestFqdnHostnames-467131096-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b78230ec3561401bac41d6a12631e379',uuid=b2a2285f-4f58-421c-a234-d42cebc7e645,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.668 2 DEBUG nova.network.os_vif_util [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Converting VIF {"id": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "address": "fa:16:3e:61:c6:1f", "network": {"id": "196d7954-47b3-470a-991c-3398bf1a7372", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1059767095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "635d1a8848cd4101b935e658c05f9037", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2881fce3-7a", "ovs_interfaceid": "2881fce3-7ad4-48e8-9a8d-db239d0022f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.669 2 DEBUG nova.network.os_vif_util [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:61:c6:1f,bridge_name='br-int',has_traffic_filtering=True,id=2881fce3-7ad4-48e8-9a8d-db239d0022f7,network=Network(196d7954-47b3-470a-991c-3398bf1a7372),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2881fce3-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:12 compute-1 podman[233460]: 2025-09-30 21:31:12.66942771 +0000 UTC m=+0.102358355 container cleanup 20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.669 2 DEBUG os_vif [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:c6:1f,bridge_name='br-int',has_traffic_filtering=True,id=2881fce3-7ad4-48e8-9a8d-db239d0022f7,network=Network(196d7954-47b3-470a-991c-3398bf1a7372),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2881fce3-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.671 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2881fce3-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:31:12 compute-1 systemd[1]: libpod-conmon-20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8.scope: Deactivated successfully.
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.681 2 INFO os_vif [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:61:c6:1f,bridge_name='br-int',has_traffic_filtering=True,id=2881fce3-7ad4-48e8-9a8d-db239d0022f7,network=Network(196d7954-47b3-470a-991c-3398bf1a7372),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2881fce3-7a')
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.682 2 INFO nova.virt.libvirt.driver [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Deleting instance files /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645_del
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.683 2 INFO nova.virt.libvirt.driver [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Deletion of /var/lib/nova/instances/b2a2285f-4f58-421c-a234-d42cebc7e645_del complete
Sep 30 21:31:12 compute-1 podman[233506]: 2025-09-30 21:31:12.741571286 +0000 UTC m=+0.041763943 container remove 20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.748 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3a4c14e7-f241-4471-a39a-9199300016b3]: (4, ('Tue Sep 30 09:31:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372 (20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8)\n20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8\nTue Sep 30 09:31:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372 (20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8)\n20b0848f6efa4d48e2d6571b97443d360f55158962ec061006a7741fa87ba7b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.749 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0230429e-7ed5-458a-b50d-c350c29c428e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.750 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap196d7954-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:12 compute-1 kernel: tap196d7954-40: left promiscuous mode
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.768 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2fa0a2-9a0f-4c19-888e-4498a9b0e4f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.793 2 INFO nova.compute.manager [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.795 2 DEBUG oslo.service.loopingcall [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.795 2 DEBUG nova.compute.manager [-] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:31:12 compute-1 nova_compute[192795]: 2025-09-30 21:31:12.795 2 DEBUG nova.network.neutron [-] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.799 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8d44fd91-062b-4195-be6e-e749df050f4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.800 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3912c25a-9cfb-4abb-bfb0-35c050b43e28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.813 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[60867e53-6764-4e51-8112-a8cc7105cddd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454123, 'reachable_time': 16832, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233521, 'error': None, 'target': 'ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.816 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-196d7954-47b3-470a-991c-3398bf1a7372 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:31:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:12.816 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[dffccabf-90a5-45f9-91a7-08f0e3231c88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-1 systemd[1]: run-netns-ovnmeta\x2d196d7954\x2d47b3\x2d470a\x2d991c\x2d3398bf1a7372.mount: Deactivated successfully.
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.149 2 DEBUG nova.compute.manager [req-07546dfa-4a3b-4bb8-ac03-5056ee6e52ad req-08a1cbbc-a3a8-4034-b42d-5068d1eb4477 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-vif-unplugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.149 2 DEBUG oslo_concurrency.lockutils [req-07546dfa-4a3b-4bb8-ac03-5056ee6e52ad req-08a1cbbc-a3a8-4034-b42d-5068d1eb4477 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.149 2 DEBUG oslo_concurrency.lockutils [req-07546dfa-4a3b-4bb8-ac03-5056ee6e52ad req-08a1cbbc-a3a8-4034-b42d-5068d1eb4477 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.149 2 DEBUG oslo_concurrency.lockutils [req-07546dfa-4a3b-4bb8-ac03-5056ee6e52ad req-08a1cbbc-a3a8-4034-b42d-5068d1eb4477 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.150 2 DEBUG nova.compute.manager [req-07546dfa-4a3b-4bb8-ac03-5056ee6e52ad req-08a1cbbc-a3a8-4034-b42d-5068d1eb4477 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] No waiting events found dispatching network-vif-unplugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.150 2 DEBUG nova.compute.manager [req-07546dfa-4a3b-4bb8-ac03-5056ee6e52ad req-08a1cbbc-a3a8-4034-b42d-5068d1eb4477 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-vif-unplugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.854 2 DEBUG nova.network.neutron [-] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:13 compute-1 nova_compute[192795]: 2025-09-30 21:31:13.931 2 INFO nova.compute.manager [-] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Took 1.14 seconds to deallocate network for instance.
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.011 2 DEBUG nova.compute.manager [req-0e2024c9-84b3-4050-8fc7-4703a37435c6 req-6138f82a-3192-4812-a5b9-907772cc283b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-vif-deleted-2881fce3-7ad4-48e8-9a8d-db239d0022f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.103 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.104 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.278 2 DEBUG nova.compute.provider_tree [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.298 2 DEBUG nova.scheduler.client.report [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.327 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.381 2 INFO nova.scheduler.client.report [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Deleted allocations for instance b2a2285f-4f58-421c-a234-d42cebc7e645
Sep 30 21:31:14 compute-1 nova_compute[192795]: 2025-09-30 21:31:14.496 2 DEBUG oslo_concurrency.lockutils [None req-87add431-a071-4db2-8103-5568cd472b15 b78230ec3561401bac41d6a12631e379 635d1a8848cd4101b935e658c05f9037 - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:15 compute-1 nova_compute[192795]: 2025-09-30 21:31:15.407 2 DEBUG nova.compute.manager [req-f95a20b7-8df2-46e8-8c31-e0d28e270a49 req-e55687b2-8892-4b93-83a4-5edb52f27bc3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received event network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:15 compute-1 nova_compute[192795]: 2025-09-30 21:31:15.407 2 DEBUG oslo_concurrency.lockutils [req-f95a20b7-8df2-46e8-8c31-e0d28e270a49 req-e55687b2-8892-4b93-83a4-5edb52f27bc3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:15 compute-1 nova_compute[192795]: 2025-09-30 21:31:15.408 2 DEBUG oslo_concurrency.lockutils [req-f95a20b7-8df2-46e8-8c31-e0d28e270a49 req-e55687b2-8892-4b93-83a4-5edb52f27bc3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:15 compute-1 nova_compute[192795]: 2025-09-30 21:31:15.408 2 DEBUG oslo_concurrency.lockutils [req-f95a20b7-8df2-46e8-8c31-e0d28e270a49 req-e55687b2-8892-4b93-83a4-5edb52f27bc3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b2a2285f-4f58-421c-a234-d42cebc7e645-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:15 compute-1 nova_compute[192795]: 2025-09-30 21:31:15.408 2 DEBUG nova.compute.manager [req-f95a20b7-8df2-46e8-8c31-e0d28e270a49 req-e55687b2-8892-4b93-83a4-5edb52f27bc3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] No waiting events found dispatching network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:15 compute-1 nova_compute[192795]: 2025-09-30 21:31:15.409 2 WARNING nova.compute.manager [req-f95a20b7-8df2-46e8-8c31-e0d28e270a49 req-e55687b2-8892-4b93-83a4-5edb52f27bc3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Received unexpected event network-vif-plugged-2881fce3-7ad4-48e8-9a8d-db239d0022f7 for instance with vm_state deleted and task_state None.
Sep 30 21:31:16 compute-1 podman[233522]: 2025-09-30 21:31:16.224155211 +0000 UTC m=+0.069817893 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.521 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.521 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.521 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.522 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.522 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.543 2 INFO nova.compute.manager [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Terminating instance
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.562 2 DEBUG nova.compute.manager [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:31:17 compute-1 kernel: tapc2d3f3fa-4e (unregistering): left promiscuous mode
Sep 30 21:31:17 compute-1 NetworkManager[51724]: <info>  [1759267877.5861] device (tapc2d3f3fa-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 ovn_controller[94902]: 2025-09-30T21:31:17Z|00318|binding|INFO|Releasing lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad from this chassis (sb_readonly=0)
Sep 30 21:31:17 compute-1 ovn_controller[94902]: 2025-09-30T21:31:17Z|00319|binding|INFO|Setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad down in Southbound
Sep 30 21:31:17 compute-1 ovn_controller[94902]: 2025-09-30T21:31:17Z|00320|binding|INFO|Removing iface tapc2d3f3fa-4e ovn-installed in OVS
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.609 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:8e:e0 10.100.0.3'], port_security=['fa:16:3e:8a:8e:e0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.610 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad in datapath a145b225-510f-43a7-8cc6-fccae3ed647e unbound from our chassis
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.611 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a145b225-510f-43a7-8cc6-fccae3ed647e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.613 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cf11f475-81e7-4d2d-95bd-714191edcf43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.613 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace which is not needed anymore
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000051.scope: Deactivated successfully.
Sep 30 21:31:17 compute-1 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000051.scope: Consumed 9.586s CPU time.
Sep 30 21:31:17 compute-1 systemd-machined[152783]: Machine qemu-40-instance-00000051 terminated.
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [NOTICE]   (233356) : haproxy version is 2.8.14-c23fe91
Sep 30 21:31:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [NOTICE]   (233356) : path to executable is /usr/sbin/haproxy
Sep 30 21:31:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [WARNING]  (233356) : Exiting Master process...
Sep 30 21:31:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [WARNING]  (233356) : Exiting Master process...
Sep 30 21:31:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [ALERT]    (233356) : Current worker (233358) exited with code 143 (Terminated)
Sep 30 21:31:17 compute-1 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[233352]: [WARNING]  (233356) : All workers exited. Exiting... (0)
Sep 30 21:31:17 compute-1 systemd[1]: libpod-c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891.scope: Deactivated successfully.
Sep 30 21:31:17 compute-1 podman[233567]: 2025-09-30 21:31:17.744054867 +0000 UTC m=+0.048884105 container died c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:31:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-77e5c31fc814cb4dabb538ea7ad8620b7dd3454f7eddc812572f49c59accfbc9-merged.mount: Deactivated successfully.
Sep 30 21:31:17 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891-userdata-shm.mount: Deactivated successfully.
Sep 30 21:31:17 compute-1 podman[233567]: 2025-09-30 21:31:17.7825123 +0000 UTC m=+0.087341538 container cleanup c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:31:17 compute-1 systemd[1]: libpod-conmon-c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891.scope: Deactivated successfully.
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.831 2 INFO nova.virt.libvirt.driver [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance destroyed successfully.
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.832 2 DEBUG nova.objects.instance [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'resources' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.852 2 DEBUG nova.virt.libvirt.vif [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-543993129',display_name='tempest-ServerDiskConfigTestJSON-server-543993129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-543993129',id=81,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-z9ljve0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:31:14Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=c3cd73be-ae82-4c19-8ab7-ec9b06134032,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.853 2 DEBUG nova.network.os_vif_util [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.854 2 DEBUG nova.network.os_vif_util [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.855 2 DEBUG os_vif [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2d3f3fa-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 podman[233600]: 2025-09-30 21:31:17.862871308 +0000 UTC m=+0.053876591 container remove c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.866 2 INFO os_vif [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e')
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.867 2 INFO nova.virt.libvirt.driver [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Deleting instance files /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_del
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.869 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[025e4de2-31de-41e4-b235-1e06b990d042]: (4, ('Tue Sep 30 09:31:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891)\nc474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891\nTue Sep 30 09:31:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (c474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891)\nc474377bb640bb421ce1eb1b033b11c9cc2439e502cad5b074da472446772891\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.870 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a30b872f-b8ed-4ac9-9182-ad457e70188a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.871 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:17 compute-1 kernel: tapa145b225-50: left promiscuous mode
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.880 2 INFO nova.virt.libvirt.driver [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Deletion of /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_del complete
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.893 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[eca38c3d-674e-4c88-ad21-f21afe94dd30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.893 2 DEBUG nova.compute.manager [req-7594b064-a977-4c56-a780-78f9f73644cb req-05fa62dc-fdca-4d6a-933a-d92c52e4a13e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.894 2 DEBUG oslo_concurrency.lockutils [req-7594b064-a977-4c56-a780-78f9f73644cb req-05fa62dc-fdca-4d6a-933a-d92c52e4a13e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.894 2 DEBUG oslo_concurrency.lockutils [req-7594b064-a977-4c56-a780-78f9f73644cb req-05fa62dc-fdca-4d6a-933a-d92c52e4a13e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.894 2 DEBUG oslo_concurrency.lockutils [req-7594b064-a977-4c56-a780-78f9f73644cb req-05fa62dc-fdca-4d6a-933a-d92c52e4a13e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.894 2 DEBUG nova.compute.manager [req-7594b064-a977-4c56-a780-78f9f73644cb req-05fa62dc-fdca-4d6a-933a-d92c52e4a13e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.895 2 DEBUG nova.compute.manager [req-7594b064-a977-4c56-a780-78f9f73644cb req-05fa62dc-fdca-4d6a-933a-d92c52e4a13e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.941 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[30f38803-f7c5-47f1-b07f-a75359fec6aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.943 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[30986a30-8dcc-4f4c-a12f-f5cf258c497c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.958 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a08c379b-0919-459b-9b67-ec24f287a1f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456367, 'reachable_time': 44521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233628, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.959 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:31:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:17.960 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[9e98d1ef-510e-4c07-a4ad-3e919dcdb2d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:17 compute-1 systemd[1]: run-netns-ovnmeta\x2da145b225\x2d510f\x2d43a7\x2d8cc6\x2dfccae3ed647e.mount: Deactivated successfully.
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.967 2 INFO nova.compute.manager [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.967 2 DEBUG oslo.service.loopingcall [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.968 2 DEBUG nova.compute.manager [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:31:17 compute-1 nova_compute[192795]: 2025-09-30 21:31:17.968 2 DEBUG nova.network.neutron [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.092 2 DEBUG nova.network.neutron [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.132 2 INFO nova.compute.manager [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Took 1.16 seconds to deallocate network for instance.
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.213 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.213 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.214 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.214 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.215 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.237 2 INFO nova.compute.manager [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Terminating instance
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.242 2 DEBUG nova.compute.manager [req-7f5dbd46-cb53-4fa9-b42a-500be3e6d5dd req-1edea4a7-73f3-4f3a-9400-f0d8a6222f91 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-deleted-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.265 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "refresh_cache-f54ded8f-9992-46a9-af52-0cfa1b80a50a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.265 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquired lock "refresh_cache-f54ded8f-9992-46a9-af52-0cfa1b80a50a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.266 2 DEBUG nova.network.neutron [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.269 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.270 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.283 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.013s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.330 2 INFO nova.scheduler.client.report [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Deleted allocations for instance c3cd73be-ae82-4c19-8ab7-ec9b06134032
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.491 2 DEBUG oslo_concurrency.lockutils [None req-d63ddd5d-07f0-4ef1-8fb2-72f04727ee83 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.549 2 DEBUG nova.network.neutron [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.852 2 DEBUG nova.network.neutron [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.887 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Releasing lock "refresh_cache-f54ded8f-9992-46a9-af52-0cfa1b80a50a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.888 2 DEBUG nova.compute.manager [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:31:19 compute-1 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Sep 30 21:31:19 compute-1 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004f.scope: Consumed 15.607s CPU time.
Sep 30 21:31:19 compute-1 systemd-machined[152783]: Machine qemu-38-instance-0000004f terminated.
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.991 2 DEBUG nova.compute.manager [req-eaa88f71-6849-47d6-a352-0e09e34053fb req-79f49cdd-5e74-4f98-a733-3ea939c4e349 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.991 2 DEBUG oslo_concurrency.lockutils [req-eaa88f71-6849-47d6-a352-0e09e34053fb req-79f49cdd-5e74-4f98-a733-3ea939c4e349 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.991 2 DEBUG oslo_concurrency.lockutils [req-eaa88f71-6849-47d6-a352-0e09e34053fb req-79f49cdd-5e74-4f98-a733-3ea939c4e349 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.991 2 DEBUG oslo_concurrency.lockutils [req-eaa88f71-6849-47d6-a352-0e09e34053fb req-79f49cdd-5e74-4f98-a733-3ea939c4e349 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.992 2 DEBUG nova.compute.manager [req-eaa88f71-6849-47d6-a352-0e09e34053fb req-79f49cdd-5e74-4f98-a733-3ea939c4e349 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:19 compute-1 nova_compute[192795]: 2025-09-30 21:31:19.992 2 WARNING nova.compute.manager [req-eaa88f71-6849-47d6-a352-0e09e34053fb req-79f49cdd-5e74-4f98-a733-3ea939c4e349 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state deleted and task_state None.
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.152 2 INFO nova.virt.libvirt.driver [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Instance destroyed successfully.
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.153 2 DEBUG nova.objects.instance [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lazy-loading 'resources' on Instance uuid f54ded8f-9992-46a9-af52-0cfa1b80a50a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.168 2 INFO nova.virt.libvirt.driver [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Deleting instance files /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a_del
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.169 2 INFO nova.virt.libvirt.driver [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Deletion of /var/lib/nova/instances/f54ded8f-9992-46a9-af52-0cfa1b80a50a_del complete
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.250 2 INFO nova.compute.manager [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.251 2 DEBUG oslo.service.loopingcall [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.252 2 DEBUG nova.compute.manager [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.252 2 DEBUG nova.network.neutron [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.722 2 DEBUG nova.network.neutron [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.742 2 DEBUG nova.network.neutron [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.760 2 INFO nova.compute.manager [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Took 0.51 seconds to deallocate network for instance.
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.843 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.844 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.927 2 DEBUG nova.compute.provider_tree [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.954 2 DEBUG nova.scheduler.client.report [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:31:20 compute-1 nova_compute[192795]: 2025-09-30 21:31:20.984 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:21 compute-1 nova_compute[192795]: 2025-09-30 21:31:21.010 2 INFO nova.scheduler.client.report [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Deleted allocations for instance f54ded8f-9992-46a9-af52-0cfa1b80a50a
Sep 30 21:31:21 compute-1 nova_compute[192795]: 2025-09-30 21:31:21.109 2 DEBUG oslo_concurrency.lockutils [None req-646a76a6-9505-4770-a3f5-d74eeaaca66c 5a7a790edd5341f99e0eea28163b2823 a29c3539646b4519bc06a42b7e32df43 - - default default] Lock "f54ded8f-9992-46a9-af52-0cfa1b80a50a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:22 compute-1 nova_compute[192795]: 2025-09-30 21:31:22.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:22 compute-1 nova_compute[192795]: 2025-09-30 21:31:22.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:22 compute-1 nova_compute[192795]: 2025-09-30 21:31:22.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:22 compute-1 nova_compute[192795]: 2025-09-30 21:31:22.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:23 compute-1 podman[233639]: 2025-09-30 21:31:23.238226819 +0000 UTC m=+0.068618201 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:31:23 compute-1 podman[233638]: 2025-09-30 21:31:23.259516547 +0000 UTC m=+0.097144545 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:31:23 compute-1 podman[233673]: 2025-09-30 21:31:23.327980142 +0000 UTC m=+0.068301522 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal)
Sep 30 21:31:27 compute-1 nova_compute[192795]: 2025-09-30 21:31:27.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:27 compute-1 nova_compute[192795]: 2025-09-30 21:31:27.639 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267872.637946, b2a2285f-4f58-421c-a234-d42cebc7e645 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:27 compute-1 nova_compute[192795]: 2025-09-30 21:31:27.639 2 INFO nova.compute.manager [-] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] VM Stopped (Lifecycle Event)
Sep 30 21:31:27 compute-1 nova_compute[192795]: 2025-09-30 21:31:27.660 2 DEBUG nova.compute.manager [None req-6766ed1a-0a98-4ac4-8828-94eabfd66c29 - - - - - -] [instance: b2a2285f-4f58-421c-a234-d42cebc7e645] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:27 compute-1 nova_compute[192795]: 2025-09-30 21:31:27.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:32 compute-1 nova_compute[192795]: 2025-09-30 21:31:32.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:32 compute-1 nova_compute[192795]: 2025-09-30 21:31:32.828 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267877.826808, c3cd73be-ae82-4c19-8ab7-ec9b06134032 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:32 compute-1 nova_compute[192795]: 2025-09-30 21:31:32.829 2 INFO nova.compute.manager [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] VM Stopped (Lifecycle Event)
Sep 30 21:31:32 compute-1 nova_compute[192795]: 2025-09-30 21:31:32.853 2 DEBUG nova.compute.manager [None req-683fd219-14b6-4915-bdc6-45540ce041b8 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:32 compute-1 nova_compute[192795]: 2025-09-30 21:31:32.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:34 compute-1 podman[233696]: 2025-09-30 21:31:34.219084504 +0000 UTC m=+0.063578054 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:31:35 compute-1 nova_compute[192795]: 2025-09-30 21:31:35.151 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267880.1484952, f54ded8f-9992-46a9-af52-0cfa1b80a50a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:35 compute-1 nova_compute[192795]: 2025-09-30 21:31:35.152 2 INFO nova.compute.manager [-] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] VM Stopped (Lifecycle Event)
Sep 30 21:31:35 compute-1 nova_compute[192795]: 2025-09-30 21:31:35.180 2 DEBUG nova.compute.manager [None req-9b40ca17-383e-45bd-a4b8-adc5f49f2e99 - - - - - -] [instance: f54ded8f-9992-46a9-af52-0cfa1b80a50a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:37 compute-1 nova_compute[192795]: 2025-09-30 21:31:37.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:37 compute-1 nova_compute[192795]: 2025-09-30 21:31:37.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:38.691 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:38.692 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:38.692 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:42 compute-1 nova_compute[192795]: 2025-09-30 21:31:42.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:42 compute-1 podman[233717]: 2025-09-30 21:31:42.634362211 +0000 UTC m=+0.069783322 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:42 compute-1 podman[233716]: 2025-09-30 21:31:42.652252726 +0000 UTC m=+0.093255008 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:31:42 compute-1 podman[233760]: 2025-09-30 21:31:42.738478623 +0000 UTC m=+0.087667197 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller)
Sep 30 21:31:42 compute-1 nova_compute[192795]: 2025-09-30 21:31:42.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.749 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.749 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.749 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.749 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:31:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:45.764 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:45.765 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.920 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.922 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5725MB free_disk=73.3858413696289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.922 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:45 compute-1 nova_compute[192795]: 2025-09-30 21:31:45.922 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.005 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.006 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.044 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.080 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.081 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.103 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.156 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.180 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.195 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.216 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:31:46 compute-1 nova_compute[192795]: 2025-09-30 21:31:46.217 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:47 compute-1 podman[233788]: 2025-09-30 21:31:47.23532812 +0000 UTC m=+0.074787898 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm)
Sep 30 21:31:47 compute-1 nova_compute[192795]: 2025-09-30 21:31:47.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:47.768 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:47 compute-1 nova_compute[192795]: 2025-09-30 21:31:47.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.217 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.234 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.234 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.251 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.342 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.343 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.350 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.350 2 INFO nova.compute.claims [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.481 2 DEBUG nova.compute.provider_tree [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.497 2 DEBUG nova.scheduler.client.report [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.532 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.534 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.601 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.602 2 DEBUG nova.network.neutron [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.626 2 INFO nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.656 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.832 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.835 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.836 2 INFO nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Creating image(s)
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.837 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.838 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.839 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.866 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.931 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.933 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.934 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:48 compute-1 nova_compute[192795]: 2025-09-30 21:31:48.961 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.026 2 DEBUG nova.policy [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.034 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.036 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.096 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.097 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.098 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.161 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.164 2 DEBUG nova.virt.disk.api [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Checking if we can resize image /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.165 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.232 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.234 2 DEBUG nova.virt.disk.api [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Cannot resize image /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.235 2 DEBUG nova.objects.instance [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'migration_context' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.297 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.298 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Ensure instance console log exists: /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.299 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.300 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.300 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:49 compute-1 nova_compute[192795]: 2025-09-30 21:31:49.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:31:50 compute-1 nova_compute[192795]: 2025-09-30 21:31:50.482 2 DEBUG nova.network.neutron [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Successfully created port: d8844f4f-b484-4605-8f20-0bb8b7a50471 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:31:51 compute-1 nova_compute[192795]: 2025-09-30 21:31:51.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.079 2 DEBUG nova.network.neutron [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Successfully updated port: d8844f4f-b484-4605-8f20-0bb8b7a50471 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.101 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.102 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.102 2 DEBUG nova.network.neutron [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.229 2 DEBUG nova.compute.manager [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-changed-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.230 2 DEBUG nova.compute.manager [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Refreshing instance network info cache due to event network-changed-d8844f4f-b484-4605-8f20-0bb8b7a50471. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.230 2 DEBUG oslo_concurrency.lockutils [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.304 2 DEBUG nova.network.neutron [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:52 compute-1 nova_compute[192795]: 2025-09-30 21:31:52.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.314 2 DEBUG nova.network.neutron [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.367 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.368 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance network_info: |[{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.368 2 DEBUG oslo_concurrency.lockutils [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.369 2 DEBUG nova.network.neutron [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Refreshing network info cache for port d8844f4f-b484-4605-8f20-0bb8b7a50471 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.376 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Start _get_guest_xml network_info=[{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.384 2 WARNING nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.390 2 DEBUG nova.virt.libvirt.host [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.391 2 DEBUG nova.virt.libvirt.host [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.395 2 DEBUG nova.virt.libvirt.host [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.396 2 DEBUG nova.virt.libvirt.host [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.397 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.397 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.397 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.398 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.398 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.398 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.398 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.398 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.399 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.399 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.399 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.399 2 DEBUG nova.virt.hardware [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.403 2 DEBUG nova.virt.libvirt.vif [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:31:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.403 2 DEBUG nova.network.os_vif_util [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.404 2 DEBUG nova.network.os_vif_util [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.405 2 DEBUG nova.objects.instance [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'pci_devices' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.430 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <uuid>c783547f-0799-4e53-8cdc-8784800b3c2d</uuid>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <name>instance-00000054</name>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestOtherB-server-2076746266</nova:name>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:31:53</nova:creationTime>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:user uuid="b9b3e9f2523944539f57a1ff5d565cb4">tempest-ServerActionsTestOtherB-463525410-project-member</nova:user>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:project uuid="d876c85b6ca5418eb657e48391a6503b">tempest-ServerActionsTestOtherB-463525410</nova:project>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         <nova:port uuid="d8844f4f-b484-4605-8f20-0bb8b7a50471">
Sep 30 21:31:53 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <system>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <entry name="serial">c783547f-0799-4e53-8cdc-8784800b3c2d</entry>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <entry name="uuid">c783547f-0799-4e53-8cdc-8784800b3c2d</entry>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </system>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <os>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   </os>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <features>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   </features>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:84:18:3c"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <target dev="tapd8844f4f-b4"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/console.log" append="off"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <video>
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </video>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:31:53 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:31:53 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:31:53 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:31:53 compute-1 nova_compute[192795]: </domain>
Sep 30 21:31:53 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.431 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Preparing to wait for external event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.433 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.433 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.434 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.435 2 DEBUG nova.virt.libvirt.vif [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:31:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.436 2 DEBUG nova.network.os_vif_util [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.437 2 DEBUG nova.network.os_vif_util [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.438 2 DEBUG os_vif [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.440 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.440 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.446 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8844f4f-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.447 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd8844f4f-b4, col_values=(('external_ids', {'iface-id': 'd8844f4f-b484-4605-8f20-0bb8b7a50471', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:18:3c', 'vm-uuid': 'c783547f-0799-4e53-8cdc-8784800b3c2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:53 compute-1 NetworkManager[51724]: <info>  [1759267913.4826] manager: (tapd8844f4f-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.495 2 INFO os_vif [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4')
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.555 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.556 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.556 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No VIF found with MAC fa:16:3e:84:18:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.556 2 INFO nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Using config drive
Sep 30 21:31:53 compute-1 nova_compute[192795]: 2025-09-30 21:31:53.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.112 2 INFO nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Creating config drive at /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.126 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2zptncfh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:54 compute-1 podman[233826]: 2025-09-30 21:31:54.245949474 +0000 UTC m=+0.078740466 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.270 2 DEBUG oslo_concurrency.processutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2zptncfh" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:54 compute-1 podman[233827]: 2025-09-30 21:31:54.273435178 +0000 UTC m=+0.092020305 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:31:54 compute-1 podman[233828]: 2025-09-30 21:31:54.297025288 +0000 UTC m=+0.112028728 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:31:54 compute-1 kernel: tapd8844f4f-b4: entered promiscuous mode
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:54 compute-1 NetworkManager[51724]: <info>  [1759267914.3460] manager: (tapd8844f4f-b4): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Sep 30 21:31:54 compute-1 ovn_controller[94902]: 2025-09-30T21:31:54Z|00321|binding|INFO|Claiming lport d8844f4f-b484-4605-8f20-0bb8b7a50471 for this chassis.
Sep 30 21:31:54 compute-1 ovn_controller[94902]: 2025-09-30T21:31:54Z|00322|binding|INFO|d8844f4f-b484-4605-8f20-0bb8b7a50471: Claiming fa:16:3e:84:18:3c 10.100.0.5
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.364 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:18:3c 10.100.0.5'], port_security=['fa:16:3e:84:18:3c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d444175-4bc9-45e9-8b74-e4555ef5d88b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d8844f4f-b484-4605-8f20-0bb8b7a50471) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.365 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d8844f4f-b484-4605-8f20-0bb8b7a50471 in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 bound to our chassis
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.366 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.383 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6575ee-22b6-499e-be61-d0c1a5669f1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.383 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap91c84c55-91 in ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.386 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap91c84c55-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.386 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[74bffdc2-d090-447a-9781-a23a8c2d2835]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.387 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a40595c7-bfb4-4e6b-8d4d-723c56bbd22f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 systemd-udevd[233903]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:31:54 compute-1 systemd-machined[152783]: New machine qemu-41-instance-00000054.
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.398 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[49a7932a-c06d-4068-85fd-478373056fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:54 compute-1 systemd[1]: Started Virtual Machine qemu-41-instance-00000054.
Sep 30 21:31:54 compute-1 NetworkManager[51724]: <info>  [1759267914.4155] device (tapd8844f4f-b4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:31:54 compute-1 ovn_controller[94902]: 2025-09-30T21:31:54Z|00323|binding|INFO|Setting lport d8844f4f-b484-4605-8f20-0bb8b7a50471 ovn-installed in OVS
Sep 30 21:31:54 compute-1 ovn_controller[94902]: 2025-09-30T21:31:54Z|00324|binding|INFO|Setting lport d8844f4f-b484-4605-8f20-0bb8b7a50471 up in Southbound
Sep 30 21:31:54 compute-1 NetworkManager[51724]: <info>  [1759267914.4168] device (tapd8844f4f-b4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.430 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c44d5325-abf5-44ef-8bcf-fe6bc510cd23]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.466 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ed180b90-144c-49d8-a34f-5e6846c94f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 NetworkManager[51724]: <info>  [1759267914.4756] manager: (tap91c84c55-90): new Veth device (/org/freedesktop/NetworkManager/Devices/161)
Sep 30 21:31:54 compute-1 systemd-udevd[233907]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.475 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b09965-3868-49a7-b286-cd29654d0e80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.521 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9bfd5ff6-9a8b-4f8e-a5ec-cdebbab3750f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.525 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ff6bb9-1cd9-426f-a4e8-b5126501e3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 NetworkManager[51724]: <info>  [1759267914.5526] device (tap91c84c55-90): carrier: link connected
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.559 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[82f09607-1645-417b-829a-3b0d5c46b695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.585 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[50e52ced-8523-4ea1-b146-98f89dac95cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461022, 'reachable_time': 33677, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233935, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.613 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ffbd795b-ab2f-4c43-962e-5ba16c5e5593]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:a7ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461022, 'tstamp': 461022}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233936, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.647 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e442e5-6c58-4613-80fa-7e46f7810475]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461022, 'reachable_time': 33677, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233937, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.692 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f18d69dc-c648-451b-a656-4b473887bd5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.766 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[80bbc9b3-225b-4ca0-a20d-026cdf8ffcd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.768 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.768 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.769 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91c84c55-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:54 compute-1 kernel: tap91c84c55-90: entered promiscuous mode
Sep 30 21:31:54 compute-1 NetworkManager[51724]: <info>  [1759267914.7729] manager: (tap91c84c55-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.777 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap91c84c55-90, col_values=(('external_ids', {'iface-id': '3996e682-c20c-41c5-9547-9688a18f316c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:54 compute-1 ovn_controller[94902]: 2025-09-30T21:31:54Z|00325|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.781 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.782 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a0a25e-cefe-416f-a3c6-b1df6fab55b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.783 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:31:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:31:54.784 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'env', 'PROCESS_TAG=haproxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/91c84c55-96ab-4682-a6e7-9e96514ca8a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:31:54 compute-1 nova_compute[192795]: 2025-09-30 21:31:54.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:55 compute-1 podman[233976]: 2025-09-30 21:31:55.139926154 +0000 UTC m=+0.057647323 container create 252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:31:55 compute-1 systemd[1]: Started libpod-conmon-252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9.scope.
Sep 30 21:31:55 compute-1 podman[233976]: 2025-09-30 21:31:55.104517305 +0000 UTC m=+0.022238484 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:31:55 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:31:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9932ef3db0aa37a4a2c882386dfd08d43666d58cf8f2fb1437ae89c0001ce061/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:31:55 compute-1 podman[233976]: 2025-09-30 21:31:55.248847167 +0000 UTC m=+0.166568346 container init 252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:31:55 compute-1 podman[233976]: 2025-09-30 21:31:55.255382683 +0000 UTC m=+0.173103842 container start 252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.257 2 DEBUG nova.network.neutron [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updated VIF entry in instance network info cache for port d8844f4f-b484-4605-8f20-0bb8b7a50471. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.258 2 DEBUG nova.network.neutron [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:55 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [NOTICE]   (233995) : New worker (233997) forked
Sep 30 21:31:55 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [NOTICE]   (233995) : Loading success.
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.279 2 DEBUG oslo_concurrency.lockutils [req-3ee6399c-ccc7-4a63-9823-cc1f847a8fca req-7190d2ce-72c2-4f83-bc91-699bceacb30d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.361 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267915.3610666, c783547f-0799-4e53-8cdc-8784800b3c2d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.362 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] VM Started (Lifecycle Event)
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.385 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.391 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267915.3611395, c783547f-0799-4e53-8cdc-8784800b3c2d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.392 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] VM Paused (Lifecycle Event)
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.414 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.417 2 DEBUG nova.compute.manager [req-639f4ece-d269-4487-9023-2756493ae784 req-d5f67b90-6bc6-4677-ae1c-afcfc7a0da6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.418 2 DEBUG oslo_concurrency.lockutils [req-639f4ece-d269-4487-9023-2756493ae784 req-d5f67b90-6bc6-4677-ae1c-afcfc7a0da6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.418 2 DEBUG oslo_concurrency.lockutils [req-639f4ece-d269-4487-9023-2756493ae784 req-d5f67b90-6bc6-4677-ae1c-afcfc7a0da6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.419 2 DEBUG oslo_concurrency.lockutils [req-639f4ece-d269-4487-9023-2756493ae784 req-d5f67b90-6bc6-4677-ae1c-afcfc7a0da6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.419 2 DEBUG nova.compute.manager [req-639f4ece-d269-4487-9023-2756493ae784 req-d5f67b90-6bc6-4677-ae1c-afcfc7a0da6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Processing event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.420 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.423 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.424 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.428 2 INFO nova.virt.libvirt.driver [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance spawned successfully.
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.428 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.447 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.448 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.449 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.449 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.450 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.450 2 DEBUG nova.virt.libvirt.driver [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.455 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.456 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267915.42302, c783547f-0799-4e53-8cdc-8784800b3c2d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.456 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] VM Resumed (Lifecycle Event)
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.484 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.490 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.518 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.537 2 INFO nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Took 6.70 seconds to spawn the instance on the hypervisor.
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.538 2 DEBUG nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.626 2 INFO nova.compute.manager [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Took 7.31 seconds to build instance.
Sep 30 21:31:55 compute-1 nova_compute[192795]: 2025-09-30 21:31:55.652 2 DEBUG oslo_concurrency.lockutils [None req-45dc8ace-10d4-4acc-afa7-69bb7e5bac8a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.417s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:56 compute-1 nova_compute[192795]: 2025-09-30 21:31:56.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:56 compute-1 nova_compute[192795]: 2025-09-30 21:31:56.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:31:56 compute-1 nova_compute[192795]: 2025-09-30 21:31:56.725 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.545 2 DEBUG nova.compute.manager [req-d627c870-f00d-4ebe-abbb-6d95c455d9d7 req-64f5bd27-7143-49e0-a567-12d8ed56798d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.546 2 DEBUG oslo_concurrency.lockutils [req-d627c870-f00d-4ebe-abbb-6d95c455d9d7 req-64f5bd27-7143-49e0-a567-12d8ed56798d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.547 2 DEBUG oslo_concurrency.lockutils [req-d627c870-f00d-4ebe-abbb-6d95c455d9d7 req-64f5bd27-7143-49e0-a567-12d8ed56798d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.547 2 DEBUG oslo_concurrency.lockutils [req-d627c870-f00d-4ebe-abbb-6d95c455d9d7 req-64f5bd27-7143-49e0-a567-12d8ed56798d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.548 2 DEBUG nova.compute.manager [req-d627c870-f00d-4ebe-abbb-6d95c455d9d7 req-64f5bd27-7143-49e0-a567-12d8ed56798d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] No waiting events found dispatching network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.548 2 WARNING nova.compute.manager [req-d627c870-f00d-4ebe-abbb-6d95c455d9d7 req-64f5bd27-7143-49e0-a567-12d8ed56798d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received unexpected event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 for instance with vm_state active and task_state None.
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:57 compute-1 nova_compute[192795]: 2025-09-30 21:31:57.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:58 compute-1 nova_compute[192795]: 2025-09-30 21:31:58.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:58 compute-1 nova_compute[192795]: 2025-09-30 21:31:58.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:58 compute-1 NetworkManager[51724]: <info>  [1759267918.5299] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Sep 30 21:31:58 compute-1 NetworkManager[51724]: <info>  [1759267918.5324] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Sep 30 21:31:58 compute-1 nova_compute[192795]: 2025-09-30 21:31:58.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:58 compute-1 ovn_controller[94902]: 2025-09-30T21:31:58Z|00326|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:31:58 compute-1 nova_compute[192795]: 2025-09-30 21:31:58.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:58 compute-1 nova_compute[192795]: 2025-09-30 21:31:58.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:59 compute-1 nova_compute[192795]: 2025-09-30 21:31:59.701 2 DEBUG nova.compute.manager [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-changed-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:59 compute-1 nova_compute[192795]: 2025-09-30 21:31:59.702 2 DEBUG nova.compute.manager [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Refreshing instance network info cache due to event network-changed-d8844f4f-b484-4605-8f20-0bb8b7a50471. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:31:59 compute-1 nova_compute[192795]: 2025-09-30 21:31:59.702 2 DEBUG oslo_concurrency.lockutils [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:59 compute-1 nova_compute[192795]: 2025-09-30 21:31:59.703 2 DEBUG oslo_concurrency.lockutils [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:59 compute-1 nova_compute[192795]: 2025-09-30 21:31:59.703 2 DEBUG nova.network.neutron [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Refreshing network info cache for port d8844f4f-b484-4605-8f20-0bb8b7a50471 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:32:01 compute-1 nova_compute[192795]: 2025-09-30 21:32:01.860 2 DEBUG nova.network.neutron [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updated VIF entry in instance network info cache for port d8844f4f-b484-4605-8f20-0bb8b7a50471. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:32:01 compute-1 nova_compute[192795]: 2025-09-30 21:32:01.861 2 DEBUG nova.network.neutron [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:01 compute-1 nova_compute[192795]: 2025-09-30 21:32:01.885 2 DEBUG oslo_concurrency.lockutils [req-5ec7d922-e837-4603-a25e-e26afd05750c req-63fda45a-fa21-440f-ab88-4234937fa65a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:02 compute-1 ovn_controller[94902]: 2025-09-30T21:32:02Z|00327|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:32:02 compute-1 nova_compute[192795]: 2025-09-30 21:32:02.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:02 compute-1 nova_compute[192795]: 2025-09-30 21:32:02.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:03 compute-1 nova_compute[192795]: 2025-09-30 21:32:03.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:03 compute-1 nova_compute[192795]: 2025-09-30 21:32:03.690 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:05 compute-1 podman[234007]: 2025-09-30 21:32:05.236387689 +0000 UTC m=+0.075791265 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:32:07 compute-1 nova_compute[192795]: 2025-09-30 21:32:07.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:07 compute-1 nova_compute[192795]: 2025-09-30 21:32:07.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:08 compute-1 nova_compute[192795]: 2025-09-30 21:32:08.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:09 compute-1 ovn_controller[94902]: 2025-09-30T21:32:09Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:84:18:3c 10.100.0.5
Sep 30 21:32:09 compute-1 ovn_controller[94902]: 2025-09-30T21:32:09Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:84:18:3c 10.100.0.5
Sep 30 21:32:10 compute-1 nova_compute[192795]: 2025-09-30 21:32:10.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:12 compute-1 nova_compute[192795]: 2025-09-30 21:32:12.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:12 compute-1 nova_compute[192795]: 2025-09-30 21:32:12.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:13 compute-1 podman[234040]: 2025-09-30 21:32:13.238799114 +0000 UTC m=+0.077019018 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 21:32:13 compute-1 podman[234042]: 2025-09-30 21:32:13.275022066 +0000 UTC m=+0.093088154 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:32:13 compute-1 podman[234041]: 2025-09-30 21:32:13.313190821 +0000 UTC m=+0.139130382 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 21:32:13 compute-1 nova_compute[192795]: 2025-09-30 21:32:13.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:16 compute-1 nova_compute[192795]: 2025-09-30 21:32:16.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:17 compute-1 nova_compute[192795]: 2025-09-30 21:32:17.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.133 2 DEBUG nova.compute.manager [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.224 2 INFO nova.compute.manager [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] instance snapshotting
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.225 2 DEBUG nova.objects.instance [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'flavor' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:18 compute-1 podman[234106]: 2025-09-30 21:32:18.275843245 +0000 UTC m=+0.102199401 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.599 2 INFO nova.virt.libvirt.driver [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Beginning live snapshot process
Sep 30 21:32:18 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.900 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.989 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:18 compute-1 nova_compute[192795]: 2025-09-30 21:32:18.992 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.069 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.082 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.163 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.165 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprmcvoh2f/59143191eda4468ea076ba58f6ac2161.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.221 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprmcvoh2f/59143191eda4468ea076ba58f6ac2161.delta 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.222 2 INFO nova.virt.libvirt.driver [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.303 2 DEBUG nova.virt.libvirt.guest [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:32:19 compute-1 ovn_controller[94902]: 2025-09-30T21:32:19Z|00328|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.810 2 DEBUG nova.virt.libvirt.guest [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.814 2 INFO nova.virt.libvirt.driver [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.880 2 DEBUG nova.privsep.utils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:32:19 compute-1 nova_compute[192795]: 2025-09-30 21:32:19.882 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprmcvoh2f/59143191eda4468ea076ba58f6ac2161.delta /var/lib/nova/instances/snapshots/tmprmcvoh2f/59143191eda4468ea076ba58f6ac2161 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:20 compute-1 nova_compute[192795]: 2025-09-30 21:32:20.453 2 DEBUG oslo_concurrency.processutils [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprmcvoh2f/59143191eda4468ea076ba58f6ac2161.delta /var/lib/nova/instances/snapshots/tmprmcvoh2f/59143191eda4468ea076ba58f6ac2161" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:20 compute-1 nova_compute[192795]: 2025-09-30 21:32:20.461 2 INFO nova.virt.libvirt.driver [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Snapshot extracted, beginning image upload
Sep 30 21:32:22 compute-1 nova_compute[192795]: 2025-09-30 21:32:22.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:23 compute-1 nova_compute[192795]: 2025-09-30 21:32:23.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:23 compute-1 nova_compute[192795]: 2025-09-30 21:32:23.628 2 INFO nova.virt.libvirt.driver [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Snapshot image upload complete
Sep 30 21:32:23 compute-1 nova_compute[192795]: 2025-09-30 21:32:23.629 2 INFO nova.compute.manager [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Took 5.37 seconds to snapshot the instance on the hypervisor.
Sep 30 21:32:24 compute-1 nova_compute[192795]: 2025-09-30 21:32:24.045 2 DEBUG nova.compute.manager [None req-42bf598f-b4a9-483a-8843-54ecaadce27d b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Sep 30 21:32:25 compute-1 podman[234163]: 2025-09-30 21:32:25.251505972 +0000 UTC m=+0.088174941 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41)
Sep 30 21:32:25 compute-1 podman[234165]: 2025-09-30 21:32:25.251757538 +0000 UTC m=+0.079826804 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:32:25 compute-1 podman[234164]: 2025-09-30 21:32:25.252034586 +0000 UTC m=+0.083571816 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:32:26 compute-1 nova_compute[192795]: 2025-09-30 21:32:26.372 2 DEBUG nova.compute.manager [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:26 compute-1 nova_compute[192795]: 2025-09-30 21:32:26.462 2 INFO nova.compute.manager [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] instance snapshotting
Sep 30 21:32:26 compute-1 nova_compute[192795]: 2025-09-30 21:32:26.463 2 DEBUG nova.objects.instance [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'flavor' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:26 compute-1 nova_compute[192795]: 2025-09-30 21:32:26.858 2 INFO nova.virt.libvirt.driver [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Beginning live snapshot process
Sep 30 21:32:27 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.080 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.137 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.140 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.208 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.221 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.281 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.282 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp8o7h6t0r/3dfaea5e1b674e85b75d80af033cb66c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.322 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp8o7h6t0r/3dfaea5e1b674e85b75d80af033cb66c.delta 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.324 2 INFO nova.virt.libvirt.driver [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.378 2 DEBUG nova.virt.libvirt.guest [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.883 2 DEBUG nova.virt.libvirt.guest [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.888 2 INFO nova.virt.libvirt.driver [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.928 2 DEBUG nova.privsep.utils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:32:27 compute-1 nova_compute[192795]: 2025-09-30 21:32:27.929 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp8o7h6t0r/3dfaea5e1b674e85b75d80af033cb66c.delta /var/lib/nova/instances/snapshots/tmp8o7h6t0r/3dfaea5e1b674e85b75d80af033cb66c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:28 compute-1 nova_compute[192795]: 2025-09-30 21:32:28.304 2 DEBUG oslo_concurrency.processutils [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp8o7h6t0r/3dfaea5e1b674e85b75d80af033cb66c.delta /var/lib/nova/instances/snapshots/tmp8o7h6t0r/3dfaea5e1b674e85b75d80af033cb66c" returned: 0 in 0.375s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:28 compute-1 nova_compute[192795]: 2025-09-30 21:32:28.310 2 INFO nova.virt.libvirt.driver [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Snapshot extracted, beginning image upload
Sep 30 21:32:28 compute-1 nova_compute[192795]: 2025-09-30 21:32:28.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:28 compute-1 nova_compute[192795]: 2025-09-30 21:32:28.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:31 compute-1 nova_compute[192795]: 2025-09-30 21:32:31.058 2 INFO nova.virt.libvirt.driver [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Snapshot image upload complete
Sep 30 21:32:31 compute-1 nova_compute[192795]: 2025-09-30 21:32:31.059 2 INFO nova.compute.manager [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Took 4.50 seconds to snapshot the instance on the hypervisor.
Sep 30 21:32:31 compute-1 nova_compute[192795]: 2025-09-30 21:32:31.400 2 DEBUG nova.compute.manager [None req-48323e2e-f846-4a9f-845b-ced055473f53 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Sep 30 21:32:32 compute-1 nova_compute[192795]: 2025-09-30 21:32:32.599 2 DEBUG nova.compute.manager [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:32 compute-1 nova_compute[192795]: 2025-09-30 21:32:32.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:32 compute-1 nova_compute[192795]: 2025-09-30 21:32:32.716 2 INFO nova.compute.manager [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] instance snapshotting
Sep 30 21:32:32 compute-1 nova_compute[192795]: 2025-09-30 21:32:32.717 2 DEBUG nova.objects.instance [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'flavor' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.321 2 INFO nova.virt.libvirt.driver [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Beginning live snapshot process
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:33 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.581 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.674 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.675 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.732 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.746 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.803 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.804 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpgtua55fn/3bd46517c0e14aaca78d077983e7879e.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.854 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpgtua55fn/3bd46517c0e14aaca78d077983e7879e.delta 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.855 2 INFO nova.virt.libvirt.driver [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:32:33 compute-1 nova_compute[192795]: 2025-09-30 21:32:33.928 2 DEBUG nova.virt.libvirt.guest [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:32:34 compute-1 nova_compute[192795]: 2025-09-30 21:32:34.431 2 DEBUG nova.virt.libvirt.guest [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:32:34 compute-1 nova_compute[192795]: 2025-09-30 21:32:34.437 2 INFO nova.virt.libvirt.driver [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:32:34 compute-1 nova_compute[192795]: 2025-09-30 21:32:34.472 2 DEBUG nova.privsep.utils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:32:34 compute-1 nova_compute[192795]: 2025-09-30 21:32:34.473 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpgtua55fn/3bd46517c0e14aaca78d077983e7879e.delta /var/lib/nova/instances/snapshots/tmpgtua55fn/3bd46517c0e14aaca78d077983e7879e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:34 compute-1 nova_compute[192795]: 2025-09-30 21:32:34.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:34 compute-1 nova_compute[192795]: 2025-09-30 21:32:34.844 2 DEBUG oslo_concurrency.processutils [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpgtua55fn/3bd46517c0e14aaca78d077983e7879e.delta /var/lib/nova/instances/snapshots/tmpgtua55fn/3bd46517c0e14aaca78d077983e7879e" returned: 0 in 0.371s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:34 compute-1 nova_compute[192795]: 2025-09-30 21:32:34.850 2 INFO nova.virt.libvirt.driver [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Snapshot extracted, beginning image upload
Sep 30 21:32:36 compute-1 podman[234282]: 2025-09-30 21:32:36.220510275 +0000 UTC m=+0.062971238 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:32:37 compute-1 nova_compute[192795]: 2025-09-30 21:32:37.591 2 INFO nova.virt.libvirt.driver [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Snapshot image upload complete
Sep 30 21:32:37 compute-1 nova_compute[192795]: 2025-09-30 21:32:37.592 2 INFO nova.compute.manager [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Took 4.85 seconds to snapshot the instance on the hypervisor.
Sep 30 21:32:37 compute-1 nova_compute[192795]: 2025-09-30 21:32:37.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:38 compute-1 nova_compute[192795]: 2025-09-30 21:32:38.105 2 DEBUG nova.compute.manager [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Sep 30 21:32:38 compute-1 nova_compute[192795]: 2025-09-30 21:32:38.106 2 DEBUG nova.compute.manager [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458
Sep 30 21:32:38 compute-1 nova_compute[192795]: 2025-09-30 21:32:38.106 2 DEBUG nova.compute.manager [None req-be12e599-9fd6-4297-8111-974258051d20 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Deleting image e448c925-6500-4c58-891a-0fb7db556308 _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463
Sep 30 21:32:38 compute-1 nova_compute[192795]: 2025-09-30 21:32:38.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:32:38.692 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:32:38.693 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:32:38.693 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:40 compute-1 ovn_controller[94902]: 2025-09-30T21:32:40Z|00329|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:32:40 compute-1 nova_compute[192795]: 2025-09-30 21:32:40.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:42 compute-1 nova_compute[192795]: 2025-09-30 21:32:42.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:43 compute-1 nova_compute[192795]: 2025-09-30 21:32:43.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.025 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000054', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd876c85b6ca5418eb657e48391a6503b', 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'hostId': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.030 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c783547f-0799-4e53-8cdc-8784800b3c2d / tapd8844f4f-b4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.030 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7441710f-bf60-462c-a7c8-9d8afaabc487', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.026472', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '001ce47e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': 'eae0a4c11bbb69e082472b1e8280dfa2869f443a1ff2e1762e41d54f2e94ce4b'}]}, 'timestamp': '2025-09-30 21:32:44.031455', '_unique_id': 'ac8c2c10cdf64d97ba9cd166d50da940'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.033 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.035 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '444220f6-6495-49df-a0c7-17a3a9f646ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.035157', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '001d8f00-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': '98ae932f3ce003fc8870d457bca007d4869fe07f81de128fe79577cd1671c371'}]}, 'timestamp': '2025-09-30 21:32:44.035703', '_unique_id': '22f8235430554d8da6a0d30c6717d5d1'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.036 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.037 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.062 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.read.latency volume: 591820107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.063 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.read.latency volume: 59999033 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4433b0b6-a420-431d-bca9-0c8a70b406cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 591820107, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.038095', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0021d452-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '268a165ef58093cfdc7a7b4cce58639cee6cba5e04fa6f08e533bc5359748a0b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59999033, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': 
None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.038095', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0021e758-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '4cb9d0f4df7ec7523ff505e2aa426c20d3119f9fa08b140dec4ef09097f4b815'}]}, 'timestamp': '2025-09-30 21:32:44.064120', '_unique_id': '66f5a2168c8147ce84901ca882cae18c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.065 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.067 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.067 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edc62f22-7d1b-4195-b099-318c1d80a273', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.067183', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '002270a6-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '22fc0443ff9f34d30ed6c498ee4b27253307c4f6cef64484fe57b38aeb7dbd8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': 
None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.067183', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00227fe2-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': 'd0265428ed826a740132529d2ff5045d6b74a2d4e58283da529219bfa2afd6ce'}]}, 'timestamp': '2025-09-30 21:32:44.067998', '_unique_id': '9732359fb8cf42a99a684aec966ca628'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.068 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.069 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.086 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.087 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '233c1b73-7acb-4473-bc20-7a350bf769f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.069661', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0025646e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.802507976, 'message_signature': 'e581d164a283e8397a5760f6c92b09fd13ec12c409064f0d4f281828e7b162a8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.069661', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0025803e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.802507976, 'message_signature': '6e284a60b78e5067355abe8882826d53ea07b73fab1c8b2344f0398ddf23bb6b'}]}, 'timestamp': '2025-09-30 21:32:44.087926', '_unique_id': 'b2b4db8c0f6a4983acc593c32eff5783'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.090 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.091 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.092 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>]
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.092 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bc76fad-c6d9-46bd-9681-55b3676bf42c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.092756', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '00265996-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': 'c1ca77f1e4d7eec09d8750920db2d8833f44c29e191e282a67b409ae7f81878a'}]}, 'timestamp': '2025-09-30 21:32:44.093361', '_unique_id': 'b0dc2d60380448a99a88296b35f9ebf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.094 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.096 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.096 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86ed8e18-351d-482a-82b5-0d3ec0497236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.095984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0026d6e6-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.802507976, 'message_signature': '9782e56d4627ad65d4913da0daf2be796803c567344aee59ac8f094ab2cfb9f9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.095984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0026ead2-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.802507976, 'message_signature': '1bb8713204910c06e0321ef874c3a45ebcef2dbcd19045f31ea72e36b81787ea'}]}, 'timestamp': '2025-09-30 21:32:44.097003', '_unique_id': '9422df37ee41425c92cb12bfd605b297'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.099 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.099 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>]
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.100 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.write.bytes volume: 73015296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.100 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bd58805-5e35-409f-b836-709ec857383a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73015296, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.100378', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '002782da-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '8b7b2260a8d33039cf6f01530ab446b0fa55c6367ef78850108d6a37db56f996'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 
'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.100378', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '002794dc-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': 'f4cec33d6e10cad853ade9978bfd0a27c6e979574dd0d605ca32874a48dccf7b'}]}, 'timestamp': '2025-09-30 21:32:44.101396', '_unique_id': '4426fdf124b04623b21d86fa62855cc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.104 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.write.latency volume: 4006159917 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.104 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96c76e8b-968b-4563-8efc-7325c3027558', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4006159917, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.104134', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00281600-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '209598126f45237ff5cccce92bd5ae394e3c99e356ff2bc0499fa647bc84ae75'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 
'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.104134', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00282a0a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '16d8475b367090a28aa7f1b5d66c36f8b34bd994bc29c595485ec762a9be3794'}]}, 'timestamp': '2025-09-30 21:32:44.105228', '_unique_id': 'f77689450b524d958ea6bd09ddcf090f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.106 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.108 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.write.requests volume: 291 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.108 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6f027d7-958a-433a-8d09-a2c752156262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 291, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.107977', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0028ae94-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': 'a39505c312fc984630a371abb48e403e461178d6f48bc316f23389b0df694171'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.107977', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0028c12c-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '95be02a52abfbe99cb4a4eaa46f469e7f593d341fcae7684b5b981b2712442db'}]}, 'timestamp': '2025-09-30 21:32:44.109038', '_unique_id': '240ca861665b496697eea15a28f65488'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.111 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.incoming.bytes volume: 4495 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c48436f1-fbed-42c5-b49b-7c1541b1319a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4495, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.111799', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '002940e8-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': '1cac5b9e51bb530703692b1c393c7532593e04d6dfb400154f0e226d1b539d78'}]}, 'timestamp': '2025-09-30 21:32:44.112392', '_unique_id': '68112175fca8496fa6cfd608d58c5589'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.115 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd89e3e04-6138-42fe-af22-6952ed32bc5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.115229', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '0029c7a2-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': 'f5252e313888d72158b40bc2ad4b5401cda71661afbb4cbc3bb489d0099d5f12'}]}, 'timestamp': '2025-09-30 21:32:44.115802', '_unique_id': 'f0fbbbd35c0f4d99acbe8caeb84806ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.118 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f470bc32-3b09-4968-9cda-b514a1467cee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.118411', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '002a4330-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': 'f82e8485dff20c7b9aa0467ee927d2719ddf7ede8dfa5293160a073c19f3d6e4'}]}, 'timestamp': '2025-09-30 21:32:44.118950', '_unique_id': 'e30c1f8ae7bf47c09a66367abd9323a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.121 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.read.bytes volume: 31025664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.121 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb7f9af1-35da-45a7-8d76-538fe17c7db0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31025664, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.121028', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '002aa410-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': 'eda99a3829319c1f4930f5a625d52240e57ceb41e402b9b76557f64257793859'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.121028', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '002ab1f8-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.770882649, 'message_signature': '2756e44fb1ac733522339905e4471b1fd82171980a26d1b7b95e86d9326a9c81'}]}, 'timestamp': '2025-09-30 21:32:44.121680', '_unique_id': '2cb38d5a245542f5bfd18c43a08adc00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.123 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.123 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>]
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.123 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.incoming.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '679a0493-15fa-4cc6-bd03-81549b08191e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.123697', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '002b0cde-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': 'c01c339a6b35f1a197f2ebeb6739ee4f23d5d66a971f7708704e11d2503f87ef'}]}, 'timestamp': '2025-09-30 21:32:44.124032', '_unique_id': '94826039b4934394b9fec43b68af0a1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.124 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.125 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.125 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-2076746266>]
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.126 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5bd9444-a382-4b17-b904-881da0c57b8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.125982', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '002b6562-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': '289731a8560e23c4fdbda6deec6ddf2b33c12a3d524a004e2a52b220be2eb097'}]}, 'timestamp': '2025-09-30 21:32:44.126319', '_unique_id': 'c8740f4b494447a7b6f35c8b2e5cc452'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.127 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4a20d74-a8c1-4f35-88e5-6ffe803bd426', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.127892', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '002bb03a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': '0d4684545142aa92cc8e86270e5932c0df1349e0a40d345ecef71289de90db11'}]}, 'timestamp': '2025-09-30 21:32:44.128208', '_unique_id': 'f666dad9bb4e434da870e0b8a43cb04c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.129 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:32:44 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.153 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/memory.usage volume: 42.7890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d47ffe2-8090-41bc-8ab6-f1690f164459', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7890625, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'timestamp': '2025-09-30T21:32:44.129717', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '002fa6f4-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.885917117, 'message_signature': 'c4737aad2d81f58b2f46353d089ccb2c90dd76ab586a99d043197d0a83313a9b'}]}, 'timestamp': '2025-09-30 21:32:44.154403', '_unique_id': 'e47009ba2b7b4ecfb38df71785fb67b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.156 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.157 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.157 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/cpu volume: 12070000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '677c44ac-ad0a-455f-ba48-71b7dc53226a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12070000000, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'timestamp': '2025-09-30T21:32:44.157554', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '00303808-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.885917117, 'message_signature': '0606f8155e4ccc5828e5e7456ccea7dab3fb584e2a2d298004bdabf0472d4696'}]}, 'timestamp': '2025-09-30 21:32:44.157942', '_unique_id': 'c8b5abc8d48c4d9ea2a0af3d180a99fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.158 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.159 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.160 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a8445b5-082f-459c-ae04-3bf284351d0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-vda', 'timestamp': '2025-09-30T21:32:44.159675', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00308a56-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.802507976, 'message_signature': 'd5a9c04f8ac5830d985c25c60b1a63387af96fe4101d9424b63fc37382c35e16'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d-sda', 'timestamp': '2025-09-30T21:32:44.159675', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'instance-00000054', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003095e6-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.802507976, 'message_signature': '3ea609cfbe24c27991f5ca3843a2a4650265177be69fc5e09acc2096bbd6d28f'}]}, 'timestamp': '2025-09-30 21:32:44.160315', '_unique_id': '780f6c18f476479c963c735c7397db68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.162 12 DEBUG ceilometer.compute.pollsters [-] c783547f-0799-4e53-8cdc-8784800b3c2d/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '350ead4c-4284-4d95-8d4a-5e246f949532', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-00000054-c783547f-0799-4e53-8cdc-8784800b3c2d-tapd8844f4f-b4', 'timestamp': '2025-09-30T21:32:44.162016', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-2076746266', 'name': 'tapd8844f4f-b4', 'instance_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'instance_type': 'm1.nano', 'host': '39e1b0bfddcf677c4952ca29eff3c4dc08223464d721ccaa9eb8a821', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:18:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd8844f4f-b4'}, 'message_id': '0030e60e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4659.759265284, 'message_signature': '358cb2ca8c6f41888fe9b97de2e7b1dbe93088d943e70339cf2eec7bcd041ba1'}]}, 'timestamp': '2025-09-30 21:32:44.162424', '_unique_id': '5daa77c4541d47298f8e049c8a1a31bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:32:44.163 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-1 podman[234305]: 2025-09-30 21:32:44.243808197 +0000 UTC m=+0.069754072 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:32:44 compute-1 podman[234303]: 2025-09-30 21:32:44.248564166 +0000 UTC m=+0.085410416 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd)
Sep 30 21:32:44 compute-1 podman[234304]: 2025-09-30 21:32:44.276621056 +0000 UTC m=+0.108211554 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250923, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:32:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:32:45.989 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:32:45 compute-1 nova_compute[192795]: 2025-09-30 21:32:45.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:32:45.990 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.723 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.724 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.791 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.882 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.884 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:46 compute-1 nova_compute[192795]: 2025-09-30 21:32:46.961 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.167 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.170 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5505MB free_disk=73.35669326782227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.171 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.172 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.250 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance c783547f-0799-4e53-8cdc-8784800b3c2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.252 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.253 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.304 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.323 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.495 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.496 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:47 compute-1 nova_compute[192795]: 2025-09-30 21:32:47.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:48 compute-1 nova_compute[192795]: 2025-09-30 21:32:48.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:32:48.993 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:49 compute-1 podman[234375]: 2025-09-30 21:32:49.230841871 +0000 UTC m=+0.077711037 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:32:50 compute-1 nova_compute[192795]: 2025-09-30 21:32:50.496 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:50 compute-1 nova_compute[192795]: 2025-09-30 21:32:50.497 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:50 compute-1 nova_compute[192795]: 2025-09-30 21:32:50.497 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:32:50 compute-1 nova_compute[192795]: 2025-09-30 21:32:50.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:51 compute-1 nova_compute[192795]: 2025-09-30 21:32:51.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:52 compute-1 nova_compute[192795]: 2025-09-30 21:32:52.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:52 compute-1 nova_compute[192795]: 2025-09-30 21:32:52.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:53 compute-1 nova_compute[192795]: 2025-09-30 21:32:53.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:53 compute-1 nova_compute[192795]: 2025-09-30 21:32:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:55 compute-1 nova_compute[192795]: 2025-09-30 21:32:55.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:56 compute-1 podman[234397]: 2025-09-30 21:32:56.222565341 +0000 UTC m=+0.051561419 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:32:56 compute-1 podman[234396]: 2025-09-30 21:32:56.248247747 +0000 UTC m=+0.074506190 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:32:56 compute-1 podman[234395]: 2025-09-30 21:32:56.256232594 +0000 UTC m=+0.096603840 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:32:57 compute-1 nova_compute[192795]: 2025-09-30 21:32:57.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.881 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.881 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.882 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:32:58 compute-1 nova_compute[192795]: 2025-09-30 21:32:58.882 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.516 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.517 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.604 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.719 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.720 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.726 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.726 2 INFO nova.compute.claims [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.962 2 DEBUG nova.compute.provider_tree [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.974 2 DEBUG nova.scheduler.client.report [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.997 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:59 compute-1 nova_compute[192795]: 2025-09-30 21:32:59.997 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.082 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.083 2 DEBUG nova.network.neutron [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.106 2 INFO nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.127 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.247 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.248 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.249 2 INFO nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Creating image(s)
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.249 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "/var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.250 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "/var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.250 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "/var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.261 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.327 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.328 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.329 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.340 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.357 2 DEBUG nova.policy [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b62a036c63304f7ab5b0ba37a95b2b78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c6e533166cc44d4b61600c6c1270ee8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.360 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.385 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.386 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.386 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.392 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.393 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.426 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.427 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.427 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.485 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.486 2 DEBUG nova.virt.disk.api [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Checking if we can resize image /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.486 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.550 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.551 2 DEBUG nova.virt.disk.api [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Cannot resize image /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.552 2 DEBUG nova.objects.instance [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lazy-loading 'migration_context' on Instance uuid 272487f9-0986-45d4-bfde-ccdef8045c03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.569 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.570 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Ensure instance console log exists: /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.570 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.570 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.571 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:00 compute-1 nova_compute[192795]: 2025-09-30 21:33:00.954 2 DEBUG nova.network.neutron [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Successfully created port: 637d02e2-70ae-4b64-85d3-17b2c44bbee3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:33:01 compute-1 nova_compute[192795]: 2025-09-30 21:33:01.830 2 DEBUG nova.network.neutron [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Successfully updated port: 637d02e2-70ae-4b64-85d3-17b2c44bbee3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:33:01 compute-1 nova_compute[192795]: 2025-09-30 21:33:01.851 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "refresh_cache-272487f9-0986-45d4-bfde-ccdef8045c03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:01 compute-1 nova_compute[192795]: 2025-09-30 21:33:01.851 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquired lock "refresh_cache-272487f9-0986-45d4-bfde-ccdef8045c03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:01 compute-1 nova_compute[192795]: 2025-09-30 21:33:01.852 2 DEBUG nova.network.neutron [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.050 2 DEBUG nova.network.neutron [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.893 2 DEBUG nova.network.neutron [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Updating instance_info_cache with network_info: [{"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.925 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Releasing lock "refresh_cache-272487f9-0986-45d4-bfde-ccdef8045c03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.925 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Instance network_info: |[{"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.930 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Start _get_guest_xml network_info=[{"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.938 2 WARNING nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.948 2 DEBUG nova.virt.libvirt.host [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.949 2 DEBUG nova.virt.libvirt.host [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.953 2 DEBUG nova.virt.libvirt.host [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.955 2 DEBUG nova.virt.libvirt.host [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.957 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.958 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.959 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.960 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.960 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.961 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.962 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.962 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.963 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.963 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.964 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.964 2 DEBUG nova.virt.hardware [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.972 2 DEBUG nova.virt.libvirt.vif [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:32:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1145676151',display_name='tempest-ServerAddressesNegativeTestJSON-server-1145676151',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1145676151',id=91,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4c6e533166cc44d4b61600c6c1270ee8',ramdisk_id='',reservation_id='r-qfts054k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-77041409',owner_user_nam
e='tempest-ServerAddressesNegativeTestJSON-77041409-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:00Z,user_data=None,user_id='b62a036c63304f7ab5b0ba37a95b2b78',uuid=272487f9-0986-45d4-bfde-ccdef8045c03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.973 2 DEBUG nova.network.os_vif_util [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Converting VIF {"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.974 2 DEBUG nova.network.os_vif_util [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:c0:23,bridge_name='br-int',has_traffic_filtering=True,id=637d02e2-70ae-4b64-85d3-17b2c44bbee3,network=Network(a86de3ea-7915-4f9e-ad60-f5a230627203),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637d02e2-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.976 2 DEBUG nova.objects.instance [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 272487f9-0986-45d4-bfde-ccdef8045c03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:02 compute-1 nova_compute[192795]: 2025-09-30 21:33:02.997 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <uuid>272487f9-0986-45d4-bfde-ccdef8045c03</uuid>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <name>instance-0000005b</name>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerAddressesNegativeTestJSON-server-1145676151</nova:name>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:33:02</nova:creationTime>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:user uuid="b62a036c63304f7ab5b0ba37a95b2b78">tempest-ServerAddressesNegativeTestJSON-77041409-project-member</nova:user>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:project uuid="4c6e533166cc44d4b61600c6c1270ee8">tempest-ServerAddressesNegativeTestJSON-77041409</nova:project>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         <nova:port uuid="637d02e2-70ae-4b64-85d3-17b2c44bbee3">
Sep 30 21:33:02 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <system>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <entry name="serial">272487f9-0986-45d4-bfde-ccdef8045c03</entry>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <entry name="uuid">272487f9-0986-45d4-bfde-ccdef8045c03</entry>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     </system>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <os>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   </os>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <features>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   </features>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:33:02 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk.config"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:d7:c0:23"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <target dev="tap637d02e2-70"/>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:33:02 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:33:02 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/console.log" append="off"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <video>
Sep 30 21:33:03 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     </video>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:33:03 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:33:03 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:33:03 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:33:03 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:33:03 compute-1 nova_compute[192795]: </domain>
Sep 30 21:33:03 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.000 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Preparing to wait for external event network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.000 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.001 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.001 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.002 2 DEBUG nova.virt.libvirt.vif [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:32:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1145676151',display_name='tempest-ServerAddressesNegativeTestJSON-server-1145676151',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1145676151',id=91,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4c6e533166cc44d4b61600c6c1270ee8',ramdisk_id='',reservation_id='r-qfts054k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-77041409',owne
r_user_name='tempest-ServerAddressesNegativeTestJSON-77041409-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:00Z,user_data=None,user_id='b62a036c63304f7ab5b0ba37a95b2b78',uuid=272487f9-0986-45d4-bfde-ccdef8045c03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.002 2 DEBUG nova.network.os_vif_util [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Converting VIF {"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.003 2 DEBUG nova.network.os_vif_util [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:c0:23,bridge_name='br-int',has_traffic_filtering=True,id=637d02e2-70ae-4b64-85d3-17b2c44bbee3,network=Network(a86de3ea-7915-4f9e-ad60-f5a230627203),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637d02e2-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.004 2 DEBUG os_vif [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:c0:23,bridge_name='br-int',has_traffic_filtering=True,id=637d02e2-70ae-4b64-85d3-17b2c44bbee3,network=Network(a86de3ea-7915-4f9e-ad60-f5a230627203),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637d02e2-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.005 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.006 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.011 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap637d02e2-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.012 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap637d02e2-70, col_values=(('external_ids', {'iface-id': '637d02e2-70ae-4b64-85d3-17b2c44bbee3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:c0:23', 'vm-uuid': '272487f9-0986-45d4-bfde-ccdef8045c03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:03 compute-1 NetworkManager[51724]: <info>  [1759267983.0153] manager: (tap637d02e2-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.024 2 INFO os_vif [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:c0:23,bridge_name='br-int',has_traffic_filtering=True,id=637d02e2-70ae-4b64-85d3-17b2c44bbee3,network=Network(a86de3ea-7915-4f9e-ad60-f5a230627203),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637d02e2-70')
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.071 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.071 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.071 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] No VIF found with MAC fa:16:3e:d7:c0:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.072 2 INFO nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Using config drive
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.517 2 INFO nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Creating config drive at /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk.config
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.527 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptpekvf13 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.555 2 DEBUG nova.compute.manager [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received event network-changed-637d02e2-70ae-4b64-85d3-17b2c44bbee3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.556 2 DEBUG nova.compute.manager [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Refreshing instance network info cache due to event network-changed-637d02e2-70ae-4b64-85d3-17b2c44bbee3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.556 2 DEBUG oslo_concurrency.lockutils [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-272487f9-0986-45d4-bfde-ccdef8045c03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.556 2 DEBUG oslo_concurrency.lockutils [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-272487f9-0986-45d4-bfde-ccdef8045c03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.557 2 DEBUG nova.network.neutron [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Refreshing network info cache for port 637d02e2-70ae-4b64-85d3-17b2c44bbee3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.657 2 DEBUG oslo_concurrency.processutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptpekvf13" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:03 compute-1 kernel: tap637d02e2-70: entered promiscuous mode
Sep 30 21:33:03 compute-1 NetworkManager[51724]: <info>  [1759267983.7276] manager: (tap637d02e2-70): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Sep 30 21:33:03 compute-1 ovn_controller[94902]: 2025-09-30T21:33:03Z|00330|binding|INFO|Claiming lport 637d02e2-70ae-4b64-85d3-17b2c44bbee3 for this chassis.
Sep 30 21:33:03 compute-1 ovn_controller[94902]: 2025-09-30T21:33:03Z|00331|binding|INFO|637d02e2-70ae-4b64-85d3-17b2c44bbee3: Claiming fa:16:3e:d7:c0:23 10.100.0.13
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.736 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:c0:23 10.100.0.13'], port_security=['fa:16:3e:d7:c0:23 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '272487f9-0986-45d4-bfde-ccdef8045c03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a86de3ea-7915-4f9e-ad60-f5a230627203', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e533166cc44d4b61600c6c1270ee8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ee6fecf-08ff-4540-83da-73a10028a93a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1eadc59-3afe-4792-9b96-33234afd2028, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=637d02e2-70ae-4b64-85d3-17b2c44bbee3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.737 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 637d02e2-70ae-4b64-85d3-17b2c44bbee3 in datapath a86de3ea-7915-4f9e-ad60-f5a230627203 bound to our chassis
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.739 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a86de3ea-7915-4f9e-ad60-f5a230627203
Sep 30 21:33:03 compute-1 ovn_controller[94902]: 2025-09-30T21:33:03Z|00332|binding|INFO|Setting lport 637d02e2-70ae-4b64-85d3-17b2c44bbee3 ovn-installed in OVS
Sep 30 21:33:03 compute-1 ovn_controller[94902]: 2025-09-30T21:33:03Z|00333|binding|INFO|Setting lport 637d02e2-70ae-4b64-85d3-17b2c44bbee3 up in Southbound
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.753 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[36fb985e-938f-49b3-8bae-993261469405]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.754 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa86de3ea-71 in ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.755 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa86de3ea-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.755 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc35471-4a6e-4b07-8f41-f34d424faa05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.756 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0078762b-22c9-4706-9c9a-9b7b8a2b1dd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 systemd-udevd[234493]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.768 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ef5583-c55e-4a8a-941e-5b4cd91457e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 nova_compute[192795]: 2025-09-30 21:33:03.774 2 DEBUG nova.compute.manager [None req-7c2ebaae-b69e-4b6a-805e-7983230a2929 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Sep 30 21:33:03 compute-1 systemd-machined[152783]: New machine qemu-42-instance-0000005b.
Sep 30 21:33:03 compute-1 NetworkManager[51724]: <info>  [1759267983.7841] device (tap637d02e2-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:33:03 compute-1 NetworkManager[51724]: <info>  [1759267983.7850] device (tap637d02e2-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.789 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c60987-674e-4fc7-9559-183b291f3e88]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 systemd[1]: Started Virtual Machine qemu-42-instance-0000005b.
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.825 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[93dc2ea8-65b9-4363-a398-e88ef0b4920d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 systemd-udevd[234497]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:33:03 compute-1 NetworkManager[51724]: <info>  [1759267983.8338] manager: (tapa86de3ea-70): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.832 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1dd026-cb00-47c2-82b5-2e5a7b3ecaec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.865 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[74847e26-bf47-4000-8e5a-d73333faea28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.870 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7821b1-e542-4aa5-9b00-2146de29ba21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 NetworkManager[51724]: <info>  [1759267983.8964] device (tapa86de3ea-70): carrier: link connected
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.901 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccff4be-5941-4807-be80-0724f6133845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.923 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[386a870f-0e4e-4281-be66-22830dcb30d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa86de3ea-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:23:c9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467956, 'reachable_time': 31489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234525, 'error': None, 'target': 'ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.942 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a06a9695-73be-43e4-b1f3-056ab6625596]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:23c9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467956, 'tstamp': 467956}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234526, 'error': None, 'target': 'ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:03 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:03.964 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e377d8ba-7dab-48cc-a63e-e61182bf7c00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa86de3ea-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:23:c9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467956, 'reachable_time': 31489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234529, 'error': None, 'target': 'ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.003 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5e42b6f6-cc27-4649-bb62-69e8c895cad0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.093 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f97545bd-dc6c-4060-bba6-207d00ecd81a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.095 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa86de3ea-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.095 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.096 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa86de3ea-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:04 compute-1 NetworkManager[51724]: <info>  [1759267984.0990] manager: (tapa86de3ea-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Sep 30 21:33:04 compute-1 kernel: tapa86de3ea-70: entered promiscuous mode
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.104 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa86de3ea-70, col_values=(('external_ids', {'iface-id': '3d8c6c50-ca08-4e68-b0cf-49048868fcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:04 compute-1 ovn_controller[94902]: 2025-09-30T21:33:04Z|00334|binding|INFO|Releasing lport 3d8c6c50-ca08-4e68-b0cf-49048868fcdd from this chassis (sb_readonly=0)
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.109 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a86de3ea-7915-4f9e-ad60-f5a230627203.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a86de3ea-7915-4f9e-ad60-f5a230627203.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.110 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[13e7bbfe-644d-47bc-bd11-55d47dd3047b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.111 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-a86de3ea-7915-4f9e-ad60-f5a230627203
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/a86de3ea-7915-4f9e-ad60-f5a230627203.pid.haproxy
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID a86de3ea-7915-4f9e-ad60-f5a230627203
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:33:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:04.111 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203', 'env', 'PROCESS_TAG=haproxy-a86de3ea-7915-4f9e-ad60-f5a230627203', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a86de3ea-7915-4f9e-ad60-f5a230627203.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.422 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267984.4213717, 272487f9-0986-45d4-bfde-ccdef8045c03 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.423 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] VM Started (Lifecycle Event)
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.472 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.478 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267984.422788, 272487f9-0986-45d4-bfde-ccdef8045c03 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.478 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] VM Paused (Lifecycle Event)
Sep 30 21:33:04 compute-1 podman[234566]: 2025-09-30 21:33:04.510544415 +0000 UTC m=+0.053058760 container create 4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.547 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.551 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:04 compute-1 systemd[1]: Started libpod-conmon-4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09.scope.
Sep 30 21:33:04 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:33:04 compute-1 podman[234566]: 2025-09-30 21:33:04.481207389 +0000 UTC m=+0.023721765 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:33:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/033cd23b523ae74c03d6a78ec4f54615af84addaa38f4078659f30c8fda96cc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:33:04 compute-1 podman[234566]: 2025-09-30 21:33:04.592508377 +0000 UTC m=+0.135022752 container init 4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:33:04 compute-1 podman[234566]: 2025-09-30 21:33:04.597369368 +0000 UTC m=+0.139883723 container start 4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:33:04 compute-1 nova_compute[192795]: 2025-09-30 21:33:04.606 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:33:04 compute-1 neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203[234582]: [NOTICE]   (234586) : New worker (234588) forked
Sep 30 21:33:04 compute-1 neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203[234582]: [NOTICE]   (234586) : Loading success.
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.665 2 DEBUG nova.compute.manager [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received event network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.665 2 DEBUG oslo_concurrency.lockutils [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.666 2 DEBUG oslo_concurrency.lockutils [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.666 2 DEBUG oslo_concurrency.lockutils [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.667 2 DEBUG nova.compute.manager [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Processing event network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.667 2 DEBUG nova.compute.manager [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received event network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.668 2 DEBUG oslo_concurrency.lockutils [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.668 2 DEBUG oslo_concurrency.lockutils [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.669 2 DEBUG oslo_concurrency.lockutils [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.669 2 DEBUG nova.compute.manager [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] No waiting events found dispatching network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.670 2 WARNING nova.compute.manager [req-e1c79a03-79a5-47af-8d67-e5ee1994b3e9 req-9a2817d4-1379-40e3-b04e-f79b281ddabe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received unexpected event network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 for instance with vm_state building and task_state spawning.
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.671 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.676 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759267985.675731, 272487f9-0986-45d4-bfde-ccdef8045c03 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.676 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] VM Resumed (Lifecycle Event)
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.680 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.684 2 INFO nova.virt.libvirt.driver [-] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Instance spawned successfully.
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.684 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.716 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.724 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.728 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.729 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.730 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.731 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.732 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.732 2 DEBUG nova.virt.libvirt.driver [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.760 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.911 2 DEBUG nova.network.neutron [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Updated VIF entry in instance network info cache for port 637d02e2-70ae-4b64-85d3-17b2c44bbee3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.912 2 DEBUG nova.network.neutron [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Updating instance_info_cache with network_info: [{"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.914 2 INFO nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Took 5.67 seconds to spawn the instance on the hypervisor.
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.914 2 DEBUG nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:05 compute-1 nova_compute[192795]: 2025-09-30 21:33:05.962 2 DEBUG oslo_concurrency.lockutils [req-06993e90-60a0-42d7-9654-7505667a7a03 req-5ddfc25a-c21b-4dcf-a11f-8794470044bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-272487f9-0986-45d4-bfde-ccdef8045c03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:06 compute-1 nova_compute[192795]: 2025-09-30 21:33:06.172 2 INFO nova.compute.manager [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Took 6.49 seconds to build instance.
Sep 30 21:33:06 compute-1 nova_compute[192795]: 2025-09-30 21:33:06.226 2 DEBUG oslo_concurrency.lockutils [None req-79c66f3f-43eb-42f3-bc48-a2bf78ed27a4 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:07 compute-1 podman[234597]: 2025-09-30 21:33:07.236108131 +0000 UTC m=+0.074951223 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.647 2 DEBUG oslo_concurrency.lockutils [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.648 2 DEBUG oslo_concurrency.lockutils [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.648 2 DEBUG nova.compute.manager [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.677 2 DEBUG nova.compute.manager [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.678 2 DEBUG nova.objects.instance [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'flavor' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.708 2 DEBUG nova.objects.instance [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'info_cache' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:07 compute-1 nova_compute[192795]: 2025-09-30 21:33:07.742 2 DEBUG nova.virt.libvirt.driver [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:33:08 compute-1 nova_compute[192795]: 2025-09-30 21:33:08.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.594 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.595 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.596 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.596 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.597 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.611 2 INFO nova.compute.manager [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Terminating instance
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.624 2 DEBUG nova.compute.manager [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:33:09 compute-1 kernel: tap637d02e2-70 (unregistering): left promiscuous mode
Sep 30 21:33:09 compute-1 NetworkManager[51724]: <info>  [1759267989.6450] device (tap637d02e2-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 ovn_controller[94902]: 2025-09-30T21:33:09Z|00335|binding|INFO|Releasing lport 637d02e2-70ae-4b64-85d3-17b2c44bbee3 from this chassis (sb_readonly=0)
Sep 30 21:33:09 compute-1 ovn_controller[94902]: 2025-09-30T21:33:09Z|00336|binding|INFO|Setting lport 637d02e2-70ae-4b64-85d3-17b2c44bbee3 down in Southbound
Sep 30 21:33:09 compute-1 ovn_controller[94902]: 2025-09-30T21:33:09Z|00337|binding|INFO|Removing iface tap637d02e2-70 ovn-installed in OVS
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:09.673 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:c0:23 10.100.0.13'], port_security=['fa:16:3e:d7:c0:23 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '272487f9-0986-45d4-bfde-ccdef8045c03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a86de3ea-7915-4f9e-ad60-f5a230627203', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e533166cc44d4b61600c6c1270ee8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ee6fecf-08ff-4540-83da-73a10028a93a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1eadc59-3afe-4792-9b96-33234afd2028, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=637d02e2-70ae-4b64-85d3-17b2c44bbee3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:09.677 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 637d02e2-70ae-4b64-85d3-17b2c44bbee3 in datapath a86de3ea-7915-4f9e-ad60-f5a230627203 unbound from our chassis
Sep 30 21:33:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:09.681 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a86de3ea-7915-4f9e-ad60-f5a230627203, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:33:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:09.683 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[77ac4913-a341-4d24-812e-aab49c841a1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:09.684 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203 namespace which is not needed anymore
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Sep 30 21:33:09 compute-1 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005b.scope: Consumed 4.516s CPU time.
Sep 30 21:33:09 compute-1 systemd-machined[152783]: Machine qemu-42-instance-0000005b terminated.
Sep 30 21:33:09 compute-1 neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203[234582]: [NOTICE]   (234586) : haproxy version is 2.8.14-c23fe91
Sep 30 21:33:09 compute-1 neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203[234582]: [NOTICE]   (234586) : path to executable is /usr/sbin/haproxy
Sep 30 21:33:09 compute-1 neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203[234582]: [WARNING]  (234586) : Exiting Master process...
Sep 30 21:33:09 compute-1 neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203[234582]: [ALERT]    (234586) : Current worker (234588) exited with code 143 (Terminated)
Sep 30 21:33:09 compute-1 neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203[234582]: [WARNING]  (234586) : All workers exited. Exiting... (0)
Sep 30 21:33:09 compute-1 systemd[1]: libpod-4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09.scope: Deactivated successfully.
Sep 30 21:33:09 compute-1 conmon[234582]: conmon 4098ca6ed386dbd90331 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09.scope/container/memory.events
Sep 30 21:33:09 compute-1 podman[234650]: 2025-09-30 21:33:09.855345226 +0000 UTC m=+0.057234933 container died 4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:33:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-033cd23b523ae74c03d6a78ec4f54615af84addaa38f4078659f30c8fda96cc3-merged.mount: Deactivated successfully.
Sep 30 21:33:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09-userdata-shm.mount: Deactivated successfully.
Sep 30 21:33:09 compute-1 podman[234650]: 2025-09-30 21:33:09.900999092 +0000 UTC m=+0.102888799 container cleanup 4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.910 2 DEBUG nova.compute.manager [req-17831d1a-16f7-443e-937d-a68605451975 req-c6dc9cf4-9572-415c-a911-07940f7ca02b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received event network-vif-unplugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.911 2 DEBUG oslo_concurrency.lockutils [req-17831d1a-16f7-443e-937d-a68605451975 req-c6dc9cf4-9572-415c-a911-07940f7ca02b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.912 2 DEBUG oslo_concurrency.lockutils [req-17831d1a-16f7-443e-937d-a68605451975 req-c6dc9cf4-9572-415c-a911-07940f7ca02b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.912 2 DEBUG oslo_concurrency.lockutils [req-17831d1a-16f7-443e-937d-a68605451975 req-c6dc9cf4-9572-415c-a911-07940f7ca02b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.913 2 DEBUG nova.compute.manager [req-17831d1a-16f7-443e-937d-a68605451975 req-c6dc9cf4-9572-415c-a911-07940f7ca02b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] No waiting events found dispatching network-vif-unplugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.913 2 DEBUG nova.compute.manager [req-17831d1a-16f7-443e-937d-a68605451975 req-c6dc9cf4-9572-415c-a911-07940f7ca02b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received event network-vif-unplugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.919 2 INFO nova.virt.libvirt.driver [-] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Instance destroyed successfully.
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.920 2 DEBUG nova.objects.instance [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lazy-loading 'resources' on Instance uuid 272487f9-0986-45d4-bfde-ccdef8045c03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:09 compute-1 systemd[1]: libpod-conmon-4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09.scope: Deactivated successfully.
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.936 2 DEBUG nova.virt.libvirt.vif [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:32:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1145676151',display_name='tempest-ServerAddressesNegativeTestJSON-server-1145676151',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1145676151',id=91,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4c6e533166cc44d4b61600c6c1270ee8',ramdisk_id='',reservation_id='r-qfts054k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-77041409',owner_user_name='tempest-ServerAddressesNegativeTestJSON-77041409-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:33:05Z,user_data=None,user_id='b62a036c63304f7ab5b0ba37a95b2b78',uuid=272487f9-0986-45d4-bfde-ccdef8045c03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.936 2 DEBUG nova.network.os_vif_util [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Converting VIF {"id": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "address": "fa:16:3e:d7:c0:23", "network": {"id": "a86de3ea-7915-4f9e-ad60-f5a230627203", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-345567387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e533166cc44d4b61600c6c1270ee8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637d02e2-70", "ovs_interfaceid": "637d02e2-70ae-4b64-85d3-17b2c44bbee3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.937 2 DEBUG nova.network.os_vif_util [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:c0:23,bridge_name='br-int',has_traffic_filtering=True,id=637d02e2-70ae-4b64-85d3-17b2c44bbee3,network=Network(a86de3ea-7915-4f9e-ad60-f5a230627203),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637d02e2-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.937 2 DEBUG os_vif [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:c0:23,bridge_name='br-int',has_traffic_filtering=True,id=637d02e2-70ae-4b64-85d3-17b2c44bbee3,network=Network(a86de3ea-7915-4f9e-ad60-f5a230627203),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637d02e2-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.939 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap637d02e2-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 kernel: tapd8844f4f-b4 (unregistering): left promiscuous mode
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.951 2 INFO os_vif [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:c0:23,bridge_name='br-int',has_traffic_filtering=True,id=637d02e2-70ae-4b64-85d3-17b2c44bbee3,network=Network(a86de3ea-7915-4f9e-ad60-f5a230627203),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637d02e2-70')
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.952 2 INFO nova.virt.libvirt.driver [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Deleting instance files /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03_del
Sep 30 21:33:09 compute-1 NetworkManager[51724]: <info>  [1759267989.9526] device (tapd8844f4f-b4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.953 2 INFO nova.virt.libvirt.driver [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Deletion of /var/lib/nova/instances/272487f9-0986-45d4-bfde-ccdef8045c03_del complete
Sep 30 21:33:09 compute-1 ovn_controller[94902]: 2025-09-30T21:33:09Z|00338|binding|INFO|Releasing lport d8844f4f-b484-4605-8f20-0bb8b7a50471 from this chassis (sb_readonly=0)
Sep 30 21:33:09 compute-1 ovn_controller[94902]: 2025-09-30T21:33:09Z|00339|binding|INFO|Setting lport d8844f4f-b484-4605-8f20-0bb8b7a50471 down in Southbound
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 ovn_controller[94902]: 2025-09-30T21:33:09Z|00340|binding|INFO|Removing iface tapd8844f4f-b4 ovn-installed in OVS
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 nova_compute[192795]: 2025-09-30 21:33:09.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:09.982 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:18:3c 10.100.0.5'], port_security=['fa:16:3e:84:18:3c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d444175-4bc9-45e9-8b74-e4555ef5d88b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d8844f4f-b484-4605-8f20-0bb8b7a50471) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:10 compute-1 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000054.scope: Deactivated successfully.
Sep 30 21:33:10 compute-1 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000054.scope: Consumed 16.622s CPU time.
Sep 30 21:33:10 compute-1 podman[234698]: 2025-09-30 21:33:10.004820516 +0000 UTC m=+0.070897682 container remove 4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:33:10 compute-1 systemd-machined[152783]: Machine qemu-41-instance-00000054 terminated.
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.013 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[53b4e667-18b7-4bea-9bb2-7bc182d7db55]: (4, ('Tue Sep 30 09:33:09 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203 (4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09)\n4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09\nTue Sep 30 09:33:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203 (4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09)\n4098ca6ed386dbd90331b6e2e882c17bb63866b8cb5a26248836aecbddd2db09\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.016 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4b843588-5cae-4ed2-bba2-cf1e45ed8b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.018 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa86de3ea-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:10 compute-1 kernel: tapa86de3ea-70: left promiscuous mode
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.040 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3495af43-7472-4ac8-8901-f4d7ebaebf8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.066 2 INFO nova.compute.manager [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Took 0.44 seconds to destroy the instance on the hypervisor.
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.066 2 DEBUG oslo.service.loopingcall [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.066 2 DEBUG nova.compute.manager [-] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.066 2 DEBUG nova.network.neutron [-] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.068 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[339b74dd-2cbb-4168-959f-1f7cc10225e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.072 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[56a9f6b4-6d6f-4b41-8f0d-0956d4a8b842]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.093 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[216c2f9f-c1cf-4210-a091-15d48215a9dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467949, 'reachable_time': 30601, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234719, 'error': None, 'target': 'ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 systemd[1]: run-netns-ovnmeta\x2da86de3ea\x2d7915\x2d4f9e\x2dad60\x2df5a230627203.mount: Deactivated successfully.
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.097 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a86de3ea-7915-4f9e-ad60-f5a230627203 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.097 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[acd81a3a-6472-419c-845f-6554e6fcb2ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.098 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d8844f4f-b484-4605-8f20-0bb8b7a50471 in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 unbound from our chassis
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.099 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.101 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[24ca875f-7ed3-4cd9-ab41-07d4a71624c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.101 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace which is not needed anymore
Sep 30 21:33:10 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [NOTICE]   (233995) : haproxy version is 2.8.14-c23fe91
Sep 30 21:33:10 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [NOTICE]   (233995) : path to executable is /usr/sbin/haproxy
Sep 30 21:33:10 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [WARNING]  (233995) : Exiting Master process...
Sep 30 21:33:10 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [WARNING]  (233995) : Exiting Master process...
Sep 30 21:33:10 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [ALERT]    (233995) : Current worker (233997) exited with code 143 (Terminated)
Sep 30 21:33:10 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233991]: [WARNING]  (233995) : All workers exited. Exiting... (0)
Sep 30 21:33:10 compute-1 systemd[1]: libpod-252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9.scope: Deactivated successfully.
Sep 30 21:33:10 compute-1 podman[234743]: 2025-09-30 21:33:10.263136699 +0000 UTC m=+0.055172277 container died 252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:33:10 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9-userdata-shm.mount: Deactivated successfully.
Sep 30 21:33:10 compute-1 podman[234743]: 2025-09-30 21:33:10.299157135 +0000 UTC m=+0.091192683 container cleanup 252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:33:10 compute-1 systemd[1]: var-lib-containers-storage-overlay-9932ef3db0aa37a4a2c882386dfd08d43666d58cf8f2fb1437ae89c0001ce061-merged.mount: Deactivated successfully.
Sep 30 21:33:10 compute-1 systemd[1]: libpod-conmon-252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9.scope: Deactivated successfully.
Sep 30 21:33:10 compute-1 podman[234782]: 2025-09-30 21:33:10.360544558 +0000 UTC m=+0.039728427 container remove 252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.367 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[99767507-1856-4d9a-8937-914b684cc55b]: (4, ('Tue Sep 30 09:33:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9)\n252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9\nTue Sep 30 09:33:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9)\n252a0a3add0494912e4f24f94c3aa44df487339ffd23780030d6b064fb5c79e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.370 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9b22338a-e2e5-43b0-a10f-40389c908fc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.371 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:10 compute-1 kernel: tap91c84c55-90: left promiscuous mode
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.391 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7861d935-9b68-45a8-9eff-1abda7e83071]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.427 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3962a106-1b4e-4936-bb75-c8b7c9bc4623]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.430 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b140dafc-d303-443d-8211-5ad3e01a092b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.449 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[480daf2f-554e-461f-ae1b-8f174dcda8c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461013, 'reachable_time': 25328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234801, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.451 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:33:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:10.452 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[88a33bc4-6e43-450e-93c8-3522802f317c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.764 2 INFO nova.virt.libvirt.driver [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance shutdown successfully after 3 seconds.
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.769 2 INFO nova.virt.libvirt.driver [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance destroyed successfully.
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.770 2 DEBUG nova.objects.instance [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'numa_topology' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.786 2 DEBUG nova.compute.manager [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:10 compute-1 nova_compute[192795]: 2025-09-30 21:33:10.875 2 DEBUG oslo_concurrency.lockutils [None req-d3432099-3356-47b2-954c-833b349dc377 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:10 compute-1 systemd[1]: run-netns-ovnmeta\x2d91c84c55\x2d96ab\x2d4682\x2da6e7\x2d9e96514ca8a5.mount: Deactivated successfully.
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.154 2 DEBUG nova.network.neutron [-] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.182 2 INFO nova.compute.manager [-] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Took 1.12 seconds to deallocate network for instance.
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.261 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.261 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.337 2 DEBUG nova.compute.provider_tree [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.371 2 DEBUG nova.scheduler.client.report [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.547 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:11 compute-1 nova_compute[192795]: 2025-09-30 21:33:11.603 2 INFO nova.scheduler.client.report [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Deleted allocations for instance 272487f9-0986-45d4-bfde-ccdef8045c03
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.132 2 DEBUG nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received event network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.133 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.133 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.133 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.133 2 DEBUG nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] No waiting events found dispatching network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.133 2 WARNING nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received unexpected event network-vif-plugged-637d02e2-70ae-4b64-85d3-17b2c44bbee3 for instance with vm_state deleted and task_state None.
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.134 2 DEBUG nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-vif-unplugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.134 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.134 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.134 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.134 2 DEBUG nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] No waiting events found dispatching network-vif-unplugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.135 2 WARNING nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received unexpected event network-vif-unplugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 for instance with vm_state stopped and task_state None.
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.135 2 DEBUG nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.135 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.135 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.135 2 DEBUG oslo_concurrency.lockutils [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.136 2 DEBUG nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] No waiting events found dispatching network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.136 2 WARNING nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received unexpected event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 for instance with vm_state stopped and task_state None.
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.136 2 DEBUG nova.compute.manager [req-35711357-931b-495c-a7fc-561c53732e4d req-b1a72cb4-9f16-4922-8796-876a40d4bd76 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Received event network-vif-deleted-637d02e2-70ae-4b64-85d3-17b2c44bbee3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.136 2 DEBUG oslo_concurrency.lockutils [None req-d577c8d3-8cb0-47fb-bf67-52aa2565a922 b62a036c63304f7ab5b0ba37a95b2b78 4c6e533166cc44d4b61600c6c1270ee8 - - default default] Lock "272487f9-0986-45d4-bfde-ccdef8045c03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:12 compute-1 nova_compute[192795]: 2025-09-30 21:33:12.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:14 compute-1 nova_compute[192795]: 2025-09-30 21:33:14.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.206 2 DEBUG nova.compute.manager [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Stashing vm_state: stopped _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:33:15 compute-1 podman[234804]: 2025-09-30 21:33:15.241362163 +0000 UTC m=+0.074245483 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:33:15 compute-1 podman[234802]: 2025-09-30 21:33:15.249593927 +0000 UTC m=+0.080875963 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:33:15 compute-1 podman[234803]: 2025-09-30 21:33:15.284862552 +0000 UTC m=+0.124797463 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.293 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.294 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.313 2 DEBUG nova.objects.instance [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'pci_requests' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.329 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.330 2 INFO nova.compute.claims [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.330 2 DEBUG nova.objects.instance [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'resources' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.341 2 DEBUG nova.objects.instance [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'pci_devices' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.425 2 INFO nova.compute.resource_tracker [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating resource usage from migration 8e6864c3-7e97-476b-8041-2ab94162d7e1
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.501 2 DEBUG nova.compute.provider_tree [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.515 2 DEBUG nova.scheduler.client.report [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.531 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.531 2 INFO nova.compute.manager [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Migrating
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.572 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.573 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.573 2 DEBUG nova.network.neutron [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:33:15 compute-1 nova_compute[192795]: 2025-09-30 21:33:15.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:16 compute-1 nova_compute[192795]: 2025-09-30 21:33:16.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:16 compute-1 nova_compute[192795]: 2025-09-30 21:33:16.886 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:16 compute-1 nova_compute[192795]: 2025-09-30 21:33:16.886 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:16 compute-1 nova_compute[192795]: 2025-09-30 21:33:16.914 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:33:16 compute-1 nova_compute[192795]: 2025-09-30 21:33:16.989 2 DEBUG nova.network.neutron [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.014 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.018 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.018 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.025 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.025 2 INFO nova.compute.claims [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.164 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.173 2 INFO nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance already shutdown.
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.180 2 INFO nova.virt.libvirt.driver [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance destroyed successfully.
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.181 2 DEBUG nova.virt.libvirt.vif [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:31:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:33:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1696557468-network", "vif_mac": "fa:16:3e:84:18:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.182 2 DEBUG nova.network.os_vif_util [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1696557468-network", "vif_mac": "fa:16:3e:84:18:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.183 2 DEBUG nova.network.os_vif_util [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.184 2 DEBUG os_vif [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.187 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8844f4f-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.200 2 INFO os_vif [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4')
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.206 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.263 2 DEBUG nova.compute.provider_tree [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.278 2 DEBUG nova.scheduler.client.report [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.302 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.303 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.309 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.310 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.367 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.368 2 DEBUG nova.network.neutron [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.394 2 INFO nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.397 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.399 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_resize/disk /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.423 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.442 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "cp -r /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_resize/disk /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.444 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_resize/disk.config /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.478 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "cp -r /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_resize/disk.config /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.479 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_resize/disk.info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.512 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "cp -r /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_resize/disk.info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.576 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.578 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.579 2 INFO nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Creating image(s)
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.580 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.581 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.582 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.606 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.643 2 DEBUG nova.policy [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c27a02706b9d43ffaab2c5fa833fec04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b33d27d5088343569f4459643d0da580', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.704 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.705 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.706 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.716 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.756 2 DEBUG nova.network.neutron [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Port d8844f4f-b484-4605-8f20-0bb8b7a50471 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.800 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.802 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.877 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk 1073741824" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.881 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.882 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.940 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.941 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.941 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.948 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.949 2 DEBUG nova.virt.disk.api [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Checking if we can resize image /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:33:17 compute-1 nova_compute[192795]: 2025-09-30 21:33:17.950 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.032 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.034 2 DEBUG nova.virt.disk.api [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Cannot resize image /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.034 2 DEBUG nova.objects.instance [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'migration_context' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.058 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.058 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Ensure instance console log exists: /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.059 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.060 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.061 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.324 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.324 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.324 2 DEBUG nova.network.neutron [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:33:18 compute-1 nova_compute[192795]: 2025-09-30 21:33:18.337 2 DEBUG nova.network.neutron [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Successfully created port: 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.269 2 DEBUG nova.network.neutron [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Successfully updated port: 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.286 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "refresh_cache-210f7950-671f-44dd-8721-fac4227dd74b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.287 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquired lock "refresh_cache-210f7950-671f-44dd-8721-fac4227dd74b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.287 2 DEBUG nova.network.neutron [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.532 2 DEBUG nova.compute.manager [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-changed-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.533 2 DEBUG nova.compute.manager [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Refreshing instance network info cache due to event network-changed-01ee41cf-cc1d-4380-8c0b-59da2846c8f9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.533 2 DEBUG oslo_concurrency.lockutils [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-210f7950-671f-44dd-8721-fac4227dd74b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:19 compute-1 nova_compute[192795]: 2025-09-30 21:33:19.557 2 DEBUG nova.network.neutron [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:33:20 compute-1 podman[234894]: 2025-09-30 21:33:20.238564702 +0000 UTC m=+0.076199876 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.269 2 DEBUG nova.network.neutron [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.296 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.394 2 DEBUG nova.network.neutron [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Updating instance_info_cache with network_info: [{"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.431 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Releasing lock "refresh_cache-210f7950-671f-44dd-8721-fac4227dd74b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.432 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance network_info: |[{"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.432 2 DEBUG oslo_concurrency.lockutils [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-210f7950-671f-44dd-8721-fac4227dd74b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.432 2 DEBUG nova.network.neutron [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Refreshing network info cache for port 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.436 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Start _get_guest_xml network_info=[{"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.449 2 WARNING nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.458 2 DEBUG nova.virt.libvirt.host [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.458 2 DEBUG nova.virt.libvirt.host [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.460 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.462 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.462 2 INFO nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Creating image(s)
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.463 2 DEBUG nova.objects.instance [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'trusted_certs' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.468 2 DEBUG nova.virt.libvirt.host [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.469 2 DEBUG nova.virt.libvirt.host [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.470 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.470 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.471 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.471 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.471 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.472 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.472 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.472 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.473 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.473 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.473 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.473 2 DEBUG nova.virt.hardware [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.478 2 DEBUG nova.virt.libvirt.vif [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-139248646',display_name='tempest-tempest.common.compute-instance-139248646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-139248646',id=92,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-uwei7ucy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActionsTestOtherA-553860193-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:17Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=210f7950-671f-44dd-8721-fac4227dd74b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.478 2 DEBUG nova.network.os_vif_util [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.479 2 DEBUG nova.network.os_vif_util [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.480 2 DEBUG nova.objects.instance [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'pci_devices' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.482 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.508 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <uuid>210f7950-671f-44dd-8721-fac4227dd74b</uuid>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <name>instance-0000005c</name>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:name>tempest-tempest.common.compute-instance-139248646</nova:name>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:33:20</nova:creationTime>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:user uuid="c27a02706b9d43ffaab2c5fa833fec04">tempest-ServerActionsTestOtherA-553860193-project-member</nova:user>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:project uuid="b33d27d5088343569f4459643d0da580">tempest-ServerActionsTestOtherA-553860193</nova:project>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:port uuid="01ee41cf-cc1d-4380-8c0b-59da2846c8f9">
Sep 30 21:33:20 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <system>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="serial">210f7950-671f-44dd-8721-fac4227dd74b</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="uuid">210f7950-671f-44dd-8721-fac4227dd74b</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </system>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <os>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </os>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <features>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </features>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:b8:bd:70"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <target dev="tap01ee41cf-cc"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/console.log" append="off"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <video>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </video>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:33:20 compute-1 nova_compute[192795]: </domain>
Sep 30 21:33:20 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.511 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Preparing to wait for external event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.512 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.513 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.514 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.516 2 DEBUG nova.virt.libvirt.vif [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-139248646',display_name='tempest-tempest.common.compute-instance-139248646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-139248646',id=92,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-uwei7ucy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActionsTestOtherA-553860193-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:17Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=210f7950-671f-44dd-8721-fac4227dd74b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.517 2 DEBUG nova.network.os_vif_util [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.519 2 DEBUG nova.network.os_vif_util [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.519 2 DEBUG os_vif [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.561 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01ee41cf-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.570 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01ee41cf-cc, col_values=(('external_ids', {'iface-id': '01ee41cf-cc1d-4380-8c0b-59da2846c8f9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:bd:70', 'vm-uuid': '210f7950-671f-44dd-8721-fac4227dd74b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 NetworkManager[51724]: <info>  [1759268000.5734] manager: (tap01ee41cf-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.579 2 INFO os_vif [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc')
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.600 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.601 2 DEBUG nova.virt.disk.api [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Checking if we can resize image /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.601 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.666 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.667 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.667 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No VIF found with MAC fa:16:3e:b8:bd:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.668 2 INFO nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Using config drive
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.680 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.680 2 DEBUG nova.virt.disk.api [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Cannot resize image /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.696 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.696 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Ensure instance console log exists: /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.697 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.697 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.698 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.701 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Start _get_guest_xml network_info=[{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1696557468-network", "vif_mac": "fa:16:3e:84:18:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.706 2 WARNING nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.710 2 DEBUG nova.virt.libvirt.host [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.710 2 DEBUG nova.virt.libvirt.host [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.713 2 DEBUG nova.virt.libvirt.host [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.714 2 DEBUG nova.virt.libvirt.host [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.715 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.715 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.716 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.716 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.716 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.716 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.716 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.717 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.717 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.717 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.717 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.717 2 DEBUG nova.virt.hardware [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.718 2 DEBUG nova.objects.instance [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'vcpu_model' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.734 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.811 2 DEBUG oslo_concurrency.processutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.812 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.813 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.814 2 DEBUG oslo_concurrency.lockutils [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.815 2 DEBUG nova.virt.libvirt.vif [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:31:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1696557468-network", "vif_mac": "fa:16:3e:84:18:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.816 2 DEBUG nova.network.os_vif_util [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1696557468-network", "vif_mac": "fa:16:3e:84:18:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.817 2 DEBUG nova.network.os_vif_util [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.820 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <uuid>c783547f-0799-4e53-8cdc-8784800b3c2d</uuid>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <name>instance-00000054</name>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestOtherB-server-2076746266</nova:name>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:33:20</nova:creationTime>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:user uuid="b9b3e9f2523944539f57a1ff5d565cb4">tempest-ServerActionsTestOtherB-463525410-project-member</nova:user>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:project uuid="d876c85b6ca5418eb657e48391a6503b">tempest-ServerActionsTestOtherB-463525410</nova:project>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         <nova:port uuid="d8844f4f-b484-4605-8f20-0bb8b7a50471">
Sep 30 21:33:20 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <system>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="serial">c783547f-0799-4e53-8cdc-8784800b3c2d</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="uuid">c783547f-0799-4e53-8cdc-8784800b3c2d</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </system>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <os>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </os>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <features>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </features>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:84:18:3c"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <target dev="tapd8844f4f-b4"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/console.log" append="off"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <video>
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </video>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:33:20 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:33:20 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:33:20 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:33:20 compute-1 nova_compute[192795]: </domain>
Sep 30 21:33:20 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.821 2 DEBUG nova.virt.libvirt.vif [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:31:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1696557468-network", "vif_mac": "fa:16:3e:84:18:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.821 2 DEBUG nova.network.os_vif_util [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1696557468-network", "vif_mac": "fa:16:3e:84:18:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.822 2 DEBUG nova.network.os_vif_util [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.822 2 DEBUG os_vif [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.824 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.825 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8844f4f-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd8844f4f-b4, col_values=(('external_ids', {'iface-id': 'd8844f4f-b484-4605-8f20-0bb8b7a50471', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:18:3c', 'vm-uuid': 'c783547f-0799-4e53-8cdc-8784800b3c2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:20 compute-1 NetworkManager[51724]: <info>  [1759268000.8326] manager: (tapd8844f4f-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.843 2 INFO os_vif [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4')
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.898 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.898 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.898 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No VIF found with MAC fa:16:3e:84:18:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.899 2 INFO nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Using config drive
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.899 2 DEBUG nova.compute.manager [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:33:20 compute-1 nova_compute[192795]: 2025-09-30 21:33:20.900 2 DEBUG nova.virt.libvirt.driver [None req-876b34c2-edbf-42e6-810e-0cdb5816af2f b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.085 2 INFO nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Creating config drive at /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.089 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpowrt07hi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.234 2 DEBUG oslo_concurrency.processutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpowrt07hi" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:21 compute-1 kernel: tap01ee41cf-cc: entered promiscuous mode
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.3024] manager: (tap01ee41cf-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Sep 30 21:33:21 compute-1 ovn_controller[94902]: 2025-09-30T21:33:21Z|00341|binding|INFO|Claiming lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 for this chassis.
Sep 30 21:33:21 compute-1 ovn_controller[94902]: 2025-09-30T21:33:21Z|00342|binding|INFO|01ee41cf-cc1d-4380-8c0b-59da2846c8f9: Claiming fa:16:3e:b8:bd:70 10.100.0.10
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.3237] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.3246] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Sep 30 21:33:21 compute-1 systemd-udevd[234946]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.329 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:bd:70 10.100.0.10'], port_security=['fa:16:3e:b8:bd:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '210f7950-671f-44dd-8721-fac4227dd74b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05b270a8-0653-4995-ab43-826254451140', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b33d27d5088343569f4459643d0da580', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e317a017-785e-4ba5-91f5-e79d51ecd764', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=babcc0e7-e8b1-4d4c-a7eb-2970ba0dde6e, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=01ee41cf-cc1d-4380-8c0b-59da2846c8f9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.330 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 in datapath 05b270a8-0653-4995-ab43-826254451140 bound to our chassis
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.332 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 05b270a8-0653-4995-ab43-826254451140
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.3490] device (tap01ee41cf-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.3505] device (tap01ee41cf-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.354 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[30f99fc6-f619-45af-88cc-eba8c0482e09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.355 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap05b270a8-01 in ovnmeta-05b270a8-0653-4995-ab43-826254451140 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:33:21 compute-1 systemd-machined[152783]: New machine qemu-43-instance-0000005c.
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.363 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap05b270a8-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.363 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3b51af-9a81-4149-8854-6d3101fb688a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.364 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c110b0d7-d5c0-4894-bcc9-489cecb3bfd4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.382 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[47dd9b92-4745-4e71-a953-8b4f147c3e35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.411 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5a592b-d8db-459a-8740-86388dc86953]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 systemd[1]: Started Virtual Machine qemu-43-instance-0000005c.
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.455 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5d8aa6-dc44-48f2-a6e6-04f6807d5783]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.4677] manager: (tap05b270a8-00): new Veth device (/org/freedesktop/NetworkManager/Devices/174)
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.468 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f07b1e30-a364-47e3-86e5-7f9da420f253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 ovn_controller[94902]: 2025-09-30T21:33:21Z|00343|binding|INFO|Setting lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 ovn-installed in OVS
Sep 30 21:33:21 compute-1 ovn_controller[94902]: 2025-09-30T21:33:21Z|00344|binding|INFO|Setting lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 up in Southbound
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.506 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[669805ba-d280-4459-9720-e74f8f5ae609]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.510 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[fa836283-c7b7-4b05-b6c2-f2bd82de7f6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.5389] device (tap05b270a8-00): carrier: link connected
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.545 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[bcdce2ad-f13c-4f31-b863-9df99cb14a10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.565 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[668f5638-1dc6-4bbc-b0e2-184fd2fb61f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05b270a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:8b:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469721, 'reachable_time': 44685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234985, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.589 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ff9a27-8d3d-42be-861e-9080f4c19d05]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:8b80'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469721, 'tstamp': 469721}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234986, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.609 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb762e9-6251-48b1-80a3-92334c669963]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05b270a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:8b:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469721, 'reachable_time': 44685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234987, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.644 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cfe8cc-7ed9-499e-aad3-ed9c532ca97b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.740 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1c292697-34a1-4fa6-b647-1def6b28762b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.742 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05b270a8-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.742 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.742 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05b270a8-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.744 2 DEBUG nova.compute.manager [req-0e09f5de-22ef-4b35-b189-6ca99b18be6f req-e262b9c9-a445-4608-951a-23152c1c216b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:21 compute-1 kernel: tap05b270a8-00: entered promiscuous mode
Sep 30 21:33:21 compute-1 NetworkManager[51724]: <info>  [1759268001.7456] manager: (tap05b270a8-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.747 2 DEBUG oslo_concurrency.lockutils [req-0e09f5de-22ef-4b35-b189-6ca99b18be6f req-e262b9c9-a445-4608-951a-23152c1c216b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.748 2 DEBUG oslo_concurrency.lockutils [req-0e09f5de-22ef-4b35-b189-6ca99b18be6f req-e262b9c9-a445-4608-951a-23152c1c216b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.748 2 DEBUG oslo_concurrency.lockutils [req-0e09f5de-22ef-4b35-b189-6ca99b18be6f req-e262b9c9-a445-4608-951a-23152c1c216b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.749 2 DEBUG nova.compute.manager [req-0e09f5de-22ef-4b35-b189-6ca99b18be6f req-e262b9c9-a445-4608-951a-23152c1c216b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Processing event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.756 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap05b270a8-00, col_values=(('external_ids', {'iface-id': '28bea95f-2c3a-4d33-8bad-3373e1efde4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 ovn_controller[94902]: 2025-09-30T21:33:21Z|00345|binding|INFO|Releasing lport 28bea95f-2c3a-4d33-8bad-3373e1efde4f from this chassis (sb_readonly=0)
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 nova_compute[192795]: 2025-09-30 21:33:21.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.777 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.778 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[99834542-d906-4f7e-8649-9ca4823eb886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.779 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-05b270a8-0653-4995-ab43-826254451140
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 05b270a8-0653-4995-ab43-826254451140
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:33:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:21.781 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'env', 'PROCESS_TAG=haproxy-05b270a8-0653-4995-ab43-826254451140', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/05b270a8-0653-4995-ab43-826254451140.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.156 2 DEBUG nova.network.neutron [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Updated VIF entry in instance network info cache for port 01ee41cf-cc1d-4380-8c0b-59da2846c8f9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.157 2 DEBUG nova.network.neutron [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Updating instance_info_cache with network_info: [{"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.175 2 DEBUG oslo_concurrency.lockutils [req-96ec9723-3f78-4688-872d-3f69fa7cab87 req-889eb89f-f740-4ac6-8183-e2e4c36466ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-210f7950-671f-44dd-8721-fac4227dd74b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:22 compute-1 podman[235031]: 2025-09-30 21:33:22.187551151 +0000 UTC m=+0.046920142 container create 8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.211 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.212 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268002.2113326, 210f7950-671f-44dd-8721-fac4227dd74b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.212 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] VM Started (Lifecycle Event)
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.217 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.220 2 INFO nova.virt.libvirt.driver [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance spawned successfully.
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.220 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:33:22 compute-1 systemd[1]: Started libpod-conmon-8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026.scope.
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.237 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.244 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.247 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.248 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.248 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.248 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.249 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.249 2 DEBUG nova.virt.libvirt.driver [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:22 compute-1 podman[235031]: 2025-09-30 21:33:22.159054119 +0000 UTC m=+0.018423120 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:33:22 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:33:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f832ca4d742bf0cce91a72b77a4ba09b77e07c92fc5d936b1e89e49037d8cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.282 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.282 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268002.2130284, 210f7950-671f-44dd-8721-fac4227dd74b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.282 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] VM Paused (Lifecycle Event)
Sep 30 21:33:22 compute-1 podman[235031]: 2025-09-30 21:33:22.285511276 +0000 UTC m=+0.144880297 container init 8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:33:22 compute-1 podman[235031]: 2025-09-30 21:33:22.290675696 +0000 UTC m=+0.150044687 container start 8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.318 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.322 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268002.2161555, 210f7950-671f-44dd-8721-fac4227dd74b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.322 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] VM Resumed (Lifecycle Event)
Sep 30 21:33:22 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235046]: [NOTICE]   (235050) : New worker (235052) forked
Sep 30 21:33:22 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235046]: [NOTICE]   (235050) : Loading success.
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.346 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.349 2 INFO nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Took 4.77 seconds to spawn the instance on the hypervisor.
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.349 2 DEBUG nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.352 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.399 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.445 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.446 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.447 2 DEBUG nova.compute.manager [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Going to confirm migration 16 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.466 2 INFO nova.compute.manager [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Took 5.48 seconds to build instance.
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.475 2 DEBUG nova.objects.instance [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'info_cache' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.488 2 DEBUG oslo_concurrency.lockutils [None req-19e077a3-cb41-412b-a78b-be5996d3cf44 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.699 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.700 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.700 2 DEBUG nova.network.neutron [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:33:22 compute-1 nova_compute[192795]: 2025-09-30 21:33:22.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.802 2 DEBUG oslo_concurrency.lockutils [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.802 2 DEBUG oslo_concurrency.lockutils [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.803 2 DEBUG nova.compute.manager [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.807 2 DEBUG nova.compute.manager [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.808 2 DEBUG nova.objects.instance [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'flavor' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.837 2 DEBUG nova.objects.instance [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'info_cache' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:23 compute-1 nova_compute[192795]: 2025-09-30 21:33:23.874 2 DEBUG nova.virt.libvirt.driver [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.065 2 DEBUG nova.compute.manager [req-66dcb61f-63c6-4ae1-8356-8ed6b15b2c61 req-659dd07a-766d-4f36-9731-b557fe411cc6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.066 2 DEBUG oslo_concurrency.lockutils [req-66dcb61f-63c6-4ae1-8356-8ed6b15b2c61 req-659dd07a-766d-4f36-9731-b557fe411cc6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.066 2 DEBUG oslo_concurrency.lockutils [req-66dcb61f-63c6-4ae1-8356-8ed6b15b2c61 req-659dd07a-766d-4f36-9731-b557fe411cc6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.066 2 DEBUG oslo_concurrency.lockutils [req-66dcb61f-63c6-4ae1-8356-8ed6b15b2c61 req-659dd07a-766d-4f36-9731-b557fe411cc6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.066 2 DEBUG nova.compute.manager [req-66dcb61f-63c6-4ae1-8356-8ed6b15b2c61 req-659dd07a-766d-4f36-9731-b557fe411cc6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] No waiting events found dispatching network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.066 2 WARNING nova.compute.manager [req-66dcb61f-63c6-4ae1-8356-8ed6b15b2c61 req-659dd07a-766d-4f36-9731-b557fe411cc6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received unexpected event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 for instance with vm_state active and task_state powering-off.
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.590 2 DEBUG nova.network.neutron [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.614 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.615 2 DEBUG nova.objects.instance [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'migration_context' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.630 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.630 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.737 2 DEBUG nova.compute.provider_tree [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.750 2 DEBUG nova.scheduler.client.report [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.790 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.790 2 DEBUG nova.compute.manager [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Resized/migrated instance is powered off. Setting vm_state to 'stopped'. _confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4805
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.915 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267989.9058385, 272487f9-0986-45d4-bfde-ccdef8045c03 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.915 2 INFO nova.compute.manager [-] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] VM Stopped (Lifecycle Event)
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.939 2 DEBUG nova.compute.manager [None req-02e323e2-f828-4e9e-b8b9-2ba3499cd841 - - - - - -] [instance: 272487f9-0986-45d4-bfde-ccdef8045c03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:24 compute-1 nova_compute[192795]: 2025-09-30 21:33:24.972 2 INFO nova.scheduler.client.report [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Deleted allocation for migration 8e6864c3-7e97-476b-8041-2ab94162d7e1
Sep 30 21:33:25 compute-1 nova_compute[192795]: 2025-09-30 21:33:25.044 2 DEBUG oslo_concurrency.lockutils [None req-1232e722-9132-4b01-840b-f944ec129157 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:25 compute-1 nova_compute[192795]: 2025-09-30 21:33:25.243 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267990.2431934, c783547f-0799-4e53-8cdc-8784800b3c2d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:25 compute-1 nova_compute[192795]: 2025-09-30 21:33:25.244 2 INFO nova.compute.manager [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] VM Stopped (Lifecycle Event)
Sep 30 21:33:25 compute-1 nova_compute[192795]: 2025-09-30 21:33:25.263 2 DEBUG nova.compute.manager [None req-81a17201-b539-4581-857e-fa9cfcc6f6c2 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:25 compute-1 nova_compute[192795]: 2025-09-30 21:33:25.268 2 DEBUG nova.compute.manager [None req-81a17201-b539-4581-857e-fa9cfcc6f6c2 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:25 compute-1 nova_compute[192795]: 2025-09-30 21:33:25.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:25 compute-1 nova_compute[192795]: 2025-09-30 21:33:25.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:26 compute-1 nova_compute[192795]: 2025-09-30 21:33:26.896 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'flavor' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:26 compute-1 nova_compute[192795]: 2025-09-30 21:33:26.927 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'info_cache' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:26 compute-1 nova_compute[192795]: 2025-09-30 21:33:26.964 2 DEBUG oslo_concurrency.lockutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:26 compute-1 nova_compute[192795]: 2025-09-30 21:33:26.965 2 DEBUG oslo_concurrency.lockutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:26 compute-1 nova_compute[192795]: 2025-09-30 21:33:26.965 2 DEBUG nova.network.neutron [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:33:27 compute-1 podman[235061]: 2025-09-30 21:33:27.242353702 +0000 UTC m=+0.079413604 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Sep 30 21:33:27 compute-1 podman[235063]: 2025-09-30 21:33:27.268336936 +0000 UTC m=+0.097344009 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:33:27 compute-1 podman[235062]: 2025-09-30 21:33:27.268437449 +0000 UTC m=+0.101376199 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:33:27 compute-1 nova_compute[192795]: 2025-09-30 21:33:27.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:27 compute-1 nova_compute[192795]: 2025-09-30 21:33:27.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.888 2 DEBUG nova.network.neutron [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.918 2 DEBUG oslo_concurrency.lockutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-c783547f-0799-4e53-8cdc-8784800b3c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.954 2 INFO nova.virt.libvirt.driver [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance destroyed successfully.
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.955 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'numa_topology' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.972 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'resources' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.989 2 DEBUG nova.virt.libvirt.vif [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:33:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.990 2 DEBUG nova.network.os_vif_util [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.991 2 DEBUG nova.network.os_vif_util [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.992 2 DEBUG os_vif [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.996 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8844f4f-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:29 compute-1 nova_compute[192795]: 2025-09-30 21:33:29.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.008 2 INFO os_vif [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4')
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.019 2 DEBUG nova.virt.libvirt.driver [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Start _get_guest_xml network_info=[{"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.028 2 WARNING nova.virt.libvirt.driver [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.037 2 DEBUG nova.virt.libvirt.host [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.039 2 DEBUG nova.virt.libvirt.host [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.045 2 DEBUG nova.virt.libvirt.host [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.046 2 DEBUG nova.virt.libvirt.host [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.048 2 DEBUG nova.virt.libvirt.driver [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.048 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.049 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.050 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.051 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.051 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.052 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.052 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.053 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.054 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.054 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.055 2 DEBUG nova.virt.hardware [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.055 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'vcpu_model' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.081 2 DEBUG nova.virt.libvirt.vif [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:33:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.082 2 DEBUG nova.network.os_vif_util [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.084 2 DEBUG nova.network.os_vif_util [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.086 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'pci_devices' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.109 2 DEBUG nova.virt.libvirt.driver [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <uuid>c783547f-0799-4e53-8cdc-8784800b3c2d</uuid>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <name>instance-00000054</name>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestOtherB-server-2076746266</nova:name>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:33:30</nova:creationTime>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:user uuid="b9b3e9f2523944539f57a1ff5d565cb4">tempest-ServerActionsTestOtherB-463525410-project-member</nova:user>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:project uuid="d876c85b6ca5418eb657e48391a6503b">tempest-ServerActionsTestOtherB-463525410</nova:project>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         <nova:port uuid="d8844f4f-b484-4605-8f20-0bb8b7a50471">
Sep 30 21:33:30 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <system>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <entry name="serial">c783547f-0799-4e53-8cdc-8784800b3c2d</entry>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <entry name="uuid">c783547f-0799-4e53-8cdc-8784800b3c2d</entry>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </system>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <os>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   </os>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <features>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   </features>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk.config"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:84:18:3c"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <target dev="tapd8844f4f-b4"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/console.log" append="off"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <video>
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </video>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:33:30 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:33:30 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:33:30 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:33:30 compute-1 nova_compute[192795]: </domain>
Sep 30 21:33:30 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.122 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.221 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.224 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.288 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.291 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'trusted_certs' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.315 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.389 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.392 2 DEBUG nova.virt.disk.api [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Checking if we can resize image /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.393 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.460 2 DEBUG oslo_concurrency.processutils [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.462 2 DEBUG nova.virt.disk.api [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Cannot resize image /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.463 2 DEBUG nova.objects.instance [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'migration_context' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.481 2 DEBUG nova.virt.libvirt.vif [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.483 2 DEBUG nova.network.os_vif_util [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.484 2 DEBUG nova.network.os_vif_util [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.485 2 DEBUG os_vif [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.495 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8844f4f-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.496 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd8844f4f-b4, col_values=(('external_ids', {'iface-id': 'd8844f4f-b484-4605-8f20-0bb8b7a50471', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:18:3c', 'vm-uuid': 'c783547f-0799-4e53-8cdc-8784800b3c2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 NetworkManager[51724]: <info>  [1759268010.4996] manager: (tapd8844f4f-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.509 2 INFO os_vif [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4')
Sep 30 21:33:30 compute-1 kernel: tapd8844f4f-b4: entered promiscuous mode
Sep 30 21:33:30 compute-1 NetworkManager[51724]: <info>  [1759268010.6387] manager: (tapd8844f4f-b4): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Sep 30 21:33:30 compute-1 ovn_controller[94902]: 2025-09-30T21:33:30Z|00346|binding|INFO|Claiming lport d8844f4f-b484-4605-8f20-0bb8b7a50471 for this chassis.
Sep 30 21:33:30 compute-1 ovn_controller[94902]: 2025-09-30T21:33:30Z|00347|binding|INFO|d8844f4f-b484-4605-8f20-0bb8b7a50471: Claiming fa:16:3e:84:18:3c 10.100.0.5
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.656 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:18:3c 10.100.0.5'], port_security=['fa:16:3e:84:18:3c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6d444175-4bc9-45e9-8b74-e4555ef5d88b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d8844f4f-b484-4605-8f20-0bb8b7a50471) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.659 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d8844f4f-b484-4605-8f20-0bb8b7a50471 in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 bound to our chassis
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.662 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.693 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a54b807b-9733-4b7d-be1a-f4802bf80c5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.695 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap91c84c55-91 in ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:33:30 compute-1 ovn_controller[94902]: 2025-09-30T21:33:30Z|00348|binding|INFO|Setting lport d8844f4f-b484-4605-8f20-0bb8b7a50471 ovn-installed in OVS
Sep 30 21:33:30 compute-1 ovn_controller[94902]: 2025-09-30T21:33:30Z|00349|binding|INFO|Setting lport d8844f4f-b484-4605-8f20-0bb8b7a50471 up in Southbound
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.697 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap91c84c55-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.697 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5659dec5-7853-4f7c-8378-a6e251789582]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.699 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ea627e-be36-40fa-baff-de1d89e1e710]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 nova_compute[192795]: 2025-09-30 21:33:30.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-1 systemd-machined[152783]: New machine qemu-44-instance-00000054.
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.728 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[33d862a0-b709-481b-bb4a-dff5cc82e848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 systemd[1]: Started Virtual Machine qemu-44-instance-00000054.
Sep 30 21:33:30 compute-1 systemd-udevd[235156]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.747 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fd45b382-25ab-4e7d-baad-c51824cef78b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 NetworkManager[51724]: <info>  [1759268010.7776] device (tapd8844f4f-b4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:33:30 compute-1 NetworkManager[51724]: <info>  [1759268010.7797] device (tapd8844f4f-b4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.793 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[8f759db1-50f2-4f5b-bf15-dd17798d73c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 systemd-udevd[235161]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:33:30 compute-1 NetworkManager[51724]: <info>  [1759268010.8051] manager: (tap91c84c55-90): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.806 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[83174eef-367b-409b-90e1-59dbe7d03c0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.863 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc513b1-e553-447c-a976-4bf79b8a83bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.867 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[da108a59-ebc5-4011-b7f5-a568e1d82bc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 NetworkManager[51724]: <info>  [1759268010.9004] device (tap91c84c55-90): carrier: link connected
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.911 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[54a319de-f2c1-4aab-8432-83909423453e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.939 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f685aa69-20a1-4575-9430-2496c4616a65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470657, 'reachable_time': 22181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235186, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.967 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5c2f0c-338a-459b-8d70-d96ab2c1f160]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:a7ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 470657, 'tstamp': 470657}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235187, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:30.990 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[88274110-9a18-46f2-a838-e6f6d919f914]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470657, 'reachable_time': 22181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235188, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.045 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[38f6d11d-c23f-4bba-8803-63ea82292e7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.149 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[336b89ad-a7f2-4123-8fc6-12914405b0d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.152 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.152 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.153 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91c84c55-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:31 compute-1 kernel: tap91c84c55-90: entered promiscuous mode
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:31 compute-1 NetworkManager[51724]: <info>  [1759268011.1928] manager: (tap91c84c55-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.193 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap91c84c55-90, col_values=(('external_ids', {'iface-id': '3996e682-c20c-41c5-9547-9688a18f316c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.196 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:33:31 compute-1 ovn_controller[94902]: 2025-09-30T21:33:31Z|00350|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.198 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[25b67b5f-4688-4d95-8adc-4afbf6e2abaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.198 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:33:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:31.199 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'env', 'PROCESS_TAG=haproxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/91c84c55-96ab-4682-a6e7-9e96514ca8a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:31 compute-1 podman[235220]: 2025-09-30 21:33:31.643924462 +0000 UTC m=+0.067518850 container create d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:33:31 compute-1 systemd[1]: Started libpod-conmon-d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b.scope.
Sep 30 21:33:31 compute-1 podman[235220]: 2025-09-30 21:33:31.617160273 +0000 UTC m=+0.040754721 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.715 2 DEBUG nova.compute.manager [req-365f5180-a7b2-4a4d-b57a-619e72b9fa55 req-443188b4-ca40-4e66-a9a9-e4963751af13 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.716 2 DEBUG oslo_concurrency.lockutils [req-365f5180-a7b2-4a4d-b57a-619e72b9fa55 req-443188b4-ca40-4e66-a9a9-e4963751af13 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.716 2 DEBUG oslo_concurrency.lockutils [req-365f5180-a7b2-4a4d-b57a-619e72b9fa55 req-443188b4-ca40-4e66-a9a9-e4963751af13 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.716 2 DEBUG oslo_concurrency.lockutils [req-365f5180-a7b2-4a4d-b57a-619e72b9fa55 req-443188b4-ca40-4e66-a9a9-e4963751af13 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.716 2 DEBUG nova.compute.manager [req-365f5180-a7b2-4a4d-b57a-619e72b9fa55 req-443188b4-ca40-4e66-a9a9-e4963751af13 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] No waiting events found dispatching network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:31 compute-1 nova_compute[192795]: 2025-09-30 21:33:31.716 2 WARNING nova.compute.manager [req-365f5180-a7b2-4a4d-b57a-619e72b9fa55 req-443188b4-ca40-4e66-a9a9-e4963751af13 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received unexpected event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 for instance with vm_state stopped and task_state powering-on.
Sep 30 21:33:31 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:33:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d27999c4da3bda33f98c5441286731264cb078f4c111bc600ff29776e16d362d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:33:31 compute-1 podman[235220]: 2025-09-30 21:33:31.751637606 +0000 UTC m=+0.175232054 container init d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:33:31 compute-1 podman[235220]: 2025-09-30 21:33:31.756803797 +0000 UTC m=+0.180398225 container start d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:33:31 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [NOTICE]   (235240) : New worker (235244) forked
Sep 30 21:33:31 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [NOTICE]   (235240) : Loading success.
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.341 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268012.3407655, c783547f-0799-4e53-8cdc-8784800b3c2d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.342 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] VM Resumed (Lifecycle Event)
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.345 2 DEBUG nova.compute.manager [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.355 2 INFO nova.virt.libvirt.driver [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance rebooted successfully.
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.356 2 DEBUG nova.compute.manager [None req-65913926-23bf-4226-870e-2f4d74b4a6a4 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.442 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.445 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.465 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] During sync_power_state the instance has a pending task (powering-on). Skip.
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.467 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268012.341034, c783547f-0799-4e53-8cdc-8784800b3c2d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.467 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] VM Started (Lifecycle Event)
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.495 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.500 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:32 compute-1 ovn_controller[94902]: 2025-09-30T21:33:32Z|00351|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:33:32 compute-1 ovn_controller[94902]: 2025-09-30T21:33:32Z|00352|binding|INFO|Releasing lport 28bea95f-2c3a-4d33-8bad-3373e1efde4f from this chassis (sb_readonly=0)
Sep 30 21:33:32 compute-1 nova_compute[192795]: 2025-09-30 21:33:32.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:33 compute-1 nova_compute[192795]: 2025-09-30 21:33:33.895 2 DEBUG nova.compute.manager [req-6274b87b-037b-4044-b616-a2ce8f4fec69 req-be911cc6-8251-4d8b-abec-493d202a7303 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:33 compute-1 nova_compute[192795]: 2025-09-30 21:33:33.898 2 DEBUG oslo_concurrency.lockutils [req-6274b87b-037b-4044-b616-a2ce8f4fec69 req-be911cc6-8251-4d8b-abec-493d202a7303 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:33 compute-1 nova_compute[192795]: 2025-09-30 21:33:33.899 2 DEBUG oslo_concurrency.lockutils [req-6274b87b-037b-4044-b616-a2ce8f4fec69 req-be911cc6-8251-4d8b-abec-493d202a7303 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:33 compute-1 nova_compute[192795]: 2025-09-30 21:33:33.899 2 DEBUG oslo_concurrency.lockutils [req-6274b87b-037b-4044-b616-a2ce8f4fec69 req-be911cc6-8251-4d8b-abec-493d202a7303 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:33 compute-1 nova_compute[192795]: 2025-09-30 21:33:33.900 2 DEBUG nova.compute.manager [req-6274b87b-037b-4044-b616-a2ce8f4fec69 req-be911cc6-8251-4d8b-abec-493d202a7303 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] No waiting events found dispatching network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:33 compute-1 nova_compute[192795]: 2025-09-30 21:33:33.901 2 WARNING nova.compute.manager [req-6274b87b-037b-4044-b616-a2ce8f4fec69 req-be911cc6-8251-4d8b-abec-493d202a7303 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received unexpected event network-vif-plugged-d8844f4f-b484-4605-8f20-0bb8b7a50471 for instance with vm_state active and task_state None.
Sep 30 21:33:33 compute-1 nova_compute[192795]: 2025-09-30 21:33:33.947 2 DEBUG nova.virt.libvirt.driver [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:33:34 compute-1 ovn_controller[94902]: 2025-09-30T21:33:34Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:bd:70 10.100.0.10
Sep 30 21:33:34 compute-1 ovn_controller[94902]: 2025-09-30T21:33:34Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:bd:70 10.100.0.10
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.590 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.591 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.591 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.592 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.592 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.697 2 INFO nova.compute.manager [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Terminating instance
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.795 2 DEBUG nova.compute.manager [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:33:34 compute-1 kernel: tapd8844f4f-b4 (unregistering): left promiscuous mode
Sep 30 21:33:34 compute-1 NetworkManager[51724]: <info>  [1759268014.8260] device (tapd8844f4f-b4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:33:34 compute-1 ovn_controller[94902]: 2025-09-30T21:33:34Z|00353|binding|INFO|Releasing lport d8844f4f-b484-4605-8f20-0bb8b7a50471 from this chassis (sb_readonly=0)
Sep 30 21:33:34 compute-1 ovn_controller[94902]: 2025-09-30T21:33:34Z|00354|binding|INFO|Setting lport d8844f4f-b484-4605-8f20-0bb8b7a50471 down in Southbound
Sep 30 21:33:34 compute-1 ovn_controller[94902]: 2025-09-30T21:33:34Z|00355|binding|INFO|Removing iface tapd8844f4f-b4 ovn-installed in OVS
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:34 compute-1 nova_compute[192795]: 2025-09-30 21:33:34.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:34.888 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:18:3c 10.100.0.5'], port_security=['fa:16:3e:84:18:3c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c783547f-0799-4e53-8cdc-8784800b3c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6d444175-4bc9-45e9-8b74-e4555ef5d88b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d8844f4f-b484-4605-8f20-0bb8b7a50471) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:34.890 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d8844f4f-b484-4605-8f20-0bb8b7a50471 in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 unbound from our chassis
Sep 30 21:33:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:34.892 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:33:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:34.893 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ecef31c5-e205-4cc7-9b95-261f3594b92c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:34 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:34.894 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace which is not needed anymore
Sep 30 21:33:34 compute-1 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000054.scope: Deactivated successfully.
Sep 30 21:33:34 compute-1 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000054.scope: Consumed 4.123s CPU time.
Sep 30 21:33:34 compute-1 systemd-machined[152783]: Machine qemu-44-instance-00000054 terminated.
Sep 30 21:33:35 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [NOTICE]   (235240) : haproxy version is 2.8.14-c23fe91
Sep 30 21:33:35 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [NOTICE]   (235240) : path to executable is /usr/sbin/haproxy
Sep 30 21:33:35 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [WARNING]  (235240) : Exiting Master process...
Sep 30 21:33:35 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [WARNING]  (235240) : Exiting Master process...
Sep 30 21:33:35 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [ALERT]    (235240) : Current worker (235244) exited with code 143 (Terminated)
Sep 30 21:33:35 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[235236]: [WARNING]  (235240) : All workers exited. Exiting... (0)
Sep 30 21:33:35 compute-1 systemd[1]: libpod-d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b.scope: Deactivated successfully.
Sep 30 21:33:35 compute-1 podman[235300]: 2025-09-30 21:33:35.072108801 +0000 UTC m=+0.065717211 container died d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.082 2 INFO nova.virt.libvirt.driver [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Instance destroyed successfully.
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.083 2 DEBUG nova.objects.instance [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'resources' on Instance uuid c783547f-0799-4e53-8cdc-8784800b3c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-d27999c4da3bda33f98c5441286731264cb078f4c111bc600ff29776e16d362d-merged.mount: Deactivated successfully.
Sep 30 21:33:35 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.106 2 DEBUG nova.virt.libvirt.vif [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:31:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-2076746266',display_name='tempest-ServerActionsTestOtherB-server-2076746266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-2076746266',id=84,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGkYCgSHUTuSUzUIAuXJtLTqGK//f64VuPig3h4DAdGVbwuL+Fh6FvgBtW4lbWvdqGtfEAYA8BT52zsalAqCB8JklfZ4tahvlr3WnGK5B2oFxxGbGDUfPfDkJJDH+Xurxg==',key_name='tempest-keypair-2100038533',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-8p80unpl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:33:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=c783547f-0799-4e53-8cdc-8784800b3c2d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.115 2 DEBUG nova.network.os_vif_util [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "address": "fa:16:3e:84:18:3c", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8844f4f-b4", "ovs_interfaceid": "d8844f4f-b484-4605-8f20-0bb8b7a50471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.116 2 DEBUG nova.network.os_vif_util [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.116 2 DEBUG os_vif [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.118 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8844f4f-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.127 2 INFO os_vif [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:18:3c,bridge_name='br-int',has_traffic_filtering=True,id=d8844f4f-b484-4605-8f20-0bb8b7a50471,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8844f4f-b4')
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.127 2 INFO nova.virt.libvirt.driver [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Deleting instance files /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_del
Sep 30 21:33:35 compute-1 podman[235300]: 2025-09-30 21:33:35.128259701 +0000 UTC m=+0.121868061 container cleanup d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.128 2 INFO nova.virt.libvirt.driver [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Deletion of /var/lib/nova/instances/c783547f-0799-4e53-8cdc-8784800b3c2d_del complete
Sep 30 21:33:35 compute-1 systemd[1]: libpod-conmon-d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b.scope: Deactivated successfully.
Sep 30 21:33:35 compute-1 podman[235346]: 2025-09-30 21:33:35.215515917 +0000 UTC m=+0.057238620 container remove d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.222 2 INFO nova.compute.manager [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.223 2 DEBUG oslo.service.loopingcall [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.223 2 DEBUG nova.compute.manager [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.223 2 DEBUG nova.network.neutron [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.225 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7e52e9-25dd-401b-a125-3f5c7f5cfbea]: (4, ('Tue Sep 30 09:33:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b)\nd0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b\nTue Sep 30 09:33:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (d0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b)\nd0cf0fbe80f159d3fa91622eb838dc93200d6fc9d48fbcc215c6e9d3396ab46b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.227 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[642fdcab-ed00-451f-9573-0a624d07d824]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.228 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:35 compute-1 kernel: tap91c84c55-90: left promiscuous mode
Sep 30 21:33:35 compute-1 nova_compute[192795]: 2025-09-30 21:33:35.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.247 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4a8827-0e7d-433f-8f18-d0c754c64353]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.285 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[817032e4-f94e-4358-a1d7-15dc321dcd1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.287 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[816ae11f-65d2-4e20-8f64-787a1678649f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.316 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7d2c98-3ba8-474a-b971-45a45a5ebce9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470645, 'reachable_time': 41702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235361, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.320 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:33:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:35.321 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[5f38b075-fcb8-4e8a-8a34-ce64fce6b1f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:35 compute-1 systemd[1]: run-netns-ovnmeta\x2d91c84c55\x2d96ab\x2d4682\x2da6e7\x2d9e96514ca8a5.mount: Deactivated successfully.
Sep 30 21:33:36 compute-1 nova_compute[192795]: 2025-09-30 21:33:36.787 2 DEBUG nova.network.neutron [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:36 compute-1 nova_compute[192795]: 2025-09-30 21:33:36.811 2 INFO nova.compute.manager [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Took 1.59 seconds to deallocate network for instance.
Sep 30 21:33:36 compute-1 nova_compute[192795]: 2025-09-30 21:33:36.896 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:36 compute-1 nova_compute[192795]: 2025-09-30 21:33:36.896 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:36 compute-1 nova_compute[192795]: 2025-09-30 21:33:36.906 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:36 compute-1 nova_compute[192795]: 2025-09-30 21:33:36.970 2 INFO nova.scheduler.client.report [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Deleted allocations for instance c783547f-0799-4e53-8cdc-8784800b3c2d
Sep 30 21:33:36 compute-1 nova_compute[192795]: 2025-09-30 21:33:36.983 2 DEBUG nova.compute.manager [req-1f5740bf-1aba-4196-aa2c-efc46d7c5a80 req-dce2a42c-a8a5-44b8-9903-fe2cc14deca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Received event network-vif-deleted-d8844f4f-b484-4605-8f20-0bb8b7a50471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.063 2 DEBUG oslo_concurrency.lockutils [None req-5d70c8be-261a-4820-89d8-99cc2c60c9e6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "c783547f-0799-4e53-8cdc-8784800b3c2d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:37 compute-1 kernel: tap01ee41cf-cc (unregistering): left promiscuous mode
Sep 30 21:33:37 compute-1 NetworkManager[51724]: <info>  [1759268017.1772] device (tap01ee41cf-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:37 compute-1 ovn_controller[94902]: 2025-09-30T21:33:37Z|00356|binding|INFO|Releasing lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 from this chassis (sb_readonly=0)
Sep 30 21:33:37 compute-1 ovn_controller[94902]: 2025-09-30T21:33:37Z|00357|binding|INFO|Setting lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 down in Southbound
Sep 30 21:33:37 compute-1 ovn_controller[94902]: 2025-09-30T21:33:37Z|00358|binding|INFO|Removing iface tap01ee41cf-cc ovn-installed in OVS
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.195 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:bd:70 10.100.0.10'], port_security=['fa:16:3e:b8:bd:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '210f7950-671f-44dd-8721-fac4227dd74b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05b270a8-0653-4995-ab43-826254451140', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b33d27d5088343569f4459643d0da580', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e317a017-785e-4ba5-91f5-e79d51ecd764', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=babcc0e7-e8b1-4d4c-a7eb-2970ba0dde6e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=01ee41cf-cc1d-4380-8c0b-59da2846c8f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.197 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 in datapath 05b270a8-0653-4995-ab43-826254451140 unbound from our chassis
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.199 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05b270a8-0653-4995-ab43-826254451140, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.202 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2b1685-f6f1-47a8-b893-69321da8af0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.203 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-05b270a8-0653-4995-ab43-826254451140 namespace which is not needed anymore
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:37 compute-1 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Sep 30 21:33:37 compute-1 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005c.scope: Consumed 13.239s CPU time.
Sep 30 21:33:37 compute-1 systemd-machined[152783]: Machine qemu-43-instance-0000005c terminated.
Sep 30 21:33:37 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235046]: [NOTICE]   (235050) : haproxy version is 2.8.14-c23fe91
Sep 30 21:33:37 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235046]: [NOTICE]   (235050) : path to executable is /usr/sbin/haproxy
Sep 30 21:33:37 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235046]: [WARNING]  (235050) : Exiting Master process...
Sep 30 21:33:37 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235046]: [ALERT]    (235050) : Current worker (235052) exited with code 143 (Terminated)
Sep 30 21:33:37 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235046]: [WARNING]  (235050) : All workers exited. Exiting... (0)
Sep 30 21:33:37 compute-1 systemd[1]: libpod-8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026.scope: Deactivated successfully.
Sep 30 21:33:37 compute-1 podman[235385]: 2025-09-30 21:33:37.359782253 +0000 UTC m=+0.048772729 container died 8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:33:37 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026-userdata-shm.mount: Deactivated successfully.
Sep 30 21:33:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-e6f832ca4d742bf0cce91a72b77a4ba09b77e07c92fc5d936b1e89e49037d8cc-merged.mount: Deactivated successfully.
Sep 30 21:33:37 compute-1 podman[235385]: 2025-09-30 21:33:37.405369005 +0000 UTC m=+0.094359451 container cleanup 8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:33:37 compute-1 systemd[1]: libpod-conmon-8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026.scope: Deactivated successfully.
Sep 30 21:33:37 compute-1 NetworkManager[51724]: <info>  [1759268017.4186] manager: (tap01ee41cf-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Sep 30 21:33:37 compute-1 podman[235401]: 2025-09-30 21:33:37.455975503 +0000 UTC m=+0.070736267 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 21:33:37 compute-1 podman[235428]: 2025-09-30 21:33:37.482043193 +0000 UTC m=+0.046491667 container remove 8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.489 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[061deb71-3ef2-4ac2-9d86-8251ccb69eab]: (4, ('Tue Sep 30 09:33:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140 (8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026)\n8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026\nTue Sep 30 09:33:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140 (8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026)\n8bb5a23e2afe443aeb8dc139546bb0b64abd49a0c727349da97ace2d75529026\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.492 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[38c86ab2-26ea-4cc1-bb37-896790c621f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.493 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05b270a8-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:37 compute-1 kernel: tap05b270a8-00: left promiscuous mode
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.513 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c912e740-22c4-4219-9e2c-f56e361cdf24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.548 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1e238fe0-e2fe-49b2-8086-acfbf5af495a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.550 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bec20ce0-92e9-41c1-a60f-3f4df1bf6f1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.569 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[68192ac1-3cae-4add-9070-2d9739d5507d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469712, 'reachable_time': 37030, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235467, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.572 2 DEBUG nova.compute.manager [req-d146de84-677c-4ab4-b1ef-fc2dd4f15596 req-b0c10fb5-7d11-48a9-b63d-07608915df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-vif-unplugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.572 2 DEBUG oslo_concurrency.lockutils [req-d146de84-677c-4ab4-b1ef-fc2dd4f15596 req-b0c10fb5-7d11-48a9-b63d-07608915df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.572 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-05b270a8-0653-4995-ab43-826254451140 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:33:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:37.572 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[277cb268-16b4-43f1-b351-e26dc2749fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.573 2 DEBUG oslo_concurrency.lockutils [req-d146de84-677c-4ab4-b1ef-fc2dd4f15596 req-b0c10fb5-7d11-48a9-b63d-07608915df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.573 2 DEBUG oslo_concurrency.lockutils [req-d146de84-677c-4ab4-b1ef-fc2dd4f15596 req-b0c10fb5-7d11-48a9-b63d-07608915df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.574 2 DEBUG nova.compute.manager [req-d146de84-677c-4ab4-b1ef-fc2dd4f15596 req-b0c10fb5-7d11-48a9-b63d-07608915df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] No waiting events found dispatching network-vif-unplugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.574 2 WARNING nova.compute.manager [req-d146de84-677c-4ab4-b1ef-fc2dd4f15596 req-b0c10fb5-7d11-48a9-b63d-07608915df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received unexpected event network-vif-unplugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 for instance with vm_state active and task_state powering-off.
Sep 30 21:33:37 compute-1 systemd[1]: run-netns-ovnmeta\x2d05b270a8\x2d0653\x2d4995\x2dab43\x2d826254451140.mount: Deactivated successfully.
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.969 2 INFO nova.virt.libvirt.driver [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance shutdown successfully after 14 seconds.
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.974 2 INFO nova.virt.libvirt.driver [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance destroyed successfully.
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.974 2 DEBUG nova.objects.instance [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'numa_topology' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:37 compute-1 nova_compute[192795]: 2025-09-30 21:33:37.991 2 DEBUG nova.compute.manager [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:38 compute-1 nova_compute[192795]: 2025-09-30 21:33:38.068 2 DEBUG oslo_concurrency.lockutils [None req-31561587-0ff2-424b-ac27-21be03aecd88 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 14.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:38.693 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:38.695 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:38.695 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:39 compute-1 nova_compute[192795]: 2025-09-30 21:33:39.673 2 DEBUG nova.compute.manager [req-86e809d3-0557-4dc1-9f35-bee662db5128 req-b99526f1-11a4-4326-b613-57eac5bb44e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:39 compute-1 nova_compute[192795]: 2025-09-30 21:33:39.674 2 DEBUG oslo_concurrency.lockutils [req-86e809d3-0557-4dc1-9f35-bee662db5128 req-b99526f1-11a4-4326-b613-57eac5bb44e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:39 compute-1 nova_compute[192795]: 2025-09-30 21:33:39.674 2 DEBUG oslo_concurrency.lockutils [req-86e809d3-0557-4dc1-9f35-bee662db5128 req-b99526f1-11a4-4326-b613-57eac5bb44e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:39 compute-1 nova_compute[192795]: 2025-09-30 21:33:39.674 2 DEBUG oslo_concurrency.lockutils [req-86e809d3-0557-4dc1-9f35-bee662db5128 req-b99526f1-11a4-4326-b613-57eac5bb44e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:39 compute-1 nova_compute[192795]: 2025-09-30 21:33:39.675 2 DEBUG nova.compute.manager [req-86e809d3-0557-4dc1-9f35-bee662db5128 req-b99526f1-11a4-4326-b613-57eac5bb44e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] No waiting events found dispatching network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:39 compute-1 nova_compute[192795]: 2025-09-30 21:33:39.675 2 WARNING nova.compute.manager [req-86e809d3-0557-4dc1-9f35-bee662db5128 req-b99526f1-11a4-4326-b613-57eac5bb44e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received unexpected event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 for instance with vm_state stopped and task_state None.
Sep 30 21:33:40 compute-1 nova_compute[192795]: 2025-09-30 21:33:40.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:40 compute-1 nova_compute[192795]: 2025-09-30 21:33:40.963 2 INFO nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Rebuilding instance
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.273 2 DEBUG nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.336 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'pci_requests' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.351 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'pci_devices' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.369 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'resources' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.381 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'migration_context' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.389 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.393 2 INFO nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance already shutdown.
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.398 2 INFO nova.virt.libvirt.driver [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance destroyed successfully.
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.403 2 INFO nova.virt.libvirt.driver [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance destroyed successfully.
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.404 2 DEBUG nova.virt.libvirt.vif [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-139248646',display_name='tempest-tempest.common.compute-instance-139248646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-139248646',id=92,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-uwei7ucy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActionsTestOtherA-553860193-project-member'},tags
=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:40Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=210f7950-671f-44dd-8721-fac4227dd74b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.404 2 DEBUG nova.network.os_vif_util [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.405 2 DEBUG nova.network.os_vif_util [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.405 2 DEBUG os_vif [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.407 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01ee41cf-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.412 2 INFO os_vif [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc')
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.413 2 INFO nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Deleting instance files /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b_del
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.413 2 INFO nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Deletion of /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b_del complete
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.633 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.634 2 INFO nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Creating image(s)
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.635 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.635 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.636 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.636 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:41 compute-1 nova_compute[192795]: 2025-09-30 21:33:41.637 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:42 compute-1 nova_compute[192795]: 2025-09-30 21:33:42.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:42 compute-1 nova_compute[192795]: 2025-09-30 21:33:42.945 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.006 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.007 2 DEBUG nova.virt.images [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] 29834554-3ec3-4459-bfde-932aa778e979 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.009 2 DEBUG nova.privsep.utils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.009 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.219 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.226 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.288 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.290 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.317 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.379 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.381 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.382 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.407 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.468 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.470 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.519 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.521 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.521 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.586 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.587 2 DEBUG nova.virt.disk.api [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Checking if we can resize image /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.588 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.660 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.662 2 DEBUG nova.virt.disk.api [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Cannot resize image /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.663 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.664 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Ensure instance console log exists: /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.664 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.665 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.666 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.670 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Start _get_guest_xml network_info=[{"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.677 2 WARNING nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.683 2 DEBUG nova.virt.libvirt.host [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.684 2 DEBUG nova.virt.libvirt.host [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.691 2 DEBUG nova.virt.libvirt.host [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.692 2 DEBUG nova.virt.libvirt.host [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.693 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.693 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.694 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.694 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.694 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.694 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.694 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.694 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.695 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.695 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.695 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.695 2 DEBUG nova.virt.hardware [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.695 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.718 2 DEBUG nova.virt.libvirt.vif [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-139248646',display_name='tempest-tempest.common.compute-instance-139248646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-139248646',id=92,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-uwei7ucy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActionsTestOtherA-553860193-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:41Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=210f7950-671f-44dd-8721-fac4227dd74b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.718 2 DEBUG nova.network.os_vif_util [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.719 2 DEBUG nova.network.os_vif_util [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.720 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <uuid>210f7950-671f-44dd-8721-fac4227dd74b</uuid>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <name>instance-0000005c</name>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <nova:name>tempest-tempest.common.compute-instance-139248646</nova:name>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:33:43</nova:creationTime>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:user uuid="c27a02706b9d43ffaab2c5fa833fec04">tempest-ServerActionsTestOtherA-553860193-project-member</nova:user>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:project uuid="b33d27d5088343569f4459643d0da580">tempest-ServerActionsTestOtherA-553860193</nova:project>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         <nova:port uuid="01ee41cf-cc1d-4380-8c0b-59da2846c8f9">
Sep 30 21:33:43 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <system>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <entry name="serial">210f7950-671f-44dd-8721-fac4227dd74b</entry>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <entry name="uuid">210f7950-671f-44dd-8721-fac4227dd74b</entry>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </system>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <os>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   </os>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <features>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   </features>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:b8:bd:70"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <target dev="tap01ee41cf-cc"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/console.log" append="off"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <video>
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </video>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:33:43 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:33:43 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:33:43 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:33:43 compute-1 nova_compute[192795]: </domain>
Sep 30 21:33:43 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.722 2 DEBUG nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Preparing to wait for external event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.723 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.723 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.724 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.725 2 DEBUG nova.virt.libvirt.vif [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-139248646',display_name='tempest-tempest.common.compute-instance-139248646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-139248646',id=92,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-uwei7ucy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerAct
ionsTestOtherA-553860193-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:41Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=210f7950-671f-44dd-8721-fac4227dd74b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.725 2 DEBUG nova.network.os_vif_util [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.726 2 DEBUG nova.network.os_vif_util [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.726 2 DEBUG os_vif [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.732 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01ee41cf-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.736 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01ee41cf-cc, col_values=(('external_ids', {'iface-id': '01ee41cf-cc1d-4380-8c0b-59da2846c8f9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:bd:70', 'vm-uuid': '210f7950-671f-44dd-8721-fac4227dd74b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:43 compute-1 NetworkManager[51724]: <info>  [1759268023.7394] manager: (tap01ee41cf-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.744 2 INFO os_vif [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc')
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.802 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.802 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.802 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No VIF found with MAC fa:16:3e:b8:bd:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.803 2 INFO nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Using config drive
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.819 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:43 compute-1 nova_compute[192795]: 2025-09-30 21:33:43.852 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'keypairs' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.306 2 INFO nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Creating config drive at /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.312 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnnsqj7sc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.444 2 DEBUG oslo_concurrency.processutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnnsqj7sc" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:44 compute-1 kernel: tap01ee41cf-cc: entered promiscuous mode
Sep 30 21:33:44 compute-1 NetworkManager[51724]: <info>  [1759268024.5402] manager: (tap01ee41cf-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/182)
Sep 30 21:33:44 compute-1 ovn_controller[94902]: 2025-09-30T21:33:44Z|00359|binding|INFO|Claiming lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 for this chassis.
Sep 30 21:33:44 compute-1 ovn_controller[94902]: 2025-09-30T21:33:44Z|00360|binding|INFO|01ee41cf-cc1d-4380-8c0b-59da2846c8f9: Claiming fa:16:3e:b8:bd:70 10.100.0.10
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:44 compute-1 ovn_controller[94902]: 2025-09-30T21:33:44Z|00361|binding|INFO|Setting lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 ovn-installed in OVS
Sep 30 21:33:44 compute-1 ovn_controller[94902]: 2025-09-30T21:33:44Z|00362|binding|INFO|Setting lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 up in Southbound
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.553 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:bd:70 10.100.0.10'], port_security=['fa:16:3e:b8:bd:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '210f7950-671f-44dd-8721-fac4227dd74b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05b270a8-0653-4995-ab43-826254451140', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b33d27d5088343569f4459643d0da580', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e317a017-785e-4ba5-91f5-e79d51ecd764', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=babcc0e7-e8b1-4d4c-a7eb-2970ba0dde6e, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=01ee41cf-cc1d-4380-8c0b-59da2846c8f9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.558 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 in datapath 05b270a8-0653-4995-ab43-826254451140 bound to our chassis
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.561 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 05b270a8-0653-4995-ab43-826254451140
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.580 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8314dd0e-9cbb-4d50-b5b7-116fc216c24a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.582 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap05b270a8-01 in ovnmeta-05b270a8-0653-4995-ab43-826254451140 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:33:44 compute-1 systemd-udevd[235517]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.585 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap05b270a8-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.585 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fe217840-e956-4868-bb37-0fee0fd4288b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.586 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4cfedbf4-7691-4606-8ac8-ebec7d15fed0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 systemd-machined[152783]: New machine qemu-45-instance-0000005c.
Sep 30 21:33:44 compute-1 NetworkManager[51724]: <info>  [1759268024.6005] device (tap01ee41cf-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.601 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[45c9ff85-8e67-4650-8731-aa4d2f7e7f73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 NetworkManager[51724]: <info>  [1759268024.6039] device (tap01ee41cf-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:33:44 compute-1 systemd[1]: Started Virtual Machine qemu-45-instance-0000005c.
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.628 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d05d4e-03f1-47d7-a96d-7e5b52da8407]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.665 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[388e1ec2-73e2-458c-b43e-0628f98564a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.671 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e45d2d19-5b13-4183-bc63-b3629f360aaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 NetworkManager[51724]: <info>  [1759268024.6731] manager: (tap05b270a8-00): new Veth device (/org/freedesktop/NetworkManager/Devices/183)
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.725 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[62ca813f-7572-4dd5-8881-8101e7a8376a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.728 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[fa31223d-71c9-475f-a308-34996c923611]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 NetworkManager[51724]: <info>  [1759268024.7634] device (tap05b270a8-00): carrier: link connected
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.771 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[1eba16c6-3699-4b3d-8ddf-138425dcc512]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.799 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[991d8edb-6a3f-4540-8653-a50a067df6b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05b270a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:8b:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472043, 'reachable_time': 42652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235550, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.832 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[890bd104-b286-4e2d-bbe2-7cc56a82744d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:8b80'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 472043, 'tstamp': 472043}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235551, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.860 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a7cadfc3-2717-4c36-8de9-a3c2926c8a7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05b270a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:8b:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472043, 'reachable_time': 42652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235552, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.877 2 DEBUG nova.compute.manager [req-875215ec-c69c-4677-9178-1d0af5b426fe req-35584a03-a475-4de4-81ef-ac92fc481125 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.877 2 DEBUG oslo_concurrency.lockutils [req-875215ec-c69c-4677-9178-1d0af5b426fe req-35584a03-a475-4de4-81ef-ac92fc481125 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.878 2 DEBUG oslo_concurrency.lockutils [req-875215ec-c69c-4677-9178-1d0af5b426fe req-35584a03-a475-4de4-81ef-ac92fc481125 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.878 2 DEBUG oslo_concurrency.lockutils [req-875215ec-c69c-4677-9178-1d0af5b426fe req-35584a03-a475-4de4-81ef-ac92fc481125 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:44 compute-1 nova_compute[192795]: 2025-09-30 21:33:44.878 2 DEBUG nova.compute.manager [req-875215ec-c69c-4677-9178-1d0af5b426fe req-35584a03-a475-4de4-81ef-ac92fc481125 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Processing event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:33:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:44.922 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2b840b32-67ee-4932-88d4-01ebd2460853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.015 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a58e4033-b840-450c-ba22-4af550a8f903]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.017 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05b270a8-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.018 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.019 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05b270a8-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:45 compute-1 kernel: tap05b270a8-00: entered promiscuous mode
Sep 30 21:33:45 compute-1 NetworkManager[51724]: <info>  [1759268025.0237] manager: (tap05b270a8-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.028 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap05b270a8-00, col_values=(('external_ids', {'iface-id': '28bea95f-2c3a-4d33-8bad-3373e1efde4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:45 compute-1 ovn_controller[94902]: 2025-09-30T21:33:45Z|00363|binding|INFO|Releasing lport 28bea95f-2c3a-4d33-8bad-3373e1efde4f from this chassis (sb_readonly=0)
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.032 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.040 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[57519f06-c4cd-49bc-b7fd-4f5ef21067ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.041 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-05b270a8-0653-4995-ab43-826254451140
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 05b270a8-0653-4995-ab43-826254451140
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:33:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:45.043 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'env', 'PROCESS_TAG=haproxy-05b270a8-0653-4995-ab43-826254451140', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/05b270a8-0653-4995-ab43-826254451140.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:33:45 compute-1 podman[235592]: 2025-09-30 21:33:45.503631238 +0000 UTC m=+0.066082051 container create 6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:33:45 compute-1 podman[235592]: 2025-09-30 21:33:45.466350552 +0000 UTC m=+0.028801465 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:33:45 compute-1 systemd[1]: Started libpod-conmon-6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85.scope.
Sep 30 21:33:45 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:33:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/100ea17deaa24d47a55ecd48c66c23e595ccc023e501941d6c39e20667bf8fca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:33:45 compute-1 podman[235592]: 2025-09-30 21:33:45.625857087 +0000 UTC m=+0.188307910 container init 6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:33:45 compute-1 podman[235592]: 2025-09-30 21:33:45.637950397 +0000 UTC m=+0.200401210 container start 6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:33:45 compute-1 podman[235610]: 2025-09-30 21:33:45.644586108 +0000 UTC m=+0.075187180 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:33:45 compute-1 podman[235605]: 2025-09-30 21:33:45.669311061 +0000 UTC m=+0.103627974 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:33:45 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235621]: [NOTICE]   (235665) : New worker (235675) forked
Sep 30 21:33:45 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235621]: [NOTICE]   (235665) : Loading success.
Sep 30 21:33:45 compute-1 podman[235609]: 2025-09-30 21:33:45.686762437 +0000 UTC m=+0.122318203 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.805 2 DEBUG nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.806 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 210f7950-671f-44dd-8721-fac4227dd74b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.806 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268025.804442, 210f7950-671f-44dd-8721-fac4227dd74b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.807 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] VM Started (Lifecycle Event)
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.812 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.817 2 INFO nova.virt.libvirt.driver [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance spawned successfully.
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.817 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.863 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.868 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.868 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.868 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.869 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.869 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.870 2 DEBUG nova.virt.libvirt.driver [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.874 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.909 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.910 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268025.8047893, 210f7950-671f-44dd-8721-fac4227dd74b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.910 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] VM Paused (Lifecycle Event)
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.962 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.967 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268025.8123791, 210f7950-671f-44dd-8721-fac4227dd74b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.967 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] VM Resumed (Lifecycle Event)
Sep 30 21:33:45 compute-1 nova_compute[192795]: 2025-09-30 21:33:45.995 2 DEBUG nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.000 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.010 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.062 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.116 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.120 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.156 2 INFO nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] bringing vm to original state: 'stopped'
Sep 30 21:33:46 compute-1 unix_chkpwd[235686]: password check failed for user (root)
Sep 30 21:33:46 compute-1 sshd-session[235569]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.285 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.286 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.287 2 DEBUG nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.292 2 DEBUG nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:33:46 compute-1 kernel: tap01ee41cf-cc (unregistering): left promiscuous mode
Sep 30 21:33:46 compute-1 NetworkManager[51724]: <info>  [1759268026.3526] device (tap01ee41cf-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-1 ovn_controller[94902]: 2025-09-30T21:33:46Z|00364|binding|INFO|Releasing lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 from this chassis (sb_readonly=0)
Sep 30 21:33:46 compute-1 ovn_controller[94902]: 2025-09-30T21:33:46Z|00365|binding|INFO|Setting lport 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 down in Southbound
Sep 30 21:33:46 compute-1 ovn_controller[94902]: 2025-09-30T21:33:46Z|00366|binding|INFO|Removing iface tap01ee41cf-cc ovn-installed in OVS
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.387 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:bd:70 10.100.0.10'], port_security=['fa:16:3e:b8:bd:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '210f7950-671f-44dd-8721-fac4227dd74b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05b270a8-0653-4995-ab43-826254451140', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b33d27d5088343569f4459643d0da580', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e317a017-785e-4ba5-91f5-e79d51ecd764', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=babcc0e7-e8b1-4d4c-a7eb-2970ba0dde6e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=01ee41cf-cc1d-4380-8c0b-59da2846c8f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.388 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 01ee41cf-cc1d-4380-8c0b-59da2846c8f9 in datapath 05b270a8-0653-4995-ab43-826254451140 unbound from our chassis
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.390 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05b270a8-0653-4995-ab43-826254451140, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.391 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c77356-0c12-4df3-86de-e2f502295690]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.392 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-05b270a8-0653-4995-ab43-826254451140 namespace which is not needed anymore
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-1 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Sep 30 21:33:46 compute-1 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005c.scope: Consumed 1.489s CPU time.
Sep 30 21:33:46 compute-1 systemd-machined[152783]: Machine qemu-45-instance-0000005c terminated.
Sep 30 21:33:46 compute-1 NetworkManager[51724]: <info>  [1759268026.5557] manager: (tap01ee41cf-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/185)
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235621]: [NOTICE]   (235665) : haproxy version is 2.8.14-c23fe91
Sep 30 21:33:46 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235621]: [NOTICE]   (235665) : path to executable is /usr/sbin/haproxy
Sep 30 21:33:46 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235621]: [WARNING]  (235665) : Exiting Master process...
Sep 30 21:33:46 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235621]: [ALERT]    (235665) : Current worker (235675) exited with code 143 (Terminated)
Sep 30 21:33:46 compute-1 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[235621]: [WARNING]  (235665) : All workers exited. Exiting... (0)
Sep 30 21:33:46 compute-1 systemd[1]: libpod-6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85.scope: Deactivated successfully.
Sep 30 21:33:46 compute-1 conmon[235621]: conmon 6a8b1af66baa75f20a84 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85.scope/container/memory.events
Sep 30 21:33:46 compute-1 podman[235709]: 2025-09-30 21:33:46.579954196 +0000 UTC m=+0.061225829 container died 6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.613 2 INFO nova.virt.libvirt.driver [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance destroyed successfully.
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.615 2 DEBUG nova.compute.manager [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:46 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85-userdata-shm.mount: Deactivated successfully.
Sep 30 21:33:46 compute-1 systemd[1]: var-lib-containers-storage-overlay-100ea17deaa24d47a55ecd48c66c23e595ccc023e501941d6c39e20667bf8fca-merged.mount: Deactivated successfully.
Sep 30 21:33:46 compute-1 podman[235709]: 2025-09-30 21:33:46.652222224 +0000 UTC m=+0.133493767 container cleanup 6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:33:46 compute-1 systemd[1]: libpod-conmon-6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85.scope: Deactivated successfully.
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.736 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.736 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.737 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:33:46 compute-1 podman[235748]: 2025-09-30 21:33:46.753688218 +0000 UTC m=+0.065238659 container remove 6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.764 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[88a74f23-1d00-4f4a-8748-5632f40583c8]: (4, ('Tue Sep 30 09:33:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140 (6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85)\n6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85\nTue Sep 30 09:33:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140 (6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85)\n6a8b1af66baa75f20a8495b6ab731cfae91229b4f9699bfad07e0deb6819cb85\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.768 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2239f4-cc10-4783-9e93-8fa2e29bbff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.774 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05b270a8-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-1 kernel: tap05b270a8-00: left promiscuous mode
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.784 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 0.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.806 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f20c08-0162-43e9-aeda-8e608889e8ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.843 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[49026172-7966-4cf5-ab3d-54b33507fd45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.845 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f21b267d-ef31-480b-8387-a4bcf52162d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.862 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.872 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[330953e2-e11f-4a12-9208-03d88f23b828]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472032, 'reachable_time': 23068, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235768, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.878 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-05b270a8-0653-4995-ab43-826254451140 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:33:46 compute-1 systemd[1]: run-netns-ovnmeta\x2d05b270a8\x2d0653\x2d4995\x2dab43\x2d826254451140.mount: Deactivated successfully.
Sep 30 21:33:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:46.879 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a7275ea9-9d5a-4b91-a2dc-9e51df6d69cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.897 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.898 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.898 2 DEBUG nova.objects.instance [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.953 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:46 compute-1 nova_compute[192795]: 2025-09-30 21:33:46.955 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.019 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.100 2 DEBUG oslo_concurrency.lockutils [None req-68ab987a-5b42-44d9-ad30-b2a502248f2d c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.188 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.189 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5586MB free_disk=73.3509407043457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.189 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.190 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.210 2 DEBUG nova.compute.manager [req-74efa391-6e31-4cdd-bc9f-5d0737496ce6 req-b1709ae7-87d4-42da-81f7-638d69d024bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.210 2 DEBUG oslo_concurrency.lockutils [req-74efa391-6e31-4cdd-bc9f-5d0737496ce6 req-b1709ae7-87d4-42da-81f7-638d69d024bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.211 2 DEBUG oslo_concurrency.lockutils [req-74efa391-6e31-4cdd-bc9f-5d0737496ce6 req-b1709ae7-87d4-42da-81f7-638d69d024bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.211 2 DEBUG oslo_concurrency.lockutils [req-74efa391-6e31-4cdd-bc9f-5d0737496ce6 req-b1709ae7-87d4-42da-81f7-638d69d024bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.211 2 DEBUG nova.compute.manager [req-74efa391-6e31-4cdd-bc9f-5d0737496ce6 req-b1709ae7-87d4-42da-81f7-638d69d024bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] No waiting events found dispatching network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.212 2 WARNING nova.compute.manager [req-74efa391-6e31-4cdd-bc9f-5d0737496ce6 req-b1709ae7-87d4-42da-81f7-638d69d024bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received unexpected event network-vif-plugged-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 for instance with vm_state stopped and task_state None.
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.263 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 210f7950-671f-44dd-8721-fac4227dd74b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.264 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.264 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.300 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.318 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.407 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.407 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:47 compute-1 sshd-session[235569]: Failed password for root from 185.156.73.233 port 18582 ssh2
Sep 30 21:33:47 compute-1 nova_compute[192795]: 2025-09-30 21:33:47.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:47 compute-1 sshd-session[235569]: Connection closed by authenticating user root 185.156.73.233 port 18582 [preauth]
Sep 30 21:33:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:33:48.123 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:48 compute-1 nova_compute[192795]: 2025-09-30 21:33:48.403 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:48 compute-1 nova_compute[192795]: 2025-09-30 21:33:48.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:49 compute-1 nova_compute[192795]: 2025-09-30 21:33:49.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:49 compute-1 nova_compute[192795]: 2025-09-30 21:33:49.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.075 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268015.0729244, c783547f-0799-4e53-8cdc-8784800b3c2d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.075 2 INFO nova.compute.manager [-] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] VM Stopped (Lifecycle Event)
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.136 2 DEBUG nova.compute.manager [None req-9ae4822d-2afb-4f73-995f-dddf7e6ca03b - - - - - -] [instance: c783547f-0799-4e53-8cdc-8784800b3c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.534 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.534 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.535 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "210f7950-671f-44dd-8721-fac4227dd74b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.535 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.535 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.595 2 INFO nova.compute.manager [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Terminating instance
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.673 2 DEBUG nova.compute.manager [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.681 2 INFO nova.virt.libvirt.driver [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Instance destroyed successfully.
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.682 2 DEBUG nova.objects.instance [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'resources' on Instance uuid 210f7950-671f-44dd-8721-fac4227dd74b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.757 2 DEBUG nova.virt.libvirt.vif [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:33:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-139248646',display_name='tempest-tempest.common.compute-instance-139248646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-139248646',id=92,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:33:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-uwei7ucy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActionsTestOtherA-553860193-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:33:46Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=210f7950-671f-44dd-8721-fac4227dd74b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.758 2 DEBUG nova.network.os_vif_util [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "address": "fa:16:3e:b8:bd:70", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ee41cf-cc", "ovs_interfaceid": "01ee41cf-cc1d-4380-8c0b-59da2846c8f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.759 2 DEBUG nova.network.os_vif_util [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.759 2 DEBUG os_vif [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.761 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01ee41cf-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.768 2 INFO os_vif [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:bd:70,bridge_name='br-int',has_traffic_filtering=True,id=01ee41cf-cc1d-4380-8c0b-59da2846c8f9,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ee41cf-cc')
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.768 2 INFO nova.virt.libvirt.driver [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Deleting instance files /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b_del
Sep 30 21:33:50 compute-1 nova_compute[192795]: 2025-09-30 21:33:50.769 2 INFO nova.virt.libvirt.driver [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Deletion of /var/lib/nova/instances/210f7950-671f-44dd-8721-fac4227dd74b_del complete
Sep 30 21:33:51 compute-1 nova_compute[192795]: 2025-09-30 21:33:51.105 2 INFO nova.compute.manager [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:33:51 compute-1 nova_compute[192795]: 2025-09-30 21:33:51.106 2 DEBUG oslo.service.loopingcall [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:33:51 compute-1 nova_compute[192795]: 2025-09-30 21:33:51.107 2 DEBUG nova.compute.manager [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:33:51 compute-1 nova_compute[192795]: 2025-09-30 21:33:51.107 2 DEBUG nova.network.neutron [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:33:51 compute-1 podman[235775]: 2025-09-30 21:33:51.261869134 +0000 UTC m=+0.096541411 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:33:51 compute-1 nova_compute[192795]: 2025-09-30 21:33:51.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:51 compute-1 nova_compute[192795]: 2025-09-30 21:33:51.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.201 2 DEBUG nova.network.neutron [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.220 2 INFO nova.compute.manager [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Took 1.11 seconds to deallocate network for instance.
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.282 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.283 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.292 2 DEBUG nova.compute.manager [req-cb0aa7ee-74e9-4cae-8681-2a5e16b97bb5 req-38598081-06d8-49a3-b4bd-fe2797a1cdf1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Received event network-vif-deleted-01ee41cf-cc1d-4380-8c0b-59da2846c8f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.345 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.346 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.362 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.367 2 DEBUG nova.compute.provider_tree [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.396 2 DEBUG nova.scheduler.client.report [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.443 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.476 2 INFO nova.scheduler.client.report [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Deleted allocations for instance 210f7950-671f-44dd-8721-fac4227dd74b
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.488 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.488 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.495 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.495 2 INFO nova.compute.claims [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.620 2 DEBUG oslo_concurrency.lockutils [None req-0fd347dd-42f3-4733-9140-abacba91f957 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "210f7950-671f-44dd-8721-fac4227dd74b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.690 2 DEBUG nova.compute.provider_tree [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.723 2 DEBUG nova.scheduler.client.report [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.750 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.751 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.820 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.821 2 DEBUG nova.network.neutron [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.841 2 INFO nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.860 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.983 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.985 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.986 2 INFO nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Creating image(s)
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.987 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "/var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.988 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "/var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:52 compute-1 nova_compute[192795]: 2025-09-30 21:33:52.990 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "/var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.019 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.116 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.118 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.119 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.146 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.233 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.235 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.275 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.277 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.278 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.338 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.340 2 DEBUG nova.virt.disk.api [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Checking if we can resize image /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.340 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.404 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.405 2 DEBUG nova.virt.disk.api [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Cannot resize image /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.405 2 DEBUG nova.objects.instance [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d6350eb-06de-41fb-b4c1-b9eccaf32004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.421 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.421 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Ensure instance console log exists: /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.421 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.422 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.422 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:53 compute-1 nova_compute[192795]: 2025-09-30 21:33:53.981 2 DEBUG nova.policy [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c2095460ad548128b184eef310a91cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '722921aac42f4b118de61dc2fb90ee28', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:33:54 compute-1 nova_compute[192795]: 2025-09-30 21:33:54.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:54 compute-1 nova_compute[192795]: 2025-09-30 21:33:54.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:55 compute-1 nova_compute[192795]: 2025-09-30 21:33:55.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:56 compute-1 nova_compute[192795]: 2025-09-30 21:33:56.389 2 DEBUG nova.network.neutron [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Successfully created port: fa1be576-06d0-4226-be22-23ea6f78268b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.500 2 DEBUG nova.network.neutron [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Successfully updated port: fa1be576-06d0-4226-be22-23ea6f78268b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.517 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "refresh_cache-6d6350eb-06de-41fb-b4c1-b9eccaf32004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.518 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquired lock "refresh_cache-6d6350eb-06de-41fb-b4c1-b9eccaf32004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.518 2 DEBUG nova.network.neutron [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.580 2 DEBUG nova.compute.manager [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received event network-changed-fa1be576-06d0-4226-be22-23ea6f78268b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.581 2 DEBUG nova.compute.manager [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Refreshing instance network info cache due to event network-changed-fa1be576-06d0-4226-be22-23ea6f78268b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.581 2 DEBUG oslo_concurrency.lockutils [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-6d6350eb-06de-41fb-b4c1-b9eccaf32004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.654 2 DEBUG nova.network.neutron [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:33:57 compute-1 nova_compute[192795]: 2025-09-30 21:33:57.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:58 compute-1 podman[235812]: 2025-09-30 21:33:58.232973767 +0000 UTC m=+0.064934250 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:33:58 compute-1 podman[235811]: 2025-09-30 21:33:58.268399862 +0000 UTC m=+0.104821026 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:33:58 compute-1 podman[235813]: 2025-09-30 21:33:58.268369501 +0000 UTC m=+0.092853020 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.942 2 DEBUG nova.network.neutron [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Updating instance_info_cache with network_info: [{"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.973 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Releasing lock "refresh_cache-6d6350eb-06de-41fb-b4c1-b9eccaf32004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.974 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Instance network_info: |[{"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.975 2 DEBUG oslo_concurrency.lockutils [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-6d6350eb-06de-41fb-b4c1-b9eccaf32004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.975 2 DEBUG nova.network.neutron [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Refreshing network info cache for port fa1be576-06d0-4226-be22-23ea6f78268b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.978 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Start _get_guest_xml network_info=[{"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.985 2 WARNING nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.997 2 DEBUG nova.virt.libvirt.host [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:33:59 compute-1 nova_compute[192795]: 2025-09-30 21:33:59.998 2 DEBUG nova.virt.libvirt.host [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.002 2 DEBUG nova.virt.libvirt.host [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.002 2 DEBUG nova.virt.libvirt.host [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.003 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.004 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.004 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.004 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.004 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.005 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.005 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.005 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.005 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.006 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.006 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.006 2 DEBUG nova.virt.hardware [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.009 2 DEBUG nova.virt.libvirt.vif [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:33:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1642627434',display_name='tempest-ServerMetadataNegativeTestJSON-server-1642627434',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1642627434',id=95,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='722921aac42f4b118de61dc2fb90ee28',ramdisk_id='',reservation_id='r-n7ldgz83',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1687512189',owner_user_name=
'tempest-ServerMetadataNegativeTestJSON-1687512189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:52Z,user_data=None,user_id='0c2095460ad548128b184eef310a91cd',uuid=6d6350eb-06de-41fb-b4c1-b9eccaf32004,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.010 2 DEBUG nova.network.os_vif_util [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Converting VIF {"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.010 2 DEBUG nova.network.os_vif_util [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:db:ef,bridge_name='br-int',has_traffic_filtering=True,id=fa1be576-06d0-4226-be22-23ea6f78268b,network=Network(543b1d30-4a8f-440d-912e-5ddc4b10ac21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1be576-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.011 2 DEBUG nova.objects.instance [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d6350eb-06de-41fb-b4c1-b9eccaf32004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.028 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <uuid>6d6350eb-06de-41fb-b4c1-b9eccaf32004</uuid>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <name>instance-0000005f</name>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerMetadataNegativeTestJSON-server-1642627434</nova:name>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:33:59</nova:creationTime>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:user uuid="0c2095460ad548128b184eef310a91cd">tempest-ServerMetadataNegativeTestJSON-1687512189-project-member</nova:user>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:project uuid="722921aac42f4b118de61dc2fb90ee28">tempest-ServerMetadataNegativeTestJSON-1687512189</nova:project>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         <nova:port uuid="fa1be576-06d0-4226-be22-23ea6f78268b">
Sep 30 21:34:00 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <system>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <entry name="serial">6d6350eb-06de-41fb-b4c1-b9eccaf32004</entry>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <entry name="uuid">6d6350eb-06de-41fb-b4c1-b9eccaf32004</entry>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </system>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <os>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   </os>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <features>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   </features>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk.config"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:13:db:ef"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <target dev="tapfa1be576-06"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/console.log" append="off"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <video>
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </video>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:34:00 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:34:00 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:34:00 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:34:00 compute-1 nova_compute[192795]: </domain>
Sep 30 21:34:00 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.029 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Preparing to wait for external event network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.030 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.031 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.031 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.033 2 DEBUG nova.virt.libvirt.vif [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:33:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1642627434',display_name='tempest-ServerMetadataNegativeTestJSON-server-1642627434',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1642627434',id=95,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='722921aac42f4b118de61dc2fb90ee28',ramdisk_id='',reservation_id='r-n7ldgz83',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1687512189',owner_
user_name='tempest-ServerMetadataNegativeTestJSON-1687512189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:33:52Z,user_data=None,user_id='0c2095460ad548128b184eef310a91cd',uuid=6d6350eb-06de-41fb-b4c1-b9eccaf32004,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.033 2 DEBUG nova.network.os_vif_util [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Converting VIF {"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.034 2 DEBUG nova.network.os_vif_util [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:db:ef,bridge_name='br-int',has_traffic_filtering=True,id=fa1be576-06d0-4226-be22-23ea6f78268b,network=Network(543b1d30-4a8f-440d-912e-5ddc4b10ac21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1be576-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.035 2 DEBUG os_vif [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:db:ef,bridge_name='br-int',has_traffic_filtering=True,id=fa1be576-06d0-4226-be22-23ea6f78268b,network=Network(543b1d30-4a8f-440d-912e-5ddc4b10ac21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1be576-06') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.038 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.040 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.048 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa1be576-06, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.049 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa1be576-06, col_values=(('external_ids', {'iface-id': 'fa1be576-06d0-4226-be22-23ea6f78268b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:db:ef', 'vm-uuid': '6d6350eb-06de-41fb-b4c1-b9eccaf32004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:00 compute-1 NetworkManager[51724]: <info>  [1759268040.0546] manager: (tapfa1be576-06): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.064 2 INFO os_vif [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:db:ef,bridge_name='br-int',has_traffic_filtering=True,id=fa1be576-06d0-4226-be22-23ea6f78268b,network=Network(543b1d30-4a8f-440d-912e-5ddc4b10ac21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1be576-06')
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.117 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.117 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.117 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] No VIF found with MAC fa:16:3e:13:db:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.118 2 INFO nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Using config drive
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.720 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:34:00 compute-1 nova_compute[192795]: 2025-09-30 21:34:00.721 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.612 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268026.610572, 210f7950-671f-44dd-8721-fac4227dd74b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.613 2 INFO nova.compute.manager [-] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] VM Stopped (Lifecycle Event)
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.637 2 DEBUG nova.compute.manager [None req-b276a255-38e9-4087-ae18-6ff913c63918 - - - - - -] [instance: 210f7950-671f-44dd-8721-fac4227dd74b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.748 2 INFO nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Creating config drive at /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk.config
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.754 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz7u5zpb7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.892 2 DEBUG oslo_concurrency.processutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz7u5zpb7" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:01 compute-1 kernel: tapfa1be576-06: entered promiscuous mode
Sep 30 21:34:01 compute-1 NetworkManager[51724]: <info>  [1759268041.9733] manager: (tapfa1be576-06): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:01 compute-1 ovn_controller[94902]: 2025-09-30T21:34:01Z|00367|binding|INFO|Claiming lport fa1be576-06d0-4226-be22-23ea6f78268b for this chassis.
Sep 30 21:34:01 compute-1 ovn_controller[94902]: 2025-09-30T21:34:01Z|00368|binding|INFO|fa1be576-06d0-4226-be22-23ea6f78268b: Claiming fa:16:3e:13:db:ef 10.100.0.7
Sep 30 21:34:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:01.982 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:db:ef 10.100.0.7'], port_security=['fa:16:3e:13:db:ef 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6d6350eb-06de-41fb-b4c1-b9eccaf32004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '722921aac42f4b118de61dc2fb90ee28', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a288a139-60af-4cf6-8230-0e1fa75f4662', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=614e46ff-246d-4aff-aaef-57667b07ef26, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=fa1be576-06d0-4226-be22-23ea6f78268b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:01.984 103861 INFO neutron.agent.ovn.metadata.agent [-] Port fa1be576-06d0-4226-be22-23ea6f78268b in datapath 543b1d30-4a8f-440d-912e-5ddc4b10ac21 bound to our chassis
Sep 30 21:34:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:01.985 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 543b1d30-4a8f-440d-912e-5ddc4b10ac21
Sep 30 21:34:01 compute-1 ovn_controller[94902]: 2025-09-30T21:34:01Z|00369|binding|INFO|Setting lport fa1be576-06d0-4226-be22-23ea6f78268b ovn-installed in OVS
Sep 30 21:34:01 compute-1 ovn_controller[94902]: 2025-09-30T21:34:01Z|00370|binding|INFO|Setting lport fa1be576-06d0-4226-be22-23ea6f78268b up in Southbound
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:01 compute-1 nova_compute[192795]: 2025-09-30 21:34:01.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.002 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa10727-0bc9-446d-a31d-3a51937b7266]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.003 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap543b1d30-41 in ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.005 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap543b1d30-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.005 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1759a7d5-8922-4a04-a291-7da1f83d41f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.007 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d748abab-c500-4dc2-aaa4-9ecd1b5ddc1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 systemd-udevd[235899]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:34:02 compute-1 systemd-machined[152783]: New machine qemu-46-instance-0000005f.
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.027 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[224de128-1947-4658-b710-9fe90c09fcc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 NetworkManager[51724]: <info>  [1759268042.0334] device (tapfa1be576-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:34:02 compute-1 NetworkManager[51724]: <info>  [1759268042.0341] device (tapfa1be576-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:34:02 compute-1 systemd[1]: Started Virtual Machine qemu-46-instance-0000005f.
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.060 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8e4252-d163-4909-aba4-ac4703a70882]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.098 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[33fceb92-59b1-4a9e-aa5c-7c65a5fde7d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.105 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5d92d5-11c8-4888-a24d-abecdb63b32a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 NetworkManager[51724]: <info>  [1759268042.1062] manager: (tap543b1d30-40): new Veth device (/org/freedesktop/NetworkManager/Devices/188)
Sep 30 21:34:02 compute-1 systemd-udevd[235902]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.138 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[56ff682f-1ee1-423f-969c-b6eb69f704a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.141 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[005fd6e9-c559-4492-ad82-1b594cf7706e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 NetworkManager[51724]: <info>  [1759268042.1709] device (tap543b1d30-40): carrier: link connected
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.178 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7698b4fb-89d6-47ba-a8cb-f7e5a082fd79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.196 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[62e82198-3f9f-488e-a68b-d1203bfc5196]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap543b1d30-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:ae:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473784, 'reachable_time': 35347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235931, 'error': None, 'target': 'ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.212 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[90e8cb1f-b9ef-483f-a45e-7e57a3ba1b7d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:ae0d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473784, 'tstamp': 473784}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235932, 'error': None, 'target': 'ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.229 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6a775981-1e0d-48c7-a0a0-e2d603ced575]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap543b1d30-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:ae:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473784, 'reachable_time': 35347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235933, 'error': None, 'target': 'ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.268 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0e074095-da2e-4bbf-a7be-faf9d86d3c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.350 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb700b3-a01b-41e2-94b2-e2f1d860e318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.352 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap543b1d30-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.353 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.353 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap543b1d30-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:02 compute-1 kernel: tap543b1d30-40: entered promiscuous mode
Sep 30 21:34:02 compute-1 NetworkManager[51724]: <info>  [1759268042.3565] manager: (tap543b1d30-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.359 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap543b1d30-40, col_values=(('external_ids', {'iface-id': '6d756019-20b0-4357-808a-a3b19d6251b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-1 ovn_controller[94902]: 2025-09-30T21:34:02Z|00371|binding|INFO|Releasing lport 6d756019-20b0-4357-808a-a3b19d6251b6 from this chassis (sb_readonly=0)
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.381 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/543b1d30-4a8f-440d-912e-5ddc4b10ac21.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/543b1d30-4a8f-440d-912e-5ddc4b10ac21.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.383 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c73afc52-a69b-4ec9-9cb1-9befb5e4bc06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.384 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-543b1d30-4a8f-440d-912e-5ddc4b10ac21
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/543b1d30-4a8f-440d-912e-5ddc4b10ac21.pid.haproxy
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 543b1d30-4a8f-440d-912e-5ddc4b10ac21
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:34:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:02.385 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'env', 'PROCESS_TAG=haproxy-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/543b1d30-4a8f-440d-912e-5ddc4b10ac21.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-1 podman[235972]: 2025-09-30 21:34:02.813492171 +0000 UTC m=+0.062054670 container create 4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:34:02 compute-1 systemd[1]: Started libpod-conmon-4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73.scope.
Sep 30 21:34:02 compute-1 podman[235972]: 2025-09-30 21:34:02.784215704 +0000 UTC m=+0.032778223 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.877 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268042.8770504, 6d6350eb-06de-41fb-b4c1-b9eccaf32004 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.879 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] VM Started (Lifecycle Event)
Sep 30 21:34:02 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:34:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0877aebadbedf7dcdb3102a364ededeb0a3d01d8d65cf494bbd4724758fdda6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:34:02 compute-1 podman[235972]: 2025-09-30 21:34:02.902574668 +0000 UTC m=+0.151137197 container init 4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:34:02 compute-1 podman[235972]: 2025-09-30 21:34:02.909180788 +0000 UTC m=+0.157743287 container start 4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.914 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.920 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268042.878756, 6d6350eb-06de-41fb-b4c1-b9eccaf32004 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.920 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] VM Paused (Lifecycle Event)
Sep 30 21:34:02 compute-1 neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21[235988]: [NOTICE]   (235992) : New worker (235994) forked
Sep 30 21:34:02 compute-1 neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21[235988]: [NOTICE]   (235992) : Loading success.
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.944 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:02 compute-1 nova_compute[192795]: 2025-09-30 21:34:02.949 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.003 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.094 2 DEBUG nova.compute.manager [req-82355de9-8854-447d-b70a-dff842aa6879 req-7b1ab2b3-1e99-4b41-8bf1-d243c231cd97 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received event network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.094 2 DEBUG oslo_concurrency.lockutils [req-82355de9-8854-447d-b70a-dff842aa6879 req-7b1ab2b3-1e99-4b41-8bf1-d243c231cd97 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.094 2 DEBUG oslo_concurrency.lockutils [req-82355de9-8854-447d-b70a-dff842aa6879 req-7b1ab2b3-1e99-4b41-8bf1-d243c231cd97 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.095 2 DEBUG oslo_concurrency.lockutils [req-82355de9-8854-447d-b70a-dff842aa6879 req-7b1ab2b3-1e99-4b41-8bf1-d243c231cd97 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.095 2 DEBUG nova.compute.manager [req-82355de9-8854-447d-b70a-dff842aa6879 req-7b1ab2b3-1e99-4b41-8bf1-d243c231cd97 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Processing event network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.095 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.098 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268043.098576, 6d6350eb-06de-41fb-b4c1-b9eccaf32004 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.099 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] VM Resumed (Lifecycle Event)
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.103 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.106 2 INFO nova.virt.libvirt.driver [-] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Instance spawned successfully.
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.106 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.128 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.134 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.138 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.138 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.138 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.139 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.139 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.140 2 DEBUG nova.virt.libvirt.driver [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.176 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.227 2 INFO nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Took 10.24 seconds to spawn the instance on the hypervisor.
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.227 2 DEBUG nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.336 2 INFO nova.compute.manager [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Took 10.89 seconds to build instance.
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.369 2 DEBUG oslo_concurrency.lockutils [None req-bd833d9f-0bbd-45c9-983b-49d1ea5c290b 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.664 2 DEBUG nova.network.neutron [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Updated VIF entry in instance network info cache for port fa1be576-06d0-4226-be22-23ea6f78268b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.665 2 DEBUG nova.network.neutron [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Updating instance_info_cache with network_info: [{"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.688 2 DEBUG oslo_concurrency.lockutils [req-fb7a95e2-cc2c-4756-a3aa-ee4c40f20e66 req-26147bad-4cff-4197-9e88-18285e43e4d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-6d6350eb-06de-41fb-b4c1-b9eccaf32004" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:03 compute-1 nova_compute[192795]: 2025-09-30 21:34:03.716 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:05 compute-1 nova_compute[192795]: 2025-09-30 21:34:05.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:05 compute-1 nova_compute[192795]: 2025-09-30 21:34:05.305 2 DEBUG nova.compute.manager [req-27907526-1375-465e-a47e-8387f1e3458f req-de36a8c5-f456-47e6-a665-cce774c7d716 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received event network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:05 compute-1 nova_compute[192795]: 2025-09-30 21:34:05.306 2 DEBUG oslo_concurrency.lockutils [req-27907526-1375-465e-a47e-8387f1e3458f req-de36a8c5-f456-47e6-a665-cce774c7d716 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:05 compute-1 nova_compute[192795]: 2025-09-30 21:34:05.306 2 DEBUG oslo_concurrency.lockutils [req-27907526-1375-465e-a47e-8387f1e3458f req-de36a8c5-f456-47e6-a665-cce774c7d716 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:05 compute-1 nova_compute[192795]: 2025-09-30 21:34:05.306 2 DEBUG oslo_concurrency.lockutils [req-27907526-1375-465e-a47e-8387f1e3458f req-de36a8c5-f456-47e6-a665-cce774c7d716 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:05 compute-1 nova_compute[192795]: 2025-09-30 21:34:05.306 2 DEBUG nova.compute.manager [req-27907526-1375-465e-a47e-8387f1e3458f req-de36a8c5-f456-47e6-a665-cce774c7d716 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] No waiting events found dispatching network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:05 compute-1 nova_compute[192795]: 2025-09-30 21:34:05.307 2 WARNING nova.compute.manager [req-27907526-1375-465e-a47e-8387f1e3458f req-de36a8c5-f456-47e6-a665-cce774c7d716 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received unexpected event network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b for instance with vm_state active and task_state None.
Sep 30 21:34:07 compute-1 nova_compute[192795]: 2025-09-30 21:34:07.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:08 compute-1 podman[236003]: 2025-09-30 21:34:08.223879741 +0000 UTC m=+0.063347616 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid)
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.326 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.327 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.328 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.328 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.328 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.344 2 INFO nova.compute.manager [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Terminating instance
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.428 2 DEBUG nova.compute.manager [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:34:09 compute-1 kernel: tapfa1be576-06 (unregistering): left promiscuous mode
Sep 30 21:34:09 compute-1 NetworkManager[51724]: <info>  [1759268049.4534] device (tapfa1be576-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:34:09 compute-1 ovn_controller[94902]: 2025-09-30T21:34:09Z|00372|binding|INFO|Releasing lport fa1be576-06d0-4226-be22-23ea6f78268b from this chassis (sb_readonly=0)
Sep 30 21:34:09 compute-1 ovn_controller[94902]: 2025-09-30T21:34:09Z|00373|binding|INFO|Setting lport fa1be576-06d0-4226-be22-23ea6f78268b down in Southbound
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 ovn_controller[94902]: 2025-09-30T21:34:09Z|00374|binding|INFO|Removing iface tapfa1be576-06 ovn-installed in OVS
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.510 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:db:ef 10.100.0.7'], port_security=['fa:16:3e:13:db:ef 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6d6350eb-06de-41fb-b4c1-b9eccaf32004', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '722921aac42f4b118de61dc2fb90ee28', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a288a139-60af-4cf6-8230-0e1fa75f4662', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=614e46ff-246d-4aff-aaef-57667b07ef26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=fa1be576-06d0-4226-be22-23ea6f78268b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.511 103861 INFO neutron.agent.ovn.metadata.agent [-] Port fa1be576-06d0-4226-be22-23ea6f78268b in datapath 543b1d30-4a8f-440d-912e-5ddc4b10ac21 unbound from our chassis
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.513 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 543b1d30-4a8f-440d-912e-5ddc4b10ac21, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.514 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[00169ca7-37b3-4ac9-bb2d-5f28a09bf8c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.515 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21 namespace which is not needed anymore
Sep 30 21:34:09 compute-1 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Sep 30 21:34:09 compute-1 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005f.scope: Consumed 7.183s CPU time.
Sep 30 21:34:09 compute-1 systemd-machined[152783]: Machine qemu-46-instance-0000005f terminated.
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21[235988]: [NOTICE]   (235992) : haproxy version is 2.8.14-c23fe91
Sep 30 21:34:09 compute-1 neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21[235988]: [NOTICE]   (235992) : path to executable is /usr/sbin/haproxy
Sep 30 21:34:09 compute-1 neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21[235988]: [WARNING]  (235992) : Exiting Master process...
Sep 30 21:34:09 compute-1 neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21[235988]: [ALERT]    (235992) : Current worker (235994) exited with code 143 (Terminated)
Sep 30 21:34:09 compute-1 neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21[235988]: [WARNING]  (235992) : All workers exited. Exiting... (0)
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 systemd[1]: libpod-4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73.scope: Deactivated successfully.
Sep 30 21:34:09 compute-1 podman[236050]: 2025-09-30 21:34:09.671592105 +0000 UTC m=+0.054147517 container died 4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:34:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73-userdata-shm.mount: Deactivated successfully.
Sep 30 21:34:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-e0877aebadbedf7dcdb3102a364ededeb0a3d01d8d65cf494bbd4724758fdda6-merged.mount: Deactivated successfully.
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.716 2 INFO nova.virt.libvirt.driver [-] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Instance destroyed successfully.
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.717 2 DEBUG nova.objects.instance [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lazy-loading 'resources' on Instance uuid 6d6350eb-06de-41fb-b4c1-b9eccaf32004 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:09 compute-1 podman[236050]: 2025-09-30 21:34:09.721248328 +0000 UTC m=+0.103803730 container cleanup 4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.730 2 DEBUG nova.virt.libvirt.vif [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:33:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1642627434',display_name='tempest-ServerMetadataNegativeTestJSON-server-1642627434',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1642627434',id=95,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:34:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='722921aac42f4b118de61dc2fb90ee28',ramdisk_id='',reservation_id='r-n7ldgz83',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1687512189',owner_user_name='tempest-ServerMetadataNegativeTestJSON-1687512189-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:34:03Z,user_data=None,user_id='0c2095460ad548128b184eef310a91cd',uuid=6d6350eb-06de-41fb-b4c1-b9eccaf32004,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.731 2 DEBUG nova.network.os_vif_util [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Converting VIF {"id": "fa1be576-06d0-4226-be22-23ea6f78268b", "address": "fa:16:3e:13:db:ef", "network": {"id": "543b1d30-4a8f-440d-912e-5ddc4b10ac21", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1931634106-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722921aac42f4b118de61dc2fb90ee28", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1be576-06", "ovs_interfaceid": "fa1be576-06d0-4226-be22-23ea6f78268b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.731 2 DEBUG nova.network.os_vif_util [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:db:ef,bridge_name='br-int',has_traffic_filtering=True,id=fa1be576-06d0-4226-be22-23ea6f78268b,network=Network(543b1d30-4a8f-440d-912e-5ddc4b10ac21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1be576-06') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.732 2 DEBUG os_vif [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:db:ef,bridge_name='br-int',has_traffic_filtering=True,id=fa1be576-06d0-4226-be22-23ea6f78268b,network=Network(543b1d30-4a8f-440d-912e-5ddc4b10ac21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1be576-06') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:34:09 compute-1 systemd[1]: libpod-conmon-4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73.scope: Deactivated successfully.
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.734 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa1be576-06, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.743 2 INFO os_vif [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:db:ef,bridge_name='br-int',has_traffic_filtering=True,id=fa1be576-06d0-4226-be22-23ea6f78268b,network=Network(543b1d30-4a8f-440d-912e-5ddc4b10ac21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1be576-06')
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.744 2 INFO nova.virt.libvirt.driver [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Deleting instance files /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004_del
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.745 2 INFO nova.virt.libvirt.driver [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Deletion of /var/lib/nova/instances/6d6350eb-06de-41fb-b4c1-b9eccaf32004_del complete
Sep 30 21:34:09 compute-1 podman[236093]: 2025-09-30 21:34:09.797054072 +0000 UTC m=+0.047973608 container remove 4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.804 2 INFO nova.compute.manager [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.805 2 DEBUG oslo.service.loopingcall [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.805 2 DEBUG nova.compute.manager [-] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.805 2 DEBUG nova.network.neutron [-] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.805 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[625a1993-3153-4dab-857d-90381259be81]: (4, ('Tue Sep 30 09:34:09 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21 (4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73)\n4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73\nTue Sep 30 09:34:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21 (4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73)\n4daa7649ce91043ec6388874498ea8697edea3f5f7d3025f978b8e572c4cab73\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.807 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3e3f5d-8139-478f-be0d-495bef4b077d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.808 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap543b1d30-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 kernel: tap543b1d30-40: left promiscuous mode
Sep 30 21:34:09 compute-1 nova_compute[192795]: 2025-09-30 21:34:09.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.829 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[54265353-362c-4946-b541-34f06719d50d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.865 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc6a6b1-645e-4eb0-a104-7247fc2ec95d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.868 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[53dcb3c4-c284-4e3e-87a4-4fb26db8bce2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.884 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d392a1d8-63b8-4cc7-a959-ea3a306b378a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473776, 'reachable_time': 36729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236108, 'error': None, 'target': 'ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.887 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-543b1d30-4a8f-440d-912e-5ddc4b10ac21 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:34:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:09.887 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[c554bfc2-d195-41d9-b6a0-ffb932141ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:09 compute-1 systemd[1]: run-netns-ovnmeta\x2d543b1d30\x2d4a8f\x2d440d\x2d912e\x2d5ddc4b10ac21.mount: Deactivated successfully.
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.105 2 DEBUG nova.compute.manager [req-d6443811-6c77-4da5-b5f9-d3ef4556f69f req-cd8c3b63-2e82-4785-bae4-3f26712a3c69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received event network-vif-unplugged-fa1be576-06d0-4226-be22-23ea6f78268b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.107 2 DEBUG oslo_concurrency.lockutils [req-d6443811-6c77-4da5-b5f9-d3ef4556f69f req-cd8c3b63-2e82-4785-bae4-3f26712a3c69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.107 2 DEBUG oslo_concurrency.lockutils [req-d6443811-6c77-4da5-b5f9-d3ef4556f69f req-cd8c3b63-2e82-4785-bae4-3f26712a3c69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.107 2 DEBUG oslo_concurrency.lockutils [req-d6443811-6c77-4da5-b5f9-d3ef4556f69f req-cd8c3b63-2e82-4785-bae4-3f26712a3c69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.108 2 DEBUG nova.compute.manager [req-d6443811-6c77-4da5-b5f9-d3ef4556f69f req-cd8c3b63-2e82-4785-bae4-3f26712a3c69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] No waiting events found dispatching network-vif-unplugged-fa1be576-06d0-4226-be22-23ea6f78268b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.108 2 DEBUG nova.compute.manager [req-d6443811-6c77-4da5-b5f9-d3ef4556f69f req-cd8c3b63-2e82-4785-bae4-3f26712a3c69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received event network-vif-unplugged-fa1be576-06d0-4226-be22-23ea6f78268b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.908 2 DEBUG nova.network.neutron [-] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:10 compute-1 nova_compute[192795]: 2025-09-30 21:34:10.941 2 INFO nova.compute.manager [-] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Took 1.14 seconds to deallocate network for instance.
Sep 30 21:34:11 compute-1 nova_compute[192795]: 2025-09-30 21:34:11.090 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:11 compute-1 nova_compute[192795]: 2025-09-30 21:34:11.091 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:11 compute-1 nova_compute[192795]: 2025-09-30 21:34:11.169 2 DEBUG nova.compute.provider_tree [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:11 compute-1 nova_compute[192795]: 2025-09-30 21:34:11.410 2 DEBUG nova.scheduler.client.report [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:11 compute-1 nova_compute[192795]: 2025-09-30 21:34:11.441 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:11 compute-1 nova_compute[192795]: 2025-09-30 21:34:11.473 2 INFO nova.scheduler.client.report [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Deleted allocations for instance 6d6350eb-06de-41fb-b4c1-b9eccaf32004
Sep 30 21:34:11 compute-1 nova_compute[192795]: 2025-09-30 21:34:11.686 2 DEBUG oslo_concurrency.lockutils [None req-34117de3-edb3-4196-bb78-1376228a450a 0c2095460ad548128b184eef310a91cd 722921aac42f4b118de61dc2fb90ee28 - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.224 2 DEBUG nova.compute.manager [req-bd885845-7473-4008-8d11-88574c525dee req-a67e8744-ef10-41c2-876c-4efc7acd3db8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received event network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.225 2 DEBUG oslo_concurrency.lockutils [req-bd885845-7473-4008-8d11-88574c525dee req-a67e8744-ef10-41c2-876c-4efc7acd3db8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.225 2 DEBUG oslo_concurrency.lockutils [req-bd885845-7473-4008-8d11-88574c525dee req-a67e8744-ef10-41c2-876c-4efc7acd3db8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.226 2 DEBUG oslo_concurrency.lockutils [req-bd885845-7473-4008-8d11-88574c525dee req-a67e8744-ef10-41c2-876c-4efc7acd3db8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6d6350eb-06de-41fb-b4c1-b9eccaf32004-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.226 2 DEBUG nova.compute.manager [req-bd885845-7473-4008-8d11-88574c525dee req-a67e8744-ef10-41c2-876c-4efc7acd3db8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] No waiting events found dispatching network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.226 2 WARNING nova.compute.manager [req-bd885845-7473-4008-8d11-88574c525dee req-a67e8744-ef10-41c2-876c-4efc7acd3db8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received unexpected event network-vif-plugged-fa1be576-06d0-4226-be22-23ea6f78268b for instance with vm_state deleted and task_state None.
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.227 2 DEBUG nova.compute.manager [req-bd885845-7473-4008-8d11-88574c525dee req-a67e8744-ef10-41c2-876c-4efc7acd3db8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Received event network-vif-deleted-fa1be576-06d0-4226-be22-23ea6f78268b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:12 compute-1 nova_compute[192795]: 2025-09-30 21:34:12.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:13 compute-1 sshd-session[236109]: Invalid user ps from 167.71.248.239 port 43252
Sep 30 21:34:13 compute-1 sshd-session[236109]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:34:13 compute-1 sshd-session[236109]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 21:34:14 compute-1 nova_compute[192795]: 2025-09-30 21:34:14.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:15 compute-1 sshd-session[236109]: Failed password for invalid user ps from 167.71.248.239 port 43252 ssh2
Sep 30 21:34:16 compute-1 podman[236111]: 2025-09-30 21:34:16.243202766 +0000 UTC m=+0.086243510 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250923)
Sep 30 21:34:16 compute-1 podman[236113]: 2025-09-30 21:34:16.249539548 +0000 UTC m=+0.083184846 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:34:16 compute-1 podman[236112]: 2025-09-30 21:34:16.279047662 +0000 UTC m=+0.119217368 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:34:16 compute-1 nova_compute[192795]: 2025-09-30 21:34:16.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:16 compute-1 nova_compute[192795]: 2025-09-30 21:34:16.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:17 compute-1 sshd-session[236109]: Connection closed by invalid user ps 167.71.248.239 port 43252 [preauth]
Sep 30 21:34:17 compute-1 nova_compute[192795]: 2025-09-30 21:34:17.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:19 compute-1 nova_compute[192795]: 2025-09-30 21:34:19.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.297 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.298 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.322 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.429 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.429 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.438 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.438 2 INFO nova.compute.claims [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.564 2 DEBUG nova.compute.provider_tree [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.576 2 DEBUG nova.scheduler.client.report [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.596 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.597 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.650 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.650 2 DEBUG nova.network.neutron [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.670 2 INFO nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.689 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.812 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.814 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.815 2 INFO nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Creating image(s)
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.816 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "/var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.816 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.817 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.834 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.909 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.911 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.913 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.937 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:20 compute-1 nova_compute[192795]: 2025-09-30 21:34:20.967 2 DEBUG nova.policy [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.008 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.009 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.053 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.055 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.056 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.118 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.120 2 DEBUG nova.virt.disk.api [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Checking if we can resize image /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.120 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.211 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.212 2 DEBUG nova.virt.disk.api [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Cannot resize image /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.212 2 DEBUG nova.objects.instance [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'migration_context' on Instance uuid 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.229 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.229 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Ensure instance console log exists: /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.230 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.230 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:21 compute-1 nova_compute[192795]: 2025-09-30 21:34:21.230 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:22 compute-1 nova_compute[192795]: 2025-09-30 21:34:22.013 2 DEBUG nova.network.neutron [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Successfully created port: f9b8dac1-460b-4686-b9ae-e2c1744259fb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:34:22 compute-1 podman[236193]: 2025-09-30 21:34:22.213385795 +0000 UTC m=+0.060478309 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:34:22 compute-1 nova_compute[192795]: 2025-09-30 21:34:22.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:23.915 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:23.916 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:34:23 compute-1 nova_compute[192795]: 2025-09-30 21:34:23.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-1 nova_compute[192795]: 2025-09-30 21:34:23.942 2 DEBUG nova.network.neutron [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Successfully updated port: f9b8dac1-460b-4686-b9ae-e2c1744259fb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:34:23 compute-1 nova_compute[192795]: 2025-09-30 21:34:23.954 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:23 compute-1 nova_compute[192795]: 2025-09-30 21:34:23.955 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:23 compute-1 nova_compute[192795]: 2025-09-30 21:34:23.955 2 DEBUG nova.network.neutron [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.538 2 DEBUG nova.compute.manager [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received event network-changed-f9b8dac1-460b-4686-b9ae-e2c1744259fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.539 2 DEBUG nova.compute.manager [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Refreshing instance network info cache due to event network-changed-f9b8dac1-460b-4686-b9ae-e2c1744259fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.539 2 DEBUG oslo_concurrency.lockutils [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.713 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268049.7118516, 6d6350eb-06de-41fb-b4c1-b9eccaf32004 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.713 2 INFO nova.compute.manager [-] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] VM Stopped (Lifecycle Event)
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.728 2 DEBUG nova.compute.manager [None req-fd8cfe45-f086-4ca1-9dd4-83d6e7d736d1 - - - - - -] [instance: 6d6350eb-06de-41fb-b4c1-b9eccaf32004] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:24 compute-1 nova_compute[192795]: 2025-09-30 21:34:24.926 2 DEBUG nova.network.neutron [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:34:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:25.917 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.922 2 DEBUG nova.network.neutron [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Updating instance_info_cache with network_info: [{"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.943 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.944 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Instance network_info: |[{"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.944 2 DEBUG oslo_concurrency.lockutils [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.945 2 DEBUG nova.network.neutron [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Refreshing network info cache for port f9b8dac1-460b-4686-b9ae-e2c1744259fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.947 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Start _get_guest_xml network_info=[{"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.951 2 WARNING nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.958 2 DEBUG nova.virt.libvirt.host [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.959 2 DEBUG nova.virt.libvirt.host [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.962 2 DEBUG nova.virt.libvirt.host [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.962 2 DEBUG nova.virt.libvirt.host [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.963 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.963 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.964 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.964 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.964 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.964 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.964 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.965 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.965 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.965 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.965 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.965 2 DEBUG nova.virt.hardware [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.968 2 DEBUG nova.virt.libvirt.vif [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:34:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-238931323',display_name='tempest-ServerActionsTestOtherB-server-238931323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-238931323',id=97,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-nn940p3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:34:20Z,user_data=None,user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.969 2 DEBUG nova.network.os_vif_util [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.969 2 DEBUG nova.network.os_vif_util [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:0a:17,bridge_name='br-int',has_traffic_filtering=True,id=f9b8dac1-460b-4686-b9ae-e2c1744259fb,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b8dac1-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.970 2 DEBUG nova.objects.instance [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'pci_devices' on Instance uuid 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.992 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <uuid>25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e</uuid>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <name>instance-00000061</name>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerActionsTestOtherB-server-238931323</nova:name>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:34:26</nova:creationTime>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:user uuid="b9b3e9f2523944539f57a1ff5d565cb4">tempest-ServerActionsTestOtherB-463525410-project-member</nova:user>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:project uuid="d876c85b6ca5418eb657e48391a6503b">tempest-ServerActionsTestOtherB-463525410</nova:project>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         <nova:port uuid="f9b8dac1-460b-4686-b9ae-e2c1744259fb">
Sep 30 21:34:26 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <system>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <entry name="serial">25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e</entry>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <entry name="uuid">25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e</entry>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </system>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <os>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   </os>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <features>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   </features>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk.config"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:f2:0a:17"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <target dev="tapf9b8dac1-46"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/console.log" append="off"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <video>
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </video>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:34:26 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:34:26 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:34:26 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:34:26 compute-1 nova_compute[192795]: </domain>
Sep 30 21:34:26 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.993 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Preparing to wait for external event network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.993 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.993 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.994 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.994 2 DEBUG nova.virt.libvirt.vif [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:34:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-238931323',display_name='tempest-ServerActionsTestOtherB-server-238931323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-238931323',id=97,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-nn940p3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActions
TestOtherB-463525410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:34:20Z,user_data=None,user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.995 2 DEBUG nova.network.os_vif_util [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.995 2 DEBUG nova.network.os_vif_util [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:0a:17,bridge_name='br-int',has_traffic_filtering=True,id=f9b8dac1-460b-4686-b9ae-e2c1744259fb,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b8dac1-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.995 2 DEBUG os_vif [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:0a:17,bridge_name='br-int',has_traffic_filtering=True,id=f9b8dac1-460b-4686-b9ae-e2c1744259fb,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b8dac1-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.996 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:26 compute-1 nova_compute[192795]: 2025-09-30 21:34:26.996 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.000 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9b8dac1-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.000 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf9b8dac1-46, col_values=(('external_ids', {'iface-id': 'f9b8dac1-460b-4686-b9ae-e2c1744259fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:0a:17', 'vm-uuid': '25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.0035] manager: (tapf9b8dac1-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.010 2 INFO os_vif [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:0a:17,bridge_name='br-int',has_traffic_filtering=True,id=f9b8dac1-460b-4686-b9ae-e2c1744259fb,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b8dac1-46')
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.066 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.066 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.066 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No VIF found with MAC fa:16:3e:f2:0a:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.067 2 INFO nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Using config drive
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.517 2 INFO nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Creating config drive at /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk.config
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.522 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpantc3256 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.651 2 DEBUG oslo_concurrency.processutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpantc3256" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:27 compute-1 kernel: tapf9b8dac1-46: entered promiscuous mode
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.7386] manager: (tapf9b8dac1-46): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 ovn_controller[94902]: 2025-09-30T21:34:27Z|00375|binding|INFO|Claiming lport f9b8dac1-460b-4686-b9ae-e2c1744259fb for this chassis.
Sep 30 21:34:27 compute-1 ovn_controller[94902]: 2025-09-30T21:34:27Z|00376|binding|INFO|f9b8dac1-460b-4686-b9ae-e2c1744259fb: Claiming fa:16:3e:f2:0a:17 10.100.0.3
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.7685] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.7698] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.772 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:0a:17 10.100.0.3'], port_security=['fa:16:3e:f2:0a:17 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '59aeae25-c90e-4b40-9e86-3ac03fe94073', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f9b8dac1-460b-4686-b9ae-e2c1744259fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.773 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f9b8dac1-460b-4686-b9ae-e2c1744259fb in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 bound to our chassis
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.775 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:34:27 compute-1 systemd-udevd[236232]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.789 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d4d108-2f57-4425-bca4-c38ba37bad10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.790 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap91c84c55-91 in ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.7918] device (tapf9b8dac1-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.7931] device (tapf9b8dac1-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.793 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap91c84c55-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.794 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[44207fb6-bf04-4881-be03-ecbc80101fda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.795 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2dea7f-ba04-4218-a148-877c01b29b3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 systemd-machined[152783]: New machine qemu-47-instance-00000061.
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.810 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e4d36c-f0f6-453d-9758-edcfa5a74b29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 systemd[1]: Started Virtual Machine qemu-47-instance-00000061.
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.846 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[18655ba6-b170-47f9-bb5d-12028e2b4254]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.876 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a880e9-3cd0-41d8-ad43-1c2adc9266f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.9001] manager: (tap91c84c55-90): new Veth device (/org/freedesktop/NetworkManager/Devices/194)
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.899 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[33d53335-cdce-487e-976d-464dad8f2675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 systemd-udevd[236235]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.934 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[cff1328e-bd4e-430c-ad87-37d904b5daf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.939 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[29d2b2b2-1f85-4ecb-9e90-4fe7dac2dffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 NetworkManager[51724]: <info>  [1759268067.9673] device (tap91c84c55-90): carrier: link connected
Sep 30 21:34:27 compute-1 ovn_controller[94902]: 2025-09-30T21:34:27Z|00377|binding|INFO|Setting lport f9b8dac1-460b-4686-b9ae-e2c1744259fb ovn-installed in OVS
Sep 30 21:34:27 compute-1 ovn_controller[94902]: 2025-09-30T21:34:27Z|00378|binding|INFO|Setting lport f9b8dac1-460b-4686-b9ae-e2c1744259fb up in Southbound
Sep 30 21:34:27 compute-1 nova_compute[192795]: 2025-09-30 21:34:27.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.972 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f9908998-c9bd-4ca9-93d5-4f7a7f821b95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:27.993 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[822805a9-3cb9-4445-9a33-0537e0028fcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476364, 'reachable_time': 27714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236266, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.015 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6e664541-bcea-49fc-946e-5890a0a02518]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:a7ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 476364, 'tstamp': 476364}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236267, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.042 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc4deda-4a8b-4d92-9782-c1d06d10f2b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476364, 'reachable_time': 27714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236268, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.084 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c24c47cd-e953-4d0a-9ee7-ae853e7f8676]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.162 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7d393ce8-047b-43cb-9750-d26d7b0d5079]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.164 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.165 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.166 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91c84c55-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:28 compute-1 NetworkManager[51724]: <info>  [1759268068.1969] manager: (tap91c84c55-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Sep 30 21:34:28 compute-1 kernel: tap91c84c55-90: entered promiscuous mode
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.200 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap91c84c55-90, col_values=(('external_ids', {'iface-id': '3996e682-c20c-41c5-9547-9688a18f316c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:28 compute-1 ovn_controller[94902]: 2025-09-30T21:34:28Z|00379|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.219 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.220 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a7537fd2-efbd-4153-b3e5-85598bca7649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.221 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:34:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:28.221 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'env', 'PROCESS_TAG=haproxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/91c84c55-96ab-4682-a6e7-9e96514ca8a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:34:28 compute-1 podman[236306]: 2025-09-30 21:34:28.566372869 +0000 UTC m=+0.053670103 container create 3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 21:34:28 compute-1 systemd[1]: Started libpod-conmon-3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49.scope.
Sep 30 21:34:28 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:34:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cd38d604733bd3d66ce2e3e6b5e27ac4e0ebfc02db82e667154ca3e00dd0b3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:34:28 compute-1 podman[236306]: 2025-09-30 21:34:28.538688644 +0000 UTC m=+0.025985898 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:34:28 compute-1 podman[236306]: 2025-09-30 21:34:28.640683683 +0000 UTC m=+0.127980987 container init 3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:34:28 compute-1 podman[236306]: 2025-09-30 21:34:28.647424586 +0000 UTC m=+0.134721860 container start 3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:34:28 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[236325]: [NOTICE]   (236365) : New worker (236379) forked
Sep 30 21:34:28 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[236325]: [NOTICE]   (236365) : Loading success.
Sep 30 21:34:28 compute-1 podman[236319]: 2025-09-30 21:34:28.673216098 +0000 UTC m=+0.073123642 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as 
a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:34:28 compute-1 podman[236322]: 2025-09-30 21:34:28.683229631 +0000 UTC m=+0.081115540 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.695 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268068.6946094, 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.696 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] VM Started (Lifecycle Event)
Sep 30 21:34:28 compute-1 podman[236323]: 2025-09-30 21:34:28.712797157 +0000 UTC m=+0.105965807 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.720 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.725 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268068.6950912, 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.725 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] VM Paused (Lifecycle Event)
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.746 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.750 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:28 compute-1 nova_compute[192795]: 2025-09-30 21:34:28.794 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.842 2 DEBUG nova.compute.manager [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received event network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.843 2 DEBUG oslo_concurrency.lockutils [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.844 2 DEBUG oslo_concurrency.lockutils [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.844 2 DEBUG oslo_concurrency.lockutils [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.845 2 DEBUG nova.compute.manager [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Processing event network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.845 2 DEBUG nova.compute.manager [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received event network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.846 2 DEBUG oslo_concurrency.lockutils [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.846 2 DEBUG oslo_concurrency.lockutils [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.847 2 DEBUG oslo_concurrency.lockutils [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.847 2 DEBUG nova.compute.manager [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] No waiting events found dispatching network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.848 2 WARNING nova.compute.manager [req-3e16abd1-8a06-4504-a970-175928f627b3 req-3df301ef-a3f5-47d2-a3b4-0574e7eeb7a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received unexpected event network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb for instance with vm_state building and task_state spawning.
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.849 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.853 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268069.8526149, 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.853 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] VM Resumed (Lifecycle Event)
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.856 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.861 2 INFO nova.virt.libvirt.driver [-] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Instance spawned successfully.
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.861 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.883 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.893 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.898 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.898 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.899 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.899 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.899 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.900 2 DEBUG nova.virt.libvirt.driver [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.929 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.988 2 INFO nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Took 9.17 seconds to spawn the instance on the hypervisor.
Sep 30 21:34:29 compute-1 nova_compute[192795]: 2025-09-30 21:34:29.989 2 DEBUG nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:30 compute-1 nova_compute[192795]: 2025-09-30 21:34:30.094 2 INFO nova.compute.manager [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Took 9.70 seconds to build instance.
Sep 30 21:34:30 compute-1 nova_compute[192795]: 2025-09-30 21:34:30.179 2 DEBUG oslo_concurrency.lockutils [None req-dde72351-16c6-4420-b418-3f21c366b846 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:30 compute-1 nova_compute[192795]: 2025-09-30 21:34:30.279 2 DEBUG nova.network.neutron [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Updated VIF entry in instance network info cache for port f9b8dac1-460b-4686-b9ae-e2c1744259fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:34:30 compute-1 nova_compute[192795]: 2025-09-30 21:34:30.280 2 DEBUG nova.network.neutron [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Updating instance_info_cache with network_info: [{"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:30 compute-1 nova_compute[192795]: 2025-09-30 21:34:30.312 2 DEBUG oslo_concurrency.lockutils [req-e2f7330c-11a4-4469-bf2c-c40ea7f0a898 req-82294858-2b63-44ba-836f-7d462214399e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.413 2 INFO nova.compute.manager [None req-16707d00-9547-421c-abfc-22715160fc4c b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Pausing
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.414 2 DEBUG nova.objects.instance [None req-16707d00-9547-421c-abfc-22715160fc4c b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'flavor' on Instance uuid 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.448 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268072.4479618, 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.448 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] VM Paused (Lifecycle Event)
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.450 2 DEBUG nova.compute.manager [None req-16707d00-9547-421c-abfc-22715160fc4c b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.471 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.475 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.497 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] During sync_power_state the instance has a pending task (pausing). Skip.
Sep 30 21:34:32 compute-1 nova_compute[192795]: 2025-09-30 21:34:32.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:34 compute-1 ovn_controller[94902]: 2025-09-30T21:34:34Z|00380|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:34:35 compute-1 nova_compute[192795]: 2025-09-30 21:34:35.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:35 compute-1 nova_compute[192795]: 2025-09-30 21:34:35.808 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:35 compute-1 nova_compute[192795]: 2025-09-30 21:34:35.809 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:35 compute-1 nova_compute[192795]: 2025-09-30 21:34:35.810 2 INFO nova.compute.manager [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Shelving
Sep 30 21:34:35 compute-1 kernel: tapf9b8dac1-46 (unregistering): left promiscuous mode
Sep 30 21:34:35 compute-1 NetworkManager[51724]: <info>  [1759268075.9266] device (tapf9b8dac1-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:34:36 compute-1 ovn_controller[94902]: 2025-09-30T21:34:36Z|00381|binding|INFO|Releasing lport f9b8dac1-460b-4686-b9ae-e2c1744259fb from this chassis (sb_readonly=0)
Sep 30 21:34:36 compute-1 ovn_controller[94902]: 2025-09-30T21:34:36Z|00382|binding|INFO|Setting lport f9b8dac1-460b-4686-b9ae-e2c1744259fb down in Southbound
Sep 30 21:34:36 compute-1 ovn_controller[94902]: 2025-09-30T21:34:36Z|00383|binding|INFO|Removing iface tapf9b8dac1-46 ovn-installed in OVS
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.024 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:0a:17 10.100.0.3'], port_security=['fa:16:3e:f2:0a:17 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '59aeae25-c90e-4b40-9e86-3ac03fe94073', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f9b8dac1-460b-4686-b9ae-e2c1744259fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.028 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f9b8dac1-460b-4686-b9ae-e2c1744259fb in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 unbound from our chassis
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.032 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.035 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9978a9-f02a-49f7-8c31-18616ce20687]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.037 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace which is not needed anymore
Sep 30 21:34:36 compute-1 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000061.scope: Deactivated successfully.
Sep 30 21:34:36 compute-1 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000061.scope: Consumed 3.257s CPU time.
Sep 30 21:34:36 compute-1 systemd-machined[152783]: Machine qemu-47-instance-00000061 terminated.
Sep 30 21:34:36 compute-1 kernel: tapf9b8dac1-46: entered promiscuous mode
Sep 30 21:34:36 compute-1 NetworkManager[51724]: <info>  [1759268076.1224] manager: (tapf9b8dac1-46): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Sep 30 21:34:36 compute-1 kernel: tapf9b8dac1-46 (unregistering): left promiscuous mode
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.182 2 INFO nova.virt.libvirt.driver [-] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Instance destroyed successfully.
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.183 2 DEBUG nova.objects.instance [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'numa_topology' on Instance uuid 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:36 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[236325]: [NOTICE]   (236365) : haproxy version is 2.8.14-c23fe91
Sep 30 21:34:36 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[236325]: [NOTICE]   (236365) : path to executable is /usr/sbin/haproxy
Sep 30 21:34:36 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[236325]: [WARNING]  (236365) : Exiting Master process...
Sep 30 21:34:36 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[236325]: [ALERT]    (236365) : Current worker (236379) exited with code 143 (Terminated)
Sep 30 21:34:36 compute-1 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[236325]: [WARNING]  (236365) : All workers exited. Exiting... (0)
Sep 30 21:34:36 compute-1 systemd[1]: libpod-3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49.scope: Deactivated successfully.
Sep 30 21:34:36 compute-1 podman[236423]: 2025-09-30 21:34:36.211905831 +0000 UTC m=+0.052042099 container died 3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:34:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49-userdata-shm.mount: Deactivated successfully.
Sep 30 21:34:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-6cd38d604733bd3d66ce2e3e6b5e27ac4e0ebfc02db82e667154ca3e00dd0b3a-merged.mount: Deactivated successfully.
Sep 30 21:34:36 compute-1 podman[236423]: 2025-09-30 21:34:36.256494745 +0000 UTC m=+0.096630983 container cleanup 3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:34:36 compute-1 systemd[1]: libpod-conmon-3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49.scope: Deactivated successfully.
Sep 30 21:34:36 compute-1 podman[236456]: 2025-09-30 21:34:36.326360668 +0000 UTC m=+0.044467141 container remove 3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.332 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a111e9-e195-4632-9460-17214b1a4252]: (4, ('Tue Sep 30 09:34:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49)\n3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49\nTue Sep 30 09:34:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49)\n3b90ff6549a01495065d8d78ed53ccff254e69e7302aefb9cfa991108cf25f49\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.334 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed33de9-b9a4-4c3a-bee0-7ed9f9a26cbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.336 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:36 compute-1 kernel: tap91c84c55-90: left promiscuous mode
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.364 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3d73ad20-19c7-4113-8fb2-3ba746279b29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.404 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[53317ee3-1f72-498d-a710-3deef5741b97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.406 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e7977389-26be-4e5c-b4e8-f6e42f186f2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.437 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e23bac21-ad14-40c9-8f23-23b796ee2d39]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476354, 'reachable_time': 44379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236474, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.441 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:34:36 compute-1 systemd[1]: run-netns-ovnmeta\x2d91c84c55\x2d96ab\x2d4682\x2da6e7\x2d9e96514ca8a5.mount: Deactivated successfully.
Sep 30 21:34:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:36.441 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[1362d8ea-bdab-432d-9927-f96a225fe2a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:36 compute-1 nova_compute[192795]: 2025-09-30 21:34:36.867 2 INFO nova.virt.libvirt.driver [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Beginning cold snapshot process
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.084 2 DEBUG nova.privsep.utils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.085 2 DEBUG oslo_concurrency.processutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk /var/lib/nova/instances/snapshots/tmp7vm0yy_u/cd1db3cf6abd45c5a304ccc588c90cbf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.237 2 DEBUG nova.compute.manager [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received event network-vif-unplugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.238 2 DEBUG oslo_concurrency.lockutils [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.238 2 DEBUG oslo_concurrency.lockutils [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.238 2 DEBUG oslo_concurrency.lockutils [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.238 2 DEBUG nova.compute.manager [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] No waiting events found dispatching network-vif-unplugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.239 2 WARNING nova.compute.manager [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received unexpected event network-vif-unplugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb for instance with vm_state paused and task_state shelving_image_pending_upload.
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.239 2 DEBUG nova.compute.manager [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received event network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.239 2 DEBUG oslo_concurrency.lockutils [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.239 2 DEBUG oslo_concurrency.lockutils [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.240 2 DEBUG oslo_concurrency.lockutils [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.240 2 DEBUG nova.compute.manager [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] No waiting events found dispatching network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.240 2 WARNING nova.compute.manager [req-e892f03a-03f7-4fee-a109-e50cc19efe1d req-2f783383-314f-438f-a096-0366f2bcc203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received unexpected event network-vif-plugged-f9b8dac1-460b-4686-b9ae-e2c1744259fb for instance with vm_state paused and task_state shelving_image_pending_upload.
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.256 2 DEBUG oslo_concurrency.processutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e/disk /var/lib/nova/instances/snapshots/tmp7vm0yy_u/cd1db3cf6abd45c5a304ccc588c90cbf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.257 2 INFO nova.virt.libvirt.driver [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Snapshot extracted, beginning image upload
Sep 30 21:34:37 compute-1 nova_compute[192795]: 2025-09-30 21:34:37.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:38.694 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:38.695 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:34:38.695 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:39 compute-1 podman[236481]: 2025-09-30 21:34:39.224212971 +0000 UTC m=+0.061350303 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS)
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.615 2 INFO nova.virt.libvirt.driver [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Snapshot image upload complete
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.615 2 DEBUG nova.compute.manager [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.708 2 INFO nova.compute.manager [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Shelve offloading
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.732 2 INFO nova.virt.libvirt.driver [-] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Instance destroyed successfully.
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.733 2 DEBUG nova.compute.manager [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.737 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.738 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:39 compute-1 nova_compute[192795]: 2025-09-30 21:34:39.738 2 DEBUG nova.network.neutron [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:34:41 compute-1 nova_compute[192795]: 2025-09-30 21:34:41.233 2 DEBUG nova.network.neutron [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Updating instance_info_cache with network_info: [{"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:41 compute-1 nova_compute[192795]: 2025-09-30 21:34:41.247 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.975 2 INFO nova.virt.libvirt.driver [-] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Instance destroyed successfully.
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.976 2 DEBUG nova.objects.instance [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'resources' on Instance uuid 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.994 2 DEBUG nova.virt.libvirt.vif [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:34:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-238931323',display_name='tempest-ServerActionsTestOtherB-server-238931323',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-238931323',id=97,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:34:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-nn940p3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member',shelved_at='2025-09-30T21:34:39.615689',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='2788af77-220b-4903-a4b7-4f05cb3c7f88'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:34:37Z,user_data=None,user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.995 2 DEBUG nova.network.os_vif_util [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.996 2 DEBUG nova.network.os_vif_util [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:0a:17,bridge_name='br-int',has_traffic_filtering=True,id=f9b8dac1-460b-4686-b9ae-e2c1744259fb,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b8dac1-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.996 2 DEBUG os_vif [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:0a:17,bridge_name='br-int',has_traffic_filtering=True,id=f9b8dac1-460b-4686-b9ae-e2c1744259fb,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b8dac1-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:42 compute-1 nova_compute[192795]: 2025-09-30 21:34:42.999 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9b8dac1-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.005 2 INFO os_vif [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:0a:17,bridge_name='br-int',has_traffic_filtering=True,id=f9b8dac1-460b-4686-b9ae-e2c1744259fb,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b8dac1-46')
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.005 2 INFO nova.virt.libvirt.driver [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Deleting instance files /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e_del
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.006 2 INFO nova.virt.libvirt.driver [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Deletion of /var/lib/nova/instances/25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e_del complete
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.099 2 DEBUG nova.compute.manager [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Received event network-changed-f9b8dac1-460b-4686-b9ae-e2c1744259fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.100 2 DEBUG nova.compute.manager [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Refreshing instance network info cache due to event network-changed-f9b8dac1-460b-4686-b9ae-e2c1744259fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.100 2 DEBUG oslo_concurrency.lockutils [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.101 2 DEBUG oslo_concurrency.lockutils [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.101 2 DEBUG nova.network.neutron [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Refreshing network info cache for port f9b8dac1-460b-4686-b9ae-e2c1744259fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.175 2 INFO nova.scheduler.client.report [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Deleted allocations for instance 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.256 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.257 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.415 2 DEBUG nova.compute.provider_tree [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.435 2 DEBUG nova.scheduler.client.report [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.471 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:43 compute-1 nova_compute[192795]: 2025-09-30 21:34:43.552 2 DEBUG oslo_concurrency.lockutils [None req-b79f1548-55f6-47b4-8d43-7d6688fad5bf b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 7.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:34:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:34:44 compute-1 nova_compute[192795]: 2025-09-30 21:34:44.599 2 DEBUG nova.network.neutron [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Updated VIF entry in instance network info cache for port f9b8dac1-460b-4686-b9ae-e2c1744259fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:34:44 compute-1 nova_compute[192795]: 2025-09-30 21:34:44.600 2 DEBUG nova.network.neutron [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Updating instance_info_cache with network_info: [{"id": "f9b8dac1-460b-4686-b9ae-e2c1744259fb", "address": "fa:16:3e:f2:0a:17", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapf9b8dac1-46", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:44 compute-1 nova_compute[192795]: 2025-09-30 21:34:44.662 2 DEBUG oslo_concurrency.lockutils [req-5694a95a-284c-4f77-b96f-e5d5d86195f4 req-82204798-2cf8-4969-bf09-14c36c6223cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:47 compute-1 podman[236504]: 2025-09-30 21:34:47.26947705 +0000 UTC m=+0.093139189 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:34:47 compute-1 podman[236506]: 2025-09-30 21:34:47.277980991 +0000 UTC m=+0.090276979 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:34:47 compute-1 podman[236505]: 2025-09-30 21:34:47.303675361 +0000 UTC m=+0.128839770 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:34:47 compute-1 nova_compute[192795]: 2025-09-30 21:34:47.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:47 compute-1 nova_compute[192795]: 2025-09-30 21:34:47.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:47 compute-1 nova_compute[192795]: 2025-09-30 21:34:47.720 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:47 compute-1 nova_compute[192795]: 2025-09-30 21:34:47.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:47 compute-1 nova_compute[192795]: 2025-09-30 21:34:47.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:47 compute-1 nova_compute[192795]: 2025-09-30 21:34:47.722 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:34:47 compute-1 nova_compute[192795]: 2025-09-30 21:34:47.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.012 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.013 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5698MB free_disk=73.35163879394531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.013 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.014 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.081 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.081 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.104 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.139 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.169 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.169 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.170 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:48 compute-1 nova_compute[192795]: 2025-09-30 21:34:48.170 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:34:51 compute-1 nova_compute[192795]: 2025-09-30 21:34:51.179 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268076.1783798, 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:51 compute-1 nova_compute[192795]: 2025-09-30 21:34:51.180 2 INFO nova.compute.manager [-] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] VM Stopped (Lifecycle Event)
Sep 30 21:34:51 compute-1 nova_compute[192795]: 2025-09-30 21:34:51.205 2 DEBUG nova.compute.manager [None req-fc796722-aab0-480d-a9e1-e64e2f09c493 - - - - - -] [instance: 25bed0d8-6ed8-4cc8-97b8-e1e5efd7231e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:52 compute-1 nova_compute[192795]: 2025-09-30 21:34:52.187 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:52 compute-1 nova_compute[192795]: 2025-09-30 21:34:52.188 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:52 compute-1 nova_compute[192795]: 2025-09-30 21:34:52.189 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:34:52 compute-1 nova_compute[192795]: 2025-09-30 21:34:52.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:53 compute-1 nova_compute[192795]: 2025-09-30 21:34:53.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:53 compute-1 nova_compute[192795]: 2025-09-30 21:34:53.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:53 compute-1 podman[236574]: 2025-09-30 21:34:53.236528521 +0000 UTC m=+0.073078641 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:34:54 compute-1 nova_compute[192795]: 2025-09-30 21:34:54.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:55 compute-1 nova_compute[192795]: 2025-09-30 21:34:55.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:58 compute-1 nova_compute[192795]: 2025-09-30 21:34:58.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:59 compute-1 podman[236595]: 2025-09-30 21:34:59.241672852 +0000 UTC m=+0.076557017 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41)
Sep 30 21:34:59 compute-1 podman[236597]: 2025-09-30 21:34:59.241520898 +0000 UTC m=+0.066685308 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:34:59 compute-1 podman[236596]: 2025-09-30 21:34:59.26914501 +0000 UTC m=+0.102508563 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:35:00 compute-1 nova_compute[192795]: 2025-09-30 21:35:00.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:01 compute-1 nova_compute[192795]: 2025-09-30 21:35:01.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:01 compute-1 nova_compute[192795]: 2025-09-30 21:35:01.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:35:01 compute-1 nova_compute[192795]: 2025-09-30 21:35:01.729 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:35:02 compute-1 nova_compute[192795]: 2025-09-30 21:35:02.730 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:02 compute-1 nova_compute[192795]: 2025-09-30 21:35:02.730 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:35:02 compute-1 nova_compute[192795]: 2025-09-30 21:35:02.731 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:35:02 compute-1 nova_compute[192795]: 2025-09-30 21:35:02.750 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:35:03 compute-1 nova_compute[192795]: 2025-09-30 21:35:03.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:35:03 compute-1 nova_compute[192795]: 2025-09-30 21:35:03.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:03 compute-1 nova_compute[192795]: 2025-09-30 21:35:03.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:35:03 compute-1 nova_compute[192795]: 2025-09-30 21:35:03.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:35:03 compute-1 nova_compute[192795]: 2025-09-30 21:35:03.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:35:03 compute-1 nova_compute[192795]: 2025-09-30 21:35:03.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.182 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.183 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.208 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.316 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.317 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.323 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.323 2 INFO nova.compute.claims [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.539 2 DEBUG nova.compute.provider_tree [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.552 2 DEBUG nova.scheduler.client.report [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.579 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.580 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.641 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.642 2 DEBUG nova.network.neutron [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.671 2 INFO nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.704 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.897 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.899 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.900 2 INFO nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Creating image(s)
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.901 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "/var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.902 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "/var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.903 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "/var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.931 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:05 compute-1 nova_compute[192795]: 2025-09-30 21:35:05.968 2 DEBUG nova.policy [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '13539577bbf344628f21d36c9f352284', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d80d4fe4be44abb83716bf20046cbbf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.006 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.007 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.008 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.024 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.089 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.091 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.138 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.140 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.141 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.209 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.211 2 DEBUG nova.virt.disk.api [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Checking if we can resize image /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.211 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.294 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.296 2 DEBUG nova.virt.disk.api [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Cannot resize image /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.296 2 DEBUG nova.objects.instance [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lazy-loading 'migration_context' on Instance uuid b71156d1-78c9-474a-9870-926426fb9e6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.333 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.333 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Ensure instance console log exists: /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.334 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.335 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:06 compute-1 nova_compute[192795]: 2025-09-30 21:35:06.335 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:08 compute-1 nova_compute[192795]: 2025-09-30 21:35:08.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:08 compute-1 nova_compute[192795]: 2025-09-30 21:35:08.320 2 DEBUG nova.network.neutron [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Successfully created port: 6db5464e-d07b-430a-b53a-c6f051308cce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:35:09 compute-1 nova_compute[192795]: 2025-09-30 21:35:09.991 2 DEBUG nova.network.neutron [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Successfully updated port: 6db5464e-d07b-430a-b53a-c6f051308cce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:35:10 compute-1 nova_compute[192795]: 2025-09-30 21:35:10.009 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:10 compute-1 nova_compute[192795]: 2025-09-30 21:35:10.010 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquired lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:10 compute-1 nova_compute[192795]: 2025-09-30 21:35:10.010 2 DEBUG nova.network.neutron [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:35:10 compute-1 nova_compute[192795]: 2025-09-30 21:35:10.197 2 DEBUG nova.network.neutron [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:35:10 compute-1 podman[236672]: 2025-09-30 21:35:10.241178379 +0000 UTC m=+0.068191708 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 21:35:11 compute-1 nova_compute[192795]: 2025-09-30 21:35:11.812 2 DEBUG nova.compute.manager [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-changed-6db5464e-d07b-430a-b53a-c6f051308cce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:11 compute-1 nova_compute[192795]: 2025-09-30 21:35:11.813 2 DEBUG nova.compute.manager [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Refreshing instance network info cache due to event network-changed-6db5464e-d07b-430a-b53a-c6f051308cce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:35:11 compute-1 nova_compute[192795]: 2025-09-30 21:35:11.814 2 DEBUG oslo_concurrency.lockutils [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.087 2 DEBUG nova.network.neutron [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Updating instance_info_cache with network_info: [{"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.107 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Releasing lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.108 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Instance network_info: |[{"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.109 2 DEBUG oslo_concurrency.lockutils [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.109 2 DEBUG nova.network.neutron [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Refreshing network info cache for port 6db5464e-d07b-430a-b53a-c6f051308cce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.115 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Start _get_guest_xml network_info=[{"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.123 2 WARNING nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.129 2 DEBUG nova.virt.libvirt.host [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.130 2 DEBUG nova.virt.libvirt.host [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.138 2 DEBUG nova.virt.libvirt.host [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.139 2 DEBUG nova.virt.libvirt.host [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.140 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.141 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.141 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.141 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.142 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.142 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.142 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.142 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.143 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.143 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.143 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.143 2 DEBUG nova.virt.hardware [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.148 2 DEBUG nova.virt.libvirt.vif [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:35:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-838394241',display_name='tempest-ServersTestJSON-server-838394241',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-838394241',id=100,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJrShvxM2RHQgnc3KtffMb71ozM9m2gl80CH8d7Ddj4INUY/auAX/0aGivjH8oTc8iNMyzabtSOLJ8JgED1fvX+cvBN5VgwStLqSmqWKKI/H2ASp1ri6EPQLG7iEgNA2AA==',key_name='tempest-keypair-1151829237',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d80d4fe4be44abb83716bf20046cbbf',ramdisk_id='',reservation_id='r-j92e2cix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-2004154551',owner_user_name='tempest-ServersTestJSON-2004154551-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:35:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='13539577bbf344628f21d36c9f352284',uuid=b71156d1-78c9-474a-9870-926426fb9e6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.148 2 DEBUG nova.network.os_vif_util [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Converting VIF {"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.149 2 DEBUG nova.network.os_vif_util [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:f9:84,bridge_name='br-int',has_traffic_filtering=True,id=6db5464e-d07b-430a-b53a-c6f051308cce,network=Network(008ec779-6443-4a36-a02a-fd1886e1b089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6db5464e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.149 2 DEBUG nova.objects.instance [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lazy-loading 'pci_devices' on Instance uuid b71156d1-78c9-474a-9870-926426fb9e6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.172 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <uuid>b71156d1-78c9-474a-9870-926426fb9e6f</uuid>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <name>instance-00000064</name>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersTestJSON-server-838394241</nova:name>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:35:12</nova:creationTime>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:user uuid="13539577bbf344628f21d36c9f352284">tempest-ServersTestJSON-2004154551-project-member</nova:user>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:project uuid="1d80d4fe4be44abb83716bf20046cbbf">tempest-ServersTestJSON-2004154551</nova:project>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         <nova:port uuid="6db5464e-d07b-430a-b53a-c6f051308cce">
Sep 30 21:35:12 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <system>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <entry name="serial">b71156d1-78c9-474a-9870-926426fb9e6f</entry>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <entry name="uuid">b71156d1-78c9-474a-9870-926426fb9e6f</entry>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </system>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <os>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   </os>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <features>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   </features>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk.config"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:b1:f9:84"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <target dev="tap6db5464e-d0"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/console.log" append="off"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <video>
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </video>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:35:12 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:35:12 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:35:12 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:35:12 compute-1 nova_compute[192795]: </domain>
Sep 30 21:35:12 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.174 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Preparing to wait for external event network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.174 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.174 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.174 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.175 2 DEBUG nova.virt.libvirt.vif [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:35:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-838394241',display_name='tempest-ServersTestJSON-server-838394241',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-838394241',id=100,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJrShvxM2RHQgnc3KtffMb71ozM9m2gl80CH8d7Ddj4INUY/auAX/0aGivjH8oTc8iNMyzabtSOLJ8JgED1fvX+cvBN5VgwStLqSmqWKKI/H2ASp1ri6EPQLG7iEgNA2AA==',key_name='tempest-keypair-1151829237',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d80d4fe4be44abb83716bf20046cbbf',ramdisk_id='',reservation_id='r-j92e2cix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-2004154551',owner_user_name='tempest-ServersTestJSON-2004154551-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:35:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='13539577bbf344628f21d36c9f352284',uuid=b71156d1-78c9-474a-9870-926426fb9e6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.176 2 DEBUG nova.network.os_vif_util [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Converting VIF {"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.176 2 DEBUG nova.network.os_vif_util [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:f9:84,bridge_name='br-int',has_traffic_filtering=True,id=6db5464e-d07b-430a-b53a-c6f051308cce,network=Network(008ec779-6443-4a36-a02a-fd1886e1b089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6db5464e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.176 2 DEBUG os_vif [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:f9:84,bridge_name='br-int',has_traffic_filtering=True,id=6db5464e-d07b-430a-b53a-c6f051308cce,network=Network(008ec779-6443-4a36-a02a-fd1886e1b089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6db5464e-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6db5464e-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.182 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6db5464e-d0, col_values=(('external_ids', {'iface-id': '6db5464e-d07b-430a-b53a-c6f051308cce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:f9:84', 'vm-uuid': 'b71156d1-78c9-474a-9870-926426fb9e6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:35:12 compute-1 NetworkManager[51724]: <info>  [1759268112.1863] manager: (tap6db5464e-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.195 2 INFO os_vif [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:f9:84,bridge_name='br-int',has_traffic_filtering=True,id=6db5464e-d07b-430a-b53a-c6f051308cce,network=Network(008ec779-6443-4a36-a02a-fd1886e1b089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6db5464e-d0')
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.253 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.253 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.254 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] No VIF found with MAC fa:16:3e:b1:f9:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:35:12 compute-1 nova_compute[192795]: 2025-09-30 21:35:12.255 2 INFO nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Using config drive
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.255 2 INFO nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Creating config drive at /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk.config
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.267 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5dbqpg5y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.411 2 DEBUG oslo_concurrency.processutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5dbqpg5y" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:13 compute-1 kernel: tap6db5464e-d0: entered promiscuous mode
Sep 30 21:35:13 compute-1 NetworkManager[51724]: <info>  [1759268113.4978] manager: (tap6db5464e-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Sep 30 21:35:13 compute-1 ovn_controller[94902]: 2025-09-30T21:35:13Z|00384|binding|INFO|Claiming lport 6db5464e-d07b-430a-b53a-c6f051308cce for this chassis.
Sep 30 21:35:13 compute-1 ovn_controller[94902]: 2025-09-30T21:35:13Z|00385|binding|INFO|6db5464e-d07b-430a-b53a-c6f051308cce: Claiming fa:16:3e:b1:f9:84 10.100.0.7
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.518 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:f9:84 10.100.0.7'], port_security=['fa:16:3e:b1:f9:84 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b71156d1-78c9-474a-9870-926426fb9e6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008ec779-6443-4a36-a02a-fd1886e1b089', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d80d4fe4be44abb83716bf20046cbbf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a40ccbaa-5529-4bf3-bc6c-335798a7b15d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f2541c-7ed6-4046-8506-aed09ccb25f7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=6db5464e-d07b-430a-b53a-c6f051308cce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.519 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 6db5464e-d07b-430a-b53a-c6f051308cce in datapath 008ec779-6443-4a36-a02a-fd1886e1b089 bound to our chassis
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.522 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 008ec779-6443-4a36-a02a-fd1886e1b089
Sep 30 21:35:13 compute-1 systemd-udevd[236711]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.536 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe74565-cb40-42fb-b0ae-de486012bc8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 systemd-machined[152783]: New machine qemu-48-instance-00000064.
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.537 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap008ec779-61 in ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.540 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap008ec779-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.540 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a5664372-b78c-49cc-af51-02e34befe586]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.541 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[99a53231-4418-4b7e-abf3-9abee41e063a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 NetworkManager[51724]: <info>  [1759268113.5488] device (tap6db5464e-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:35:13 compute-1 NetworkManager[51724]: <info>  [1759268113.5507] device (tap6db5464e-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.556 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9e9ef1-d38d-4a6c-acc8-b20659008d62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 ovn_controller[94902]: 2025-09-30T21:35:13Z|00386|binding|INFO|Setting lport 6db5464e-d07b-430a-b53a-c6f051308cce ovn-installed in OVS
Sep 30 21:35:13 compute-1 ovn_controller[94902]: 2025-09-30T21:35:13Z|00387|binding|INFO|Setting lport 6db5464e-d07b-430a-b53a-c6f051308cce up in Southbound
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 systemd[1]: Started Virtual Machine qemu-48-instance-00000064.
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.586 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[16689c95-98ec-4999-a401-3d8a2e96015d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.618 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[119c0291-264a-4e48-9f3e-33db44fa2326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.622 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a2eea568-2e8b-4f21-9ea3-0ce7e772aefd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 systemd-udevd[236716]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:13 compute-1 NetworkManager[51724]: <info>  [1759268113.6238] manager: (tap008ec779-60): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.658 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[730baf8e-1e39-418f-8020-76e8c7d718ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.660 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[59774dff-2710-454a-888e-ff9116efdf8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 NetworkManager[51724]: <info>  [1759268113.6814] device (tap008ec779-60): carrier: link connected
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.687 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d3664c-9517-42a8-ac5a-a10064dfaa24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.711 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4dcbb2-2798-42bd-99d2-5c1d98d79baf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008ec779-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:74:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480935, 'reachable_time': 35993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236745, 'error': None, 'target': 'ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.726 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c0eb10-4012-4dc8-85fa-e92b44120aca]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe74:747e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480935, 'tstamp': 480935}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236746, 'error': None, 'target': 'ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.753 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[49b25e6f-b8c8-4a6e-a415-081a0f85aa10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008ec779-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:74:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480935, 'reachable_time': 35993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236747, 'error': None, 'target': 'ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.783 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[352fe2a2-2c26-420d-b216-9067a2191bb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.876 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb2b709-1ce5-4cb8-95c9-0d134882c0ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.878 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008ec779-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.878 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.878 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap008ec779-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 kernel: tap008ec779-60: entered promiscuous mode
Sep 30 21:35:13 compute-1 NetworkManager[51724]: <info>  [1759268113.8812] manager: (tap008ec779-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.883 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap008ec779-60, col_values=(('external_ids', {'iface-id': '4473aad7-4694-4a8f-8f27-cb0a88a74ff0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 ovn_controller[94902]: 2025-09-30T21:35:13Z|00388|binding|INFO|Releasing lport 4473aad7-4694-4a8f-8f27-cb0a88a74ff0 from this chassis (sb_readonly=0)
Sep 30 21:35:13 compute-1 nova_compute[192795]: 2025-09-30 21:35:13.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.899 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/008ec779-6443-4a36-a02a-fd1886e1b089.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/008ec779-6443-4a36-a02a-fd1886e1b089.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.900 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b4dedd5b-8a9f-40e5-bca9-fadaa5587980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.900 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-008ec779-6443-4a36-a02a-fd1886e1b089
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/008ec779-6443-4a36-a02a-fd1886e1b089.pid.haproxy
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 008ec779-6443-4a36-a02a-fd1886e1b089
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:35:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:13.901 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089', 'env', 'PROCESS_TAG=haproxy-008ec779-6443-4a36-a02a-fd1886e1b089', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/008ec779-6443-4a36-a02a-fd1886e1b089.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.294 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268114.2937882, b71156d1-78c9-474a-9870-926426fb9e6f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.296 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] VM Started (Lifecycle Event)
Sep 30 21:35:14 compute-1 podman[236787]: 2025-09-30 21:35:14.313946573 +0000 UTC m=+0.057092335 container create 908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.331 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.336 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268114.295546, b71156d1-78c9-474a-9870-926426fb9e6f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.337 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] VM Paused (Lifecycle Event)
Sep 30 21:35:14 compute-1 systemd[1]: Started libpod-conmon-908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5.scope.
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.377 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:14 compute-1 podman[236787]: 2025-09-30 21:35:14.290890276 +0000 UTC m=+0.034036068 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.384 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:14 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:35:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ddee829ea9602322e0f9a82eda753bf4000ffabfd2a43a89ee665173a2d8b1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:35:14 compute-1 podman[236787]: 2025-09-30 21:35:14.411897582 +0000 UTC m=+0.155043364 container init 908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:35:14 compute-1 nova_compute[192795]: 2025-09-30 21:35:14.413 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:35:14 compute-1 podman[236787]: 2025-09-30 21:35:14.417858245 +0000 UTC m=+0.161003997 container start 908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:35:14 compute-1 neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089[236804]: [NOTICE]   (236808) : New worker (236810) forked
Sep 30 21:35:14 compute-1 neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089[236804]: [NOTICE]   (236808) : Loading success.
Sep 30 21:35:15 compute-1 nova_compute[192795]: 2025-09-30 21:35:15.331 2 DEBUG nova.network.neutron [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Updated VIF entry in instance network info cache for port 6db5464e-d07b-430a-b53a-c6f051308cce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:35:15 compute-1 nova_compute[192795]: 2025-09-30 21:35:15.332 2 DEBUG nova.network.neutron [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Updating instance_info_cache with network_info: [{"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:15 compute-1 nova_compute[192795]: 2025-09-30 21:35:15.365 2 DEBUG oslo_concurrency.lockutils [req-2563504f-c4f9-40b6-998d-80b1912e21fd req-293ce9ce-a226-4f39-917f-ae44fd489cd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.405 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.406 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.406 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.406 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.407 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Processing event network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.407 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.408 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.408 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.408 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.409 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] No waiting events found dispatching network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.409 2 WARNING nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received unexpected event network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce for instance with vm_state building and task_state spawning.
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.410 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.415 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268116.4153645, b71156d1-78c9-474a-9870-926426fb9e6f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.416 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] VM Resumed (Lifecycle Event)
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.419 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.423 2 INFO nova.virt.libvirt.driver [-] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Instance spawned successfully.
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.424 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.439 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.450 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.457 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.458 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.459 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.459 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.459 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.460 2 DEBUG nova.virt.libvirt.driver [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.486 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.527 2 INFO nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Took 10.63 seconds to spawn the instance on the hypervisor.
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.528 2 DEBUG nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.624 2 INFO nova.compute.manager [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Took 11.35 seconds to build instance.
Sep 30 21:35:16 compute-1 nova_compute[192795]: 2025-09-30 21:35:16.642 2 DEBUG oslo_concurrency.lockutils [None req-ff2be832-07b4-4139-b796-f816833775db 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:17 compute-1 nova_compute[192795]: 2025-09-30 21:35:17.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:18 compute-1 nova_compute[192795]: 2025-09-30 21:35:18.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:18 compute-1 podman[236821]: 2025-09-30 21:35:18.252050932 +0000 UTC m=+0.071382526 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:35:18 compute-1 podman[236819]: 2025-09-30 21:35:18.259078473 +0000 UTC m=+0.081465910 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd)
Sep 30 21:35:18 compute-1 podman[236820]: 2025-09-30 21:35:18.292914135 +0000 UTC m=+0.116427652 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Sep 30 21:35:20 compute-1 NetworkManager[51724]: <info>  [1759268120.4679] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Sep 30 21:35:20 compute-1 NetworkManager[51724]: <info>  [1759268120.4692] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Sep 30 21:35:20 compute-1 nova_compute[192795]: 2025-09-30 21:35:20.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:20 compute-1 ovn_controller[94902]: 2025-09-30T21:35:20Z|00389|binding|INFO|Releasing lport 4473aad7-4694-4a8f-8f27-cb0a88a74ff0 from this chassis (sb_readonly=0)
Sep 30 21:35:20 compute-1 ovn_controller[94902]: 2025-09-30T21:35:20Z|00390|binding|INFO|Releasing lport 4473aad7-4694-4a8f-8f27-cb0a88a74ff0 from this chassis (sb_readonly=0)
Sep 30 21:35:20 compute-1 nova_compute[192795]: 2025-09-30 21:35:20.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:21 compute-1 nova_compute[192795]: 2025-09-30 21:35:21.439 2 DEBUG nova.compute.manager [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-changed-6db5464e-d07b-430a-b53a-c6f051308cce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:21 compute-1 nova_compute[192795]: 2025-09-30 21:35:21.440 2 DEBUG nova.compute.manager [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Refreshing instance network info cache due to event network-changed-6db5464e-d07b-430a-b53a-c6f051308cce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:35:21 compute-1 nova_compute[192795]: 2025-09-30 21:35:21.441 2 DEBUG oslo_concurrency.lockutils [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:21 compute-1 nova_compute[192795]: 2025-09-30 21:35:21.441 2 DEBUG oslo_concurrency.lockutils [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:21 compute-1 nova_compute[192795]: 2025-09-30 21:35:21.442 2 DEBUG nova.network.neutron [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Refreshing network info cache for port 6db5464e-d07b-430a-b53a-c6f051308cce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:35:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:22.099 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:22.101 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:35:22 compute-1 nova_compute[192795]: 2025-09-30 21:35:22.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:22 compute-1 nova_compute[192795]: 2025-09-30 21:35:22.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:23 compute-1 nova_compute[192795]: 2025-09-30 21:35:23.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:23.104 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:23 compute-1 nova_compute[192795]: 2025-09-30 21:35:23.382 2 DEBUG nova.network.neutron [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Updated VIF entry in instance network info cache for port 6db5464e-d07b-430a-b53a-c6f051308cce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:35:23 compute-1 nova_compute[192795]: 2025-09-30 21:35:23.383 2 DEBUG nova.network.neutron [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Updating instance_info_cache with network_info: [{"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:23 compute-1 nova_compute[192795]: 2025-09-30 21:35:23.422 2 DEBUG oslo_concurrency.lockutils [req-439da9a7-8e96-48b0-bdbe-b46c21aed757 req-6bccdcff-c93b-4c2e-85b1-f65227773089 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b71156d1-78c9-474a-9870-926426fb9e6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:24 compute-1 podman[236888]: 2025-09-30 21:35:24.234553176 +0000 UTC m=+0.077182973 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:35:26 compute-1 ovn_controller[94902]: 2025-09-30T21:35:26Z|00391|binding|INFO|Releasing lport 4473aad7-4694-4a8f-8f27-cb0a88a74ff0 from this chassis (sb_readonly=0)
Sep 30 21:35:26 compute-1 nova_compute[192795]: 2025-09-30 21:35:26.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:27 compute-1 nova_compute[192795]: 2025-09-30 21:35:27.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:28 compute-1 nova_compute[192795]: 2025-09-30 21:35:28.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:30 compute-1 ovn_controller[94902]: 2025-09-30T21:35:30Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:f9:84 10.100.0.7
Sep 30 21:35:30 compute-1 ovn_controller[94902]: 2025-09-30T21:35:30Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:f9:84 10.100.0.7
Sep 30 21:35:30 compute-1 podman[236924]: 2025-09-30 21:35:30.2123542 +0000 UTC m=+0.051717079 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:35:30 compute-1 podman[236925]: 2025-09-30 21:35:30.225313334 +0000 UTC m=+0.059844431 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:35:30 compute-1 podman[236923]: 2025-09-30 21:35:30.225662303 +0000 UTC m=+0.071391695 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Sep 30 21:35:32 compute-1 nova_compute[192795]: 2025-09-30 21:35:32.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:33 compute-1 nova_compute[192795]: 2025-09-30 21:35:33.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:34 compute-1 nova_compute[192795]: 2025-09-30 21:35:34.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:35 compute-1 ovn_controller[94902]: 2025-09-30T21:35:35Z|00392|binding|INFO|Releasing lport 4473aad7-4694-4a8f-8f27-cb0a88a74ff0 from this chassis (sb_readonly=0)
Sep 30 21:35:35 compute-1 nova_compute[192795]: 2025-09-30 21:35:35.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:37 compute-1 nova_compute[192795]: 2025-09-30 21:35:37.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:38 compute-1 nova_compute[192795]: 2025-09-30 21:35:38.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:38.694 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:38.695 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:38.696 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:40 compute-1 nova_compute[192795]: 2025-09-30 21:35:40.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:41 compute-1 podman[236984]: 2025-09-30 21:35:41.212674163 +0000 UTC m=+0.058429393 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid)
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.108 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.108 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.109 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.109 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.109 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.124 2 INFO nova.compute.manager [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Terminating instance
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.138 2 DEBUG nova.compute.manager [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:35:42 compute-1 kernel: tap6db5464e-d0 (unregistering): left promiscuous mode
Sep 30 21:35:42 compute-1 NetworkManager[51724]: <info>  [1759268142.1586] device (tap6db5464e-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:35:42 compute-1 ovn_controller[94902]: 2025-09-30T21:35:42Z|00393|binding|INFO|Releasing lport 6db5464e-d07b-430a-b53a-c6f051308cce from this chassis (sb_readonly=0)
Sep 30 21:35:42 compute-1 ovn_controller[94902]: 2025-09-30T21:35:42Z|00394|binding|INFO|Setting lport 6db5464e-d07b-430a-b53a-c6f051308cce down in Southbound
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 ovn_controller[94902]: 2025-09-30T21:35:42Z|00395|binding|INFO|Removing iface tap6db5464e-d0 ovn-installed in OVS
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.204 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:f9:84 10.100.0.7'], port_security=['fa:16:3e:b1:f9:84 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b71156d1-78c9-474a-9870-926426fb9e6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008ec779-6443-4a36-a02a-fd1886e1b089', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d80d4fe4be44abb83716bf20046cbbf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a40ccbaa-5529-4bf3-bc6c-335798a7b15d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f2541c-7ed6-4046-8506-aed09ccb25f7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=6db5464e-d07b-430a-b53a-c6f051308cce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.205 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 6db5464e-d07b-430a-b53a-c6f051308cce in datapath 008ec779-6443-4a36-a02a-fd1886e1b089 unbound from our chassis
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.206 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 008ec779-6443-4a36-a02a-fd1886e1b089, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.208 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9ca8d3-10b6-4f3a-81d6-4e10e47c5329]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.208 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089 namespace which is not needed anymore
Sep 30 21:35:42 compute-1 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000064.scope: Deactivated successfully.
Sep 30 21:35:42 compute-1 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000064.scope: Consumed 13.461s CPU time.
Sep 30 21:35:42 compute-1 systemd-machined[152783]: Machine qemu-48-instance-00000064 terminated.
Sep 30 21:35:42 compute-1 neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089[236804]: [NOTICE]   (236808) : haproxy version is 2.8.14-c23fe91
Sep 30 21:35:42 compute-1 neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089[236804]: [NOTICE]   (236808) : path to executable is /usr/sbin/haproxy
Sep 30 21:35:42 compute-1 neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089[236804]: [WARNING]  (236808) : Exiting Master process...
Sep 30 21:35:42 compute-1 neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089[236804]: [ALERT]    (236808) : Current worker (236810) exited with code 143 (Terminated)
Sep 30 21:35:42 compute-1 neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089[236804]: [WARNING]  (236808) : All workers exited. Exiting... (0)
Sep 30 21:35:42 compute-1 systemd[1]: libpod-908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5.scope: Deactivated successfully.
Sep 30 21:35:42 compute-1 podman[237028]: 2025-09-30 21:35:42.351099072 +0000 UTC m=+0.047547197 container died 908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5-userdata-shm.mount: Deactivated successfully.
Sep 30 21:35:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-6ddee829ea9602322e0f9a82eda753bf4000ffabfd2a43a89ee665173a2d8b1f-merged.mount: Deactivated successfully.
Sep 30 21:35:42 compute-1 podman[237028]: 2025-09-30 21:35:42.395144901 +0000 UTC m=+0.091593036 container cleanup 908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:35:42 compute-1 systemd[1]: libpod-conmon-908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5.scope: Deactivated successfully.
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.421 2 INFO nova.virt.libvirt.driver [-] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Instance destroyed successfully.
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.421 2 DEBUG nova.objects.instance [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lazy-loading 'resources' on Instance uuid b71156d1-78c9-474a-9870-926426fb9e6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.443 2 DEBUG nova.virt.libvirt.vif [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:35:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-838394241',display_name='tempest-ServersTestJSON-server-838394241',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-838394241',id=100,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJrShvxM2RHQgnc3KtffMb71ozM9m2gl80CH8d7Ddj4INUY/auAX/0aGivjH8oTc8iNMyzabtSOLJ8JgED1fvX+cvBN5VgwStLqSmqWKKI/H2ASp1ri6EPQLG7iEgNA2AA==',key_name='tempest-keypair-1151829237',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:35:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1d80d4fe4be44abb83716bf20046cbbf',ramdisk_id='',reservation_id='r-j92e2cix',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-2004154551',owner_user_name='tempest-ServersTestJSON-2004154551-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:35:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='13539577bbf344628f21d36c9f352284',uuid=b71156d1-78c9-474a-9870-926426fb9e6f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.443 2 DEBUG nova.network.os_vif_util [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Converting VIF {"id": "6db5464e-d07b-430a-b53a-c6f051308cce", "address": "fa:16:3e:b1:f9:84", "network": {"id": "008ec779-6443-4a36-a02a-fd1886e1b089", "bridge": "br-int", "label": "tempest-ServersTestJSON-302363823-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1d80d4fe4be44abb83716bf20046cbbf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6db5464e-d0", "ovs_interfaceid": "6db5464e-d07b-430a-b53a-c6f051308cce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.444 2 DEBUG nova.network.os_vif_util [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:f9:84,bridge_name='br-int',has_traffic_filtering=True,id=6db5464e-d07b-430a-b53a-c6f051308cce,network=Network(008ec779-6443-4a36-a02a-fd1886e1b089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6db5464e-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.444 2 DEBUG os_vif [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:f9:84,bridge_name='br-int',has_traffic_filtering=True,id=6db5464e-d07b-430a-b53a-c6f051308cce,network=Network(008ec779-6443-4a36-a02a-fd1886e1b089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6db5464e-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.447 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6db5464e-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.456 2 INFO os_vif [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:f9:84,bridge_name='br-int',has_traffic_filtering=True,id=6db5464e-d07b-430a-b53a-c6f051308cce,network=Network(008ec779-6443-4a36-a02a-fd1886e1b089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6db5464e-d0')
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.456 2 INFO nova.virt.libvirt.driver [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Deleting instance files /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f_del
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.457 2 INFO nova.virt.libvirt.driver [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Deletion of /var/lib/nova/instances/b71156d1-78c9-474a-9870-926426fb9e6f_del complete
Sep 30 21:35:42 compute-1 podman[237070]: 2025-09-30 21:35:42.457860819 +0000 UTC m=+0.039069895 container remove 908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.465 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5cc9c6-d80b-47d8-b039-6e31fe237ce3]: (4, ('Tue Sep 30 09:35:42 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089 (908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5)\n908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5\nTue Sep 30 09:35:42 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089 (908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5)\n908765a8bfc94fd808999360c5724c57461536793c98a3f2bbf1f4aae91c06e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.467 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f32c9c50-80f8-49aa-8ae4-14bda85c5f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.468 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008ec779-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:42 compute-1 kernel: tap008ec779-60: left promiscuous mode
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.484 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b9dce447-a751-4d75-9220-5f8622c6dc08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.508 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[974e4f99-cad4-45df-9a5c-c70511a6cbdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.510 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[51edd7ca-4715-4544-a854-54be9e4a4d57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.526 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f8107f-5d3a-4394-bc1d-66461eaf9e79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480928, 'reachable_time': 26674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237087, 'error': None, 'target': 'ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.529 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-008ec779-6443-4a36-a02a-fd1886e1b089 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:35:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:42.529 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[7daddfd3-ac8e-43f4-a3ee-858b7f98113b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d008ec779\x2d6443\x2d4a36\x2da02a\x2dfd1886e1b089.mount: Deactivated successfully.
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.572 2 INFO nova.compute.manager [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.574 2 DEBUG oslo.service.loopingcall [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.574 2 DEBUG nova.compute.manager [-] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:35:42 compute-1 nova_compute[192795]: 2025-09-30 21:35:42.574 2 DEBUG nova.network.neutron [-] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.579 2 DEBUG nova.compute.manager [req-b201d3ee-5106-4d07-826f-cf12dc4d7050 req-5abfd72d-599a-4bd8-b752-bd53e96568d0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-vif-unplugged-6db5464e-d07b-430a-b53a-c6f051308cce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.580 2 DEBUG oslo_concurrency.lockutils [req-b201d3ee-5106-4d07-826f-cf12dc4d7050 req-5abfd72d-599a-4bd8-b752-bd53e96568d0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.580 2 DEBUG oslo_concurrency.lockutils [req-b201d3ee-5106-4d07-826f-cf12dc4d7050 req-5abfd72d-599a-4bd8-b752-bd53e96568d0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.580 2 DEBUG oslo_concurrency.lockutils [req-b201d3ee-5106-4d07-826f-cf12dc4d7050 req-5abfd72d-599a-4bd8-b752-bd53e96568d0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.580 2 DEBUG nova.compute.manager [req-b201d3ee-5106-4d07-826f-cf12dc4d7050 req-5abfd72d-599a-4bd8-b752-bd53e96568d0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] No waiting events found dispatching network-vif-unplugged-6db5464e-d07b-430a-b53a-c6f051308cce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.580 2 DEBUG nova.compute.manager [req-b201d3ee-5106-4d07-826f-cf12dc4d7050 req-5abfd72d-599a-4bd8-b752-bd53e96568d0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-vif-unplugged-6db5464e-d07b-430a-b53a-c6f051308cce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.916 2 DEBUG nova.network.neutron [-] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:43 compute-1 nova_compute[192795]: 2025-09-30 21:35:43.976 2 INFO nova.compute.manager [-] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Took 1.40 seconds to deallocate network for instance.
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.067 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.068 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.085 2 DEBUG nova.compute.manager [req-46219ec0-c464-4b85-914b-3395a8e1e991 req-c818ed61-3709-4c1d-9984-7373181b1988 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-vif-deleted-6db5464e-d07b-430a-b53a-c6f051308cce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.157 2 DEBUG nova.compute.provider_tree [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.177 2 DEBUG nova.scheduler.client.report [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.224 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.254 2 INFO nova.scheduler.client.report [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Deleted allocations for instance b71156d1-78c9-474a-9870-926426fb9e6f
Sep 30 21:35:44 compute-1 nova_compute[192795]: 2025-09-30 21:35:44.331 2 DEBUG oslo_concurrency.lockutils [None req-b3e14868-397f-4568-abd9-86b188515019 13539577bbf344628f21d36c9f352284 1d80d4fe4be44abb83716bf20046cbbf - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:45 compute-1 nova_compute[192795]: 2025-09-30 21:35:45.939 2 DEBUG nova.compute.manager [req-b7ceba1c-8aad-4fec-81dd-ed968bbe6622 req-335ecf62-b3dc-48b7-b11f-5439fba76275 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received event network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:45 compute-1 nova_compute[192795]: 2025-09-30 21:35:45.940 2 DEBUG oslo_concurrency.lockutils [req-b7ceba1c-8aad-4fec-81dd-ed968bbe6622 req-335ecf62-b3dc-48b7-b11f-5439fba76275 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:45 compute-1 nova_compute[192795]: 2025-09-30 21:35:45.940 2 DEBUG oslo_concurrency.lockutils [req-b7ceba1c-8aad-4fec-81dd-ed968bbe6622 req-335ecf62-b3dc-48b7-b11f-5439fba76275 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:45 compute-1 nova_compute[192795]: 2025-09-30 21:35:45.940 2 DEBUG oslo_concurrency.lockutils [req-b7ceba1c-8aad-4fec-81dd-ed968bbe6622 req-335ecf62-b3dc-48b7-b11f-5439fba76275 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b71156d1-78c9-474a-9870-926426fb9e6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:45 compute-1 nova_compute[192795]: 2025-09-30 21:35:45.940 2 DEBUG nova.compute.manager [req-b7ceba1c-8aad-4fec-81dd-ed968bbe6622 req-335ecf62-b3dc-48b7-b11f-5439fba76275 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] No waiting events found dispatching network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:45 compute-1 nova_compute[192795]: 2025-09-30 21:35:45.941 2 WARNING nova.compute.manager [req-b7ceba1c-8aad-4fec-81dd-ed968bbe6622 req-335ecf62-b3dc-48b7-b11f-5439fba76275 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Received unexpected event network-vif-plugged-6db5464e-d07b-430a-b53a-c6f051308cce for instance with vm_state deleted and task_state None.
Sep 30 21:35:46 compute-1 nova_compute[192795]: 2025-09-30 21:35:46.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:47 compute-1 nova_compute[192795]: 2025-09-30 21:35:47.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:48 compute-1 nova_compute[192795]: 2025-09-30 21:35:48.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:49 compute-1 podman[237090]: 2025-09-30 21:35:49.235794379 +0000 UTC m=+0.062823372 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:35:49 compute-1 podman[237088]: 2025-09-30 21:35:49.25126473 +0000 UTC m=+0.086179268 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:35:49 compute-1 podman[237089]: 2025-09-30 21:35:49.259580997 +0000 UTC m=+0.093965131 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.388 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.390 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.441 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.560 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.561 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.569 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.569 2 INFO nova.compute.claims [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.703 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.703 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.706 2 DEBUG nova.compute.provider_tree [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.725 2 DEBUG nova.scheduler.client.report [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.746 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.747 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.747 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.748 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.765 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "b7fdfb73-f902-48d2-8ec1-fcbcad847d5c" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.766 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "b7fdfb73-f902-48d2-8ec1-fcbcad847d5c" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.774 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "b7fdfb73-f902-48d2-8ec1-fcbcad847d5c" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.775 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.835 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.835 2 DEBUG nova.network.neutron [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.853 2 INFO nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.869 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.957 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.958 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.35160827636719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.958 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:49 compute-1 nova_compute[192795]: 2025-09-30 21:35:49.958 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.052 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.054 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.054 2 INFO nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Creating image(s)
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.055 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "/var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.056 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "/var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.056 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "/var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.073 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 613d38da-47eb-49a5-a0db-2c7565b04273 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.074 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.074 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.079 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.140 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.142 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.143 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.153 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.182 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.203 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.213 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.214 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.248 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.249 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.250 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.275 2 DEBUG nova.policy [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '73d8108bca4a4705ac72d3d469961da6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99b73fe1a5684f429b5266c7255f457f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.280 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.280 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.319 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.320 2 DEBUG nova.virt.disk.api [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Checking if we can resize image /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.320 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.378 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.379 2 DEBUG nova.virt.disk.api [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Cannot resize image /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.380 2 DEBUG nova.objects.instance [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lazy-loading 'migration_context' on Instance uuid 613d38da-47eb-49a5-a0db-2c7565b04273 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.396 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.396 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Ensure instance console log exists: /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.397 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.397 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.397 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:50 compute-1 nova_compute[192795]: 2025-09-30 21:35:50.921 2 DEBUG nova.network.neutron [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Successfully created port: df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.067 2 DEBUG nova.network.neutron [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Successfully updated port: df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.084 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "refresh_cache-613d38da-47eb-49a5-a0db-2c7565b04273" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.084 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquired lock "refresh_cache-613d38da-47eb-49a5-a0db-2c7565b04273" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.084 2 DEBUG nova.network.neutron [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.624 2 DEBUG nova.compute.manager [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received event network-changed-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.625 2 DEBUG nova.compute.manager [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Refreshing instance network info cache due to event network-changed-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.625 2 DEBUG oslo_concurrency.lockutils [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-613d38da-47eb-49a5-a0db-2c7565b04273" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:52 compute-1 nova_compute[192795]: 2025-09-30 21:35:52.687 2 DEBUG nova.network.neutron [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:35:53 compute-1 nova_compute[192795]: 2025-09-30 21:35:53.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:53 compute-1 nova_compute[192795]: 2025-09-30 21:35:53.272 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:53 compute-1 nova_compute[192795]: 2025-09-30 21:35:53.273 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:53 compute-1 nova_compute[192795]: 2025-09-30 21:35:53.274 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:35:54 compute-1 nova_compute[192795]: 2025-09-30 21:35:54.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.010 2 DEBUG nova.network.neutron [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Updating instance_info_cache with network_info: [{"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.040 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Releasing lock "refresh_cache-613d38da-47eb-49a5-a0db-2c7565b04273" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.040 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Instance network_info: |[{"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.041 2 DEBUG oslo_concurrency.lockutils [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-613d38da-47eb-49a5-a0db-2c7565b04273" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.041 2 DEBUG nova.network.neutron [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Refreshing network info cache for port df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.044 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Start _get_guest_xml network_info=[{"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.061 2 WARNING nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.072 2 DEBUG nova.virt.libvirt.host [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.073 2 DEBUG nova.virt.libvirt.host [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.077 2 DEBUG nova.virt.libvirt.host [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.077 2 DEBUG nova.virt.libvirt.host [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.078 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.079 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.080 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.080 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.080 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.080 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.080 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.081 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.081 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.081 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.081 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.082 2 DEBUG nova.virt.hardware [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.085 2 DEBUG nova.virt.libvirt.vif [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:35:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-98053286',display_name='tempest-ServerGroupTestJSON-server-98053286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-98053286',id=103,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99b73fe1a5684f429b5266c7255f457f',ramdisk_id='',reservation_id='r-3zlncuuk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1929524560',owner_user_name='tempest-ServerGroupTestJSON-1929524560-project-m
ember'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:35:49Z,user_data=None,user_id='73d8108bca4a4705ac72d3d469961da6',uuid=613d38da-47eb-49a5-a0db-2c7565b04273,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.085 2 DEBUG nova.network.os_vif_util [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Converting VIF {"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.086 2 DEBUG nova.network.os_vif_util [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:84:a6,bridge_name='br-int',has_traffic_filtering=True,id=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f,network=Network(815b09bc-37d4-446a-9180-045b401b0282),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf352cf4-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.087 2 DEBUG nova.objects.instance [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lazy-loading 'pci_devices' on Instance uuid 613d38da-47eb-49a5-a0db-2c7565b04273 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.117 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <uuid>613d38da-47eb-49a5-a0db-2c7565b04273</uuid>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <name>instance-00000067</name>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerGroupTestJSON-server-98053286</nova:name>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:35:55</nova:creationTime>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:user uuid="73d8108bca4a4705ac72d3d469961da6">tempest-ServerGroupTestJSON-1929524560-project-member</nova:user>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:project uuid="99b73fe1a5684f429b5266c7255f457f">tempest-ServerGroupTestJSON-1929524560</nova:project>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         <nova:port uuid="df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f">
Sep 30 21:35:55 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <system>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <entry name="serial">613d38da-47eb-49a5-a0db-2c7565b04273</entry>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <entry name="uuid">613d38da-47eb-49a5-a0db-2c7565b04273</entry>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </system>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <os>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   </os>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <features>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   </features>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk.config"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:d0:84:a6"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <target dev="tapdf352cf4-3c"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/console.log" append="off"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <video>
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </video>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:35:55 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:35:55 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:35:55 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:35:55 compute-1 nova_compute[192795]: </domain>
Sep 30 21:35:55 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.119 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Preparing to wait for external event network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.119 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.119 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.119 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.120 2 DEBUG nova.virt.libvirt.vif [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:35:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-98053286',display_name='tempest-ServerGroupTestJSON-server-98053286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-98053286',id=103,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99b73fe1a5684f429b5266c7255f457f',ramdisk_id='',reservation_id='r-3zlncuuk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1929524560',owner_user_name='tempest-ServerGroupTestJSON-1929524560
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:35:49Z,user_data=None,user_id='73d8108bca4a4705ac72d3d469961da6',uuid=613d38da-47eb-49a5-a0db-2c7565b04273,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.120 2 DEBUG nova.network.os_vif_util [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Converting VIF {"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.121 2 DEBUG nova.network.os_vif_util [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:84:a6,bridge_name='br-int',has_traffic_filtering=True,id=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f,network=Network(815b09bc-37d4-446a-9180-045b401b0282),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf352cf4-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.122 2 DEBUG os_vif [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:84:a6,bridge_name='br-int',has_traffic_filtering=True,id=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f,network=Network(815b09bc-37d4-446a-9180-045b401b0282),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf352cf4-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.122 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.123 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.126 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf352cf4-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.126 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf352cf4-3c, col_values=(('external_ids', {'iface-id': 'df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:84:a6', 'vm-uuid': '613d38da-47eb-49a5-a0db-2c7565b04273'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:55 compute-1 NetworkManager[51724]: <info>  [1759268155.1291] manager: (tapdf352cf4-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.136 2 INFO os_vif [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:84:a6,bridge_name='br-int',has_traffic_filtering=True,id=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f,network=Network(815b09bc-37d4-446a-9180-045b401b0282),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf352cf4-3c')
Sep 30 21:35:55 compute-1 podman[237175]: 2025-09-30 21:35:55.223217545 +0000 UTC m=+0.066001388 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.225 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.226 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.226 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] No VIF found with MAC fa:16:3e:d0:84:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.226 2 INFO nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Using config drive
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.711 2 INFO nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Creating config drive at /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk.config
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.720 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0bbxbmv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.865 2 DEBUG oslo_concurrency.processutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0bbxbmv" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:55 compute-1 kernel: tapdf352cf4-3c: entered promiscuous mode
Sep 30 21:35:55 compute-1 ovn_controller[94902]: 2025-09-30T21:35:55Z|00396|binding|INFO|Claiming lport df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f for this chassis.
Sep 30 21:35:55 compute-1 ovn_controller[94902]: 2025-09-30T21:35:55Z|00397|binding|INFO|df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f: Claiming fa:16:3e:d0:84:a6 10.100.0.14
Sep 30 21:35:55 compute-1 NetworkManager[51724]: <info>  [1759268155.9483] manager: (tapdf352cf4-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:55 compute-1 nova_compute[192795]: 2025-09-30 21:35:55.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:55.979 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:84:a6 10.100.0.14'], port_security=['fa:16:3e:d0:84:a6 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '613d38da-47eb-49a5-a0db-2c7565b04273', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-815b09bc-37d4-446a-9180-045b401b0282', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99b73fe1a5684f429b5266c7255f457f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '05fade71-299b-4950-874d-199681edccc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4dfa74cf-154c-4847-a9d1-c60bdb8f0da0, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:55.980 103861 INFO neutron.agent.ovn.metadata.agent [-] Port df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f in datapath 815b09bc-37d4-446a-9180-045b401b0282 bound to our chassis
Sep 30 21:35:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:55.982 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 815b09bc-37d4-446a-9180-045b401b0282
Sep 30 21:35:55 compute-1 systemd-udevd[237209]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:55.993 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b68c00b9-7c4f-4664-9714-e625f86ce1ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:55.995 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap815b09bc-31 in ovnmeta-815b09bc-37d4-446a-9180-045b401b0282 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:35:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:55.998 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap815b09bc-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:35:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:55.998 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7bba46-5cd2-4290-997c-c007d5b58c4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 NetworkManager[51724]: <info>  [1759268156.0021] device (tapdf352cf4-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:35:56 compute-1 NetworkManager[51724]: <info>  [1759268156.0038] device (tapdf352cf4-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.004 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7f167043-aff1-4d19-80f7-2666b09aedff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.021 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[9009868b-5c31-4466-b714-268a7be4abba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 systemd-machined[152783]: New machine qemu-49-instance-00000067.
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-1 ovn_controller[94902]: 2025-09-30T21:35:56Z|00398|binding|INFO|Setting lport df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f ovn-installed in OVS
Sep 30 21:35:56 compute-1 ovn_controller[94902]: 2025-09-30T21:35:56Z|00399|binding|INFO|Setting lport df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f up in Southbound
Sep 30 21:35:56 compute-1 systemd[1]: Started Virtual Machine qemu-49-instance-00000067.
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.052 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1df3cddf-0a6f-4245-9100-7f11b6c37321]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.083 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[20ac211b-2748-44dc-819d-ce250d4dc6bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 NetworkManager[51724]: <info>  [1759268156.0900] manager: (tap815b09bc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/205)
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.090 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e259d9b6-d7e6-4848-ac6a-b715c8532e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.132 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b3835dc3-7ccf-40a5-9dfc-6ac640d7723f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.136 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f91ad019-4573-495d-97b4-46e4239edb8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 NetworkManager[51724]: <info>  [1759268156.1681] device (tap815b09bc-30): carrier: link connected
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.175 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7e2f8245-9b8b-446b-ae75-116259a1abb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.200 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[65d24523-dcd9-47f4-8cb9-9969c9352e3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap815b09bc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:41:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485184, 'reachable_time': 16942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237245, 'error': None, 'target': 'ovnmeta-815b09bc-37d4-446a-9180-045b401b0282', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.221 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d36f7252-08b1-4a4f-9201-59e402493ed5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:4109'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485184, 'tstamp': 485184}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237246, 'error': None, 'target': 'ovnmeta-815b09bc-37d4-446a-9180-045b401b0282', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.244 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7184abcf-f2e0-4224-8863-f9be4ba165c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap815b09bc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:41:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485184, 'reachable_time': 16942, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237247, 'error': None, 'target': 'ovnmeta-815b09bc-37d4-446a-9180-045b401b0282', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.287 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[55f6d42c-f010-4d52-8211-642dd6a6790c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.380 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[97d226b6-804b-458e-9b30-41681ceb5a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.382 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap815b09bc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.382 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.382 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap815b09bc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-1 NetworkManager[51724]: <info>  [1759268156.3857] manager: (tap815b09bc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Sep 30 21:35:56 compute-1 kernel: tap815b09bc-30: entered promiscuous mode
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.387 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap815b09bc-30, col_values=(('external_ids', {'iface-id': '4aa4eb43-ff25-454c-a2d9-afcc85a25150'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-1 ovn_controller[94902]: 2025-09-30T21:35:56Z|00400|binding|INFO|Releasing lport 4aa4eb43-ff25-454c-a2d9-afcc85a25150 from this chassis (sb_readonly=0)
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.390 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/815b09bc-37d4-446a-9180-045b401b0282.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/815b09bc-37d4-446a-9180-045b401b0282.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.406 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5ccb183e-80a2-470c-ad33-411668ce9d48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.408 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-815b09bc-37d4-446a-9180-045b401b0282
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/815b09bc-37d4-446a-9180-045b401b0282.pid.haproxy
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 815b09bc-37d4-446a-9180-045b401b0282
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:35:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:56.409 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-815b09bc-37d4-446a-9180-045b401b0282', 'env', 'PROCESS_TAG=haproxy-815b09bc-37d4-446a-9180-045b401b0282', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/815b09bc-37d4-446a-9180-045b401b0282.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.502 2 DEBUG nova.compute.manager [req-0e37ccac-6594-45d7-b28f-a96ee7c29d25 req-a5d000bf-5ae1-4611-814d-5f314c563ff2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received event network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.503 2 DEBUG oslo_concurrency.lockutils [req-0e37ccac-6594-45d7-b28f-a96ee7c29d25 req-a5d000bf-5ae1-4611-814d-5f314c563ff2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.503 2 DEBUG oslo_concurrency.lockutils [req-0e37ccac-6594-45d7-b28f-a96ee7c29d25 req-a5d000bf-5ae1-4611-814d-5f314c563ff2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.503 2 DEBUG oslo_concurrency.lockutils [req-0e37ccac-6594-45d7-b28f-a96ee7c29d25 req-a5d000bf-5ae1-4611-814d-5f314c563ff2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.504 2 DEBUG nova.compute.manager [req-0e37ccac-6594-45d7-b28f-a96ee7c29d25 req-a5d000bf-5ae1-4611-814d-5f314c563ff2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Processing event network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:56 compute-1 podman[237285]: 2025-09-30 21:35:56.824226415 +0000 UTC m=+0.088781460 container create 70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:35:56 compute-1 podman[237285]: 2025-09-30 21:35:56.761133357 +0000 UTC m=+0.025688432 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:35:56 compute-1 systemd[1]: Started libpod-conmon-70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38.scope.
Sep 30 21:35:56 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:35:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/011d02a24ee02091456275f0313310d4110ae4e54c7a2a979590b23db668a479/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:35:56 compute-1 podman[237285]: 2025-09-30 21:35:56.933664636 +0000 UTC m=+0.198219701 container init 70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.934 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.936 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268156.934131, 613d38da-47eb-49a5-a0db-2c7565b04273 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.936 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] VM Started (Lifecycle Event)
Sep 30 21:35:56 compute-1 podman[237285]: 2025-09-30 21:35:56.942360172 +0000 UTC m=+0.206915217 container start 70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.942 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.947 2 INFO nova.virt.libvirt.driver [-] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Instance spawned successfully.
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.948 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:35:56 compute-1 neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282[237301]: [NOTICE]   (237305) : New worker (237307) forked
Sep 30 21:35:56 compute-1 neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282[237301]: [NOTICE]   (237305) : Loading success.
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.972 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.977 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.982 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.982 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.983 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.983 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.983 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:56 compute-1 nova_compute[192795]: 2025-09-30 21:35:56.984 2 DEBUG nova.virt.libvirt.driver [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.024 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.024 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268156.9356277, 613d38da-47eb-49a5-a0db-2c7565b04273 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.024 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] VM Paused (Lifecycle Event)
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.056 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.060 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268156.9411886, 613d38da-47eb-49a5-a0db-2c7565b04273 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.060 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] VM Resumed (Lifecycle Event)
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.086 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.092 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.107 2 INFO nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Took 7.05 seconds to spawn the instance on the hypervisor.
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.107 2 DEBUG nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.120 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.219 2 INFO nova.compute.manager [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Took 7.70 seconds to build instance.
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.248 2 DEBUG oslo_concurrency.lockutils [None req-14aa99ed-2df8-4bb8-a65c-19c27dd23b61 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.419 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268142.4182537, b71156d1-78c9-474a-9870-926426fb9e6f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.420 2 INFO nova.compute.manager [-] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] VM Stopped (Lifecycle Event)
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.423 2 DEBUG nova.network.neutron [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Updated VIF entry in instance network info cache for port df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.423 2 DEBUG nova.network.neutron [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Updating instance_info_cache with network_info: [{"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.461 2 DEBUG nova.compute.manager [None req-c9bfbb35-5504-4aa3-b0c7-6796f41d7108 - - - - - -] [instance: b71156d1-78c9-474a-9870-926426fb9e6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:57 compute-1 nova_compute[192795]: 2025-09-30 21:35:57.461 2 DEBUG oslo_concurrency.lockutils [req-490fc59b-dd67-4808-a60a-5c78f6003c04 req-002351a6-978d-4ef1-af45-6cde4f8354f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-613d38da-47eb-49a5-a0db-2c7565b04273" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:58 compute-1 nova_compute[192795]: 2025-09-30 21:35:58.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:58 compute-1 nova_compute[192795]: 2025-09-30 21:35:58.668 2 DEBUG nova.compute.manager [req-c501d553-f31c-4974-b46e-2ec5c84887be req-a80e092d-3b20-426e-85ec-6b761b1cd452 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received event network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:58 compute-1 nova_compute[192795]: 2025-09-30 21:35:58.670 2 DEBUG oslo_concurrency.lockutils [req-c501d553-f31c-4974-b46e-2ec5c84887be req-a80e092d-3b20-426e-85ec-6b761b1cd452 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:58 compute-1 nova_compute[192795]: 2025-09-30 21:35:58.670 2 DEBUG oslo_concurrency.lockutils [req-c501d553-f31c-4974-b46e-2ec5c84887be req-a80e092d-3b20-426e-85ec-6b761b1cd452 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:58 compute-1 nova_compute[192795]: 2025-09-30 21:35:58.670 2 DEBUG oslo_concurrency.lockutils [req-c501d553-f31c-4974-b46e-2ec5c84887be req-a80e092d-3b20-426e-85ec-6b761b1cd452 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:58 compute-1 nova_compute[192795]: 2025-09-30 21:35:58.671 2 DEBUG nova.compute.manager [req-c501d553-f31c-4974-b46e-2ec5c84887be req-a80e092d-3b20-426e-85ec-6b761b1cd452 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] No waiting events found dispatching network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:58 compute-1 nova_compute[192795]: 2025-09-30 21:35:58.671 2 WARNING nova.compute.manager [req-c501d553-f31c-4974-b46e-2ec5c84887be req-a80e092d-3b20-426e-85ec-6b761b1cd452 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received unexpected event network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f for instance with vm_state active and task_state None.
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.622 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.623 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.623 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.623 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.623 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.640 2 INFO nova.compute.manager [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Terminating instance
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.651 2 DEBUG nova.compute.manager [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:35:59 compute-1 kernel: tapdf352cf4-3c (unregistering): left promiscuous mode
Sep 30 21:35:59 compute-1 NetworkManager[51724]: <info>  [1759268159.6765] device (tapdf352cf4-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 ovn_controller[94902]: 2025-09-30T21:35:59Z|00401|binding|INFO|Releasing lport df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f from this chassis (sb_readonly=0)
Sep 30 21:35:59 compute-1 ovn_controller[94902]: 2025-09-30T21:35:59Z|00402|binding|INFO|Setting lport df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f down in Southbound
Sep 30 21:35:59 compute-1 ovn_controller[94902]: 2025-09-30T21:35:59Z|00403|binding|INFO|Removing iface tapdf352cf4-3c ovn-installed in OVS
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:59.740 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:84:a6 10.100.0.14'], port_security=['fa:16:3e:d0:84:a6 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '613d38da-47eb-49a5-a0db-2c7565b04273', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-815b09bc-37d4-446a-9180-045b401b0282', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99b73fe1a5684f429b5266c7255f457f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '05fade71-299b-4950-874d-199681edccc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4dfa74cf-154c-4847-a9d1-c60bdb8f0da0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:59.742 103861 INFO neutron.agent.ovn.metadata.agent [-] Port df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f in datapath 815b09bc-37d4-446a-9180-045b401b0282 unbound from our chassis
Sep 30 21:35:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:59.744 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 815b09bc-37d4-446a-9180-045b401b0282, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:35:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:59.745 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3b3056-4204-4a51-8baf-71b5f2382ddc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:35:59.746 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-815b09bc-37d4-446a-9180-045b401b0282 namespace which is not needed anymore
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000067.scope: Deactivated successfully.
Sep 30 21:35:59 compute-1 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000067.scope: Consumed 3.517s CPU time.
Sep 30 21:35:59 compute-1 systemd-machined[152783]: Machine qemu-49-instance-00000067 terminated.
Sep 30 21:35:59 compute-1 neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282[237301]: [NOTICE]   (237305) : haproxy version is 2.8.14-c23fe91
Sep 30 21:35:59 compute-1 neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282[237301]: [NOTICE]   (237305) : path to executable is /usr/sbin/haproxy
Sep 30 21:35:59 compute-1 neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282[237301]: [WARNING]  (237305) : Exiting Master process...
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282[237301]: [ALERT]    (237305) : Current worker (237307) exited with code 143 (Terminated)
Sep 30 21:35:59 compute-1 neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282[237301]: [WARNING]  (237305) : All workers exited. Exiting... (0)
Sep 30 21:35:59 compute-1 systemd[1]: libpod-70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38.scope: Deactivated successfully.
Sep 30 21:35:59 compute-1 podman[237341]: 2025-09-30 21:35:59.887890114 +0000 UTC m=+0.048535393 container died 70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38-userdata-shm.mount: Deactivated successfully.
Sep 30 21:35:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-011d02a24ee02091456275f0313310d4110ae4e54c7a2a979590b23db668a479-merged.mount: Deactivated successfully.
Sep 30 21:35:59 compute-1 podman[237341]: 2025-09-30 21:35:59.934496033 +0000 UTC m=+0.095141282 container cleanup 70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.942 2 INFO nova.virt.libvirt.driver [-] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Instance destroyed successfully.
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.942 2 DEBUG nova.objects.instance [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lazy-loading 'resources' on Instance uuid 613d38da-47eb-49a5-a0db-2c7565b04273 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.964 2 DEBUG nova.virt.libvirt.vif [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:35:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-98053286',display_name='tempest-ServerGroupTestJSON-server-98053286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-98053286',id=103,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:35:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='99b73fe1a5684f429b5266c7255f457f',ramdisk_id='',reservation_id='r-3zlncuuk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_
min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-1929524560',owner_user_name='tempest-ServerGroupTestJSON-1929524560-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:35:57Z,user_data=None,user_id='73d8108bca4a4705ac72d3d469961da6',uuid=613d38da-47eb-49a5-a0db-2c7565b04273,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.965 2 DEBUG nova.network.os_vif_util [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Converting VIF {"id": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "address": "fa:16:3e:d0:84:a6", "network": {"id": "815b09bc-37d4-446a-9180-045b401b0282", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1937889974-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99b73fe1a5684f429b5266c7255f457f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf352cf4-3c", "ovs_interfaceid": "df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.966 2 DEBUG nova.network.os_vif_util [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:84:a6,bridge_name='br-int',has_traffic_filtering=True,id=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f,network=Network(815b09bc-37d4-446a-9180-045b401b0282),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf352cf4-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.967 2 DEBUG os_vif [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:84:a6,bridge_name='br-int',has_traffic_filtering=True,id=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f,network=Network(815b09bc-37d4-446a-9180-045b401b0282),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf352cf4-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.970 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf352cf4-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.978 2 INFO os_vif [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:84:a6,bridge_name='br-int',has_traffic_filtering=True,id=df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f,network=Network(815b09bc-37d4-446a-9180-045b401b0282),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf352cf4-3c')
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.978 2 INFO nova.virt.libvirt.driver [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Deleting instance files /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273_del
Sep 30 21:35:59 compute-1 nova_compute[192795]: 2025-09-30 21:35:59.980 2 INFO nova.virt.libvirt.driver [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Deletion of /var/lib/nova/instances/613d38da-47eb-49a5-a0db-2c7565b04273_del complete
Sep 30 21:35:59 compute-1 systemd[1]: libpod-conmon-70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38.scope: Deactivated successfully.
Sep 30 21:36:00 compute-1 podman[237382]: 2025-09-30 21:36:00.014068871 +0000 UTC m=+0.055322797 container remove 70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.021 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[864dc7dd-858f-445b-827a-0dd6f2d79e32]: (4, ('Tue Sep 30 09:35:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282 (70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38)\n70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38\nTue Sep 30 09:35:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-815b09bc-37d4-446a-9180-045b401b0282 (70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38)\n70c75eb6af72e6ca78a872393a7e798be202c77baff9c62bada7728166dade38\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.023 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb14c0e-c9d8-4ca3-8a0e-e856078e3924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.025 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap815b09bc-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:00 compute-1 kernel: tap815b09bc-30: left promiscuous mode
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.033 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5651d70d-f10d-4900-b572-68ec78f8ad2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.070 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd81b36-bffd-4228-bfb5-d949ffe9ff52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.072 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4f6336-2f78-4148-9d39-f934edaa4b7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.079 2 INFO nova.compute.manager [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.080 2 DEBUG oslo.service.loopingcall [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.080 2 DEBUG nova.compute.manager [-] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.081 2 DEBUG nova.network.neutron [-] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.092 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d58c0a90-a50b-4b9f-baeb-f2c71fbebb83]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485175, 'reachable_time': 44707, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237401, 'error': None, 'target': 'ovnmeta-815b09bc-37d4-446a-9180-045b401b0282', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.095 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-815b09bc-37d4-446a-9180-045b401b0282 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:00.095 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[f37348bf-48c0-4fbd-9647-76049d13ed08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d815b09bc\x2d37d4\x2d446a\x2d9180\x2d045b401b0282.mount: Deactivated successfully.
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.842 2 DEBUG nova.compute.manager [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received event network-vif-unplugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.845 2 DEBUG oslo_concurrency.lockutils [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.846 2 DEBUG oslo_concurrency.lockutils [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.846 2 DEBUG oslo_concurrency.lockutils [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.847 2 DEBUG nova.compute.manager [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] No waiting events found dispatching network-vif-unplugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.847 2 DEBUG nova.compute.manager [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received event network-vif-unplugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.847 2 DEBUG nova.compute.manager [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received event network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.847 2 DEBUG oslo_concurrency.lockutils [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.847 2 DEBUG oslo_concurrency.lockutils [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.847 2 DEBUG oslo_concurrency.lockutils [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.848 2 DEBUG nova.compute.manager [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] No waiting events found dispatching network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:00 compute-1 nova_compute[192795]: 2025-09-30 21:36:00.848 2 WARNING nova.compute.manager [req-85f5d9c1-9c50-43af-bf0d-317cc42b8d5b req-123e3c24-1d59-4134-b404-9d08dcc1ecf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received unexpected event network-vif-plugged-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f for instance with vm_state active and task_state deleting.
Sep 30 21:36:01 compute-1 podman[237404]: 2025-09-30 21:36:01.228476968 +0000 UTC m=+0.066319897 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:36:01 compute-1 podman[237403]: 2025-09-30 21:36:01.24503405 +0000 UTC m=+0.078757756 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:36:01 compute-1 podman[237402]: 2025-09-30 21:36:01.256009749 +0000 UTC m=+0.090796015 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.540 2 DEBUG nova.network.neutron [-] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.562 2 INFO nova.compute.manager [-] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Took 1.48 seconds to deallocate network for instance.
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.656 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.656 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.775 2 DEBUG nova.compute.provider_tree [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.803 2 DEBUG nova.scheduler.client.report [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.837 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.873 2 INFO nova.scheduler.client.report [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Deleted allocations for instance 613d38da-47eb-49a5-a0db-2c7565b04273
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.925 2 DEBUG nova.compute.manager [req-63707cb2-9bbe-4358-a7e5-3a84fc85cf28 req-8e6e14c6-c503-418c-a456-de4e0a3d0ad4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Received event network-vif-deleted-df352cf4-3cfb-47dd-a9d8-1fc39bb8d31f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:01 compute-1 nova_compute[192795]: 2025-09-30 21:36:01.971 2 DEBUG oslo_concurrency.lockutils [None req-6cf612ec-0fc0-4761-b008-7c577e751399 73d8108bca4a4705ac72d3d469961da6 99b73fe1a5684f429b5266c7255f457f - - default default] Lock "613d38da-47eb-49a5-a0db-2c7565b04273" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.723 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.723 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.821 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.821 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.847 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.954 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.955 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.968 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:36:02 compute-1 nova_compute[192795]: 2025-09-30 21:36:02.969 2 INFO nova.compute.claims [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.188 2 DEBUG nova.compute.provider_tree [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.219 2 DEBUG nova.scheduler.client.report [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.251 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.252 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.333 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.334 2 DEBUG nova.network.neutron [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.363 2 INFO nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.382 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.442 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.443 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.468 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.562 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.563 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.564 2 INFO nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Creating image(s)
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.565 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.565 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.566 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.579 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.601 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.602 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.607 2 DEBUG nova.policy [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.613 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.613 2 INFO nova.compute.claims [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.643 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.645 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.646 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.660 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.724 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.725 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.767 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.769 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.770 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.839 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.840 2 DEBUG nova.virt.disk.api [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Checking if we can resize image /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.841 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.867 2 DEBUG nova.compute.provider_tree [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.893 2 DEBUG nova.scheduler.client.report [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.904 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.905 2 DEBUG nova.virt.disk.api [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Cannot resize image /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.905 2 DEBUG nova.objects.instance [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'migration_context' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.938 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.939 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.942 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.943 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Ensure instance console log exists: /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.943 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.944 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:03 compute-1 nova_compute[192795]: 2025-09-30 21:36:03.944 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.021 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.021 2 DEBUG nova.network.neutron [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.050 2 INFO nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.075 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.296 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.298 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.299 2 INFO nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Creating image(s)
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.300 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.302 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.303 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.330 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.408 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.410 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.412 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.438 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.503 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.505 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.540 2 DEBUG nova.policy [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.546 2 DEBUG nova.network.neutron [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Successfully created port: 53d72042-40b3-4719-8611-1684c83c15ea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.554 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.555 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.556 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.621 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.622 2 DEBUG nova.virt.disk.api [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Checking if we can resize image /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.623 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.688 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.689 2 DEBUG nova.virt.disk.api [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Cannot resize image /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.690 2 DEBUG nova.objects.instance [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'migration_context' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.703 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.703 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Ensure instance console log exists: /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.704 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.704 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.704 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:04 compute-1 nova_compute[192795]: 2025-09-30 21:36:04.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:05 compute-1 nova_compute[192795]: 2025-09-30 21:36:05.719 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:05 compute-1 nova_compute[192795]: 2025-09-30 21:36:05.916 2 DEBUG nova.network.neutron [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Successfully created port: 203ac748-c872-4282-8098-aa626889ec81 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.342 2 DEBUG nova.network.neutron [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Successfully updated port: 53d72042-40b3-4719-8611-1684c83c15ea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.495 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.495 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquired lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.496 2 DEBUG nova.network.neutron [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.549 2 DEBUG nova.compute.manager [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-changed-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.550 2 DEBUG nova.compute.manager [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Refreshing instance network info cache due to event network-changed-53d72042-40b3-4719-8611-1684c83c15ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.550 2 DEBUG oslo_concurrency.lockutils [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:06 compute-1 nova_compute[192795]: 2025-09-30 21:36:06.963 2 DEBUG nova.network.neutron [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.617 2 DEBUG nova.network.neutron [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Successfully updated port: 203ac748-c872-4282-8098-aa626889ec81 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.641 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.641 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.641 2 DEBUG nova.network.neutron [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.783 2 DEBUG nova.compute.manager [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-changed-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.783 2 DEBUG nova.compute.manager [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Refreshing instance network info cache due to event network-changed-203ac748-c872-4282-8098-aa626889ec81. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.784 2 DEBUG oslo_concurrency.lockutils [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:07 compute-1 nova_compute[192795]: 2025-09-30 21:36:07.883 2 DEBUG nova.network.neutron [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.790 2 DEBUG nova.network.neutron [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Updating instance_info_cache with network_info: [{"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.845 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Releasing lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.846 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance network_info: |[{"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.846 2 DEBUG oslo_concurrency.lockutils [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.846 2 DEBUG nova.network.neutron [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Refreshing network info cache for port 53d72042-40b3-4719-8611-1684c83c15ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.850 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Start _get_guest_xml network_info=[{"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.854 2 WARNING nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.858 2 DEBUG nova.virt.libvirt.host [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.858 2 DEBUG nova.virt.libvirt.host [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.862 2 DEBUG nova.virt.libvirt.host [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.863 2 DEBUG nova.virt.libvirt.host [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.864 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.864 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.864 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.864 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.865 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.865 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.865 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.865 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.865 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.866 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.866 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.866 2 DEBUG nova.virt.hardware [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.870 2 DEBUG nova.virt.libvirt.vif [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1141087200',display_name='tempest-ListServerFiltersTestJSON-instance-1141087200',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1141087200',id=105,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-rydrv4y4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-List
ServerFiltersTestJSON-1322408077-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:03Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=103b7a79-7ea6-47fa-bd6f-bcc87a96369b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.870 2 DEBUG nova.network.os_vif_util [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.871 2 DEBUG nova.network.os_vif_util [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.871 2 DEBUG nova.objects.instance [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.884 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <uuid>103b7a79-7ea6-47fa-bd6f-bcc87a96369b</uuid>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <name>instance-00000069</name>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1141087200</nova:name>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:36:08</nova:creationTime>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:user uuid="3746d13787f042a1bfad4de0c42015eb">tempest-ListServerFiltersTestJSON-1322408077-project-member</nova:user>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:project uuid="17bd9c2628a94a0b83c4cae3f51b3f7c">tempest-ListServerFiltersTestJSON-1322408077</nova:project>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         <nova:port uuid="53d72042-40b3-4719-8611-1684c83c15ea">
Sep 30 21:36:08 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <system>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <entry name="serial">103b7a79-7ea6-47fa-bd6f-bcc87a96369b</entry>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <entry name="uuid">103b7a79-7ea6-47fa-bd6f-bcc87a96369b</entry>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </system>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <os>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   </os>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <features>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   </features>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.config"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:fc:6f:ef"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <target dev="tap53d72042-40"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/console.log" append="off"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <video>
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </video>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:36:08 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:36:08 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:36:08 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:36:08 compute-1 nova_compute[192795]: </domain>
Sep 30 21:36:08 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.885 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Preparing to wait for external event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.886 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.886 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.886 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.887 2 DEBUG nova.virt.libvirt.vif [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1141087200',display_name='tempest-ListServerFiltersTestJSON-instance-1141087200',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1141087200',id=105,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-rydrv4y4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='te
mpest-ListServerFiltersTestJSON-1322408077-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:03Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=103b7a79-7ea6-47fa-bd6f-bcc87a96369b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.887 2 DEBUG nova.network.os_vif_util [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.887 2 DEBUG nova.network.os_vif_util [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.888 2 DEBUG os_vif [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.888 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.889 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.892 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53d72042-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.892 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap53d72042-40, col_values=(('external_ids', {'iface-id': '53d72042-40b3-4719-8611-1684c83c15ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:6f:ef', 'vm-uuid': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:08 compute-1 NetworkManager[51724]: <info>  [1759268168.8948] manager: (tap53d72042-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.901 2 INFO os_vif [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40')
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.970 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.970 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.970 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] No VIF found with MAC fa:16:3e:fc:6f:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:36:08 compute-1 nova_compute[192795]: 2025-09-30 21:36:08.970 2 INFO nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Using config drive
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.451 2 DEBUG nova.network.neutron [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Updating instance_info_cache with network_info: [{"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.485 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.486 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance network_info: |[{"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.486 2 DEBUG oslo_concurrency.lockutils [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.486 2 DEBUG nova.network.neutron [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Refreshing network info cache for port 203ac748-c872-4282-8098-aa626889ec81 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.489 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Start _get_guest_xml network_info=[{"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.493 2 WARNING nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.501 2 DEBUG nova.virt.libvirt.host [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.502 2 DEBUG nova.virt.libvirt.host [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.507 2 DEBUG nova.virt.libvirt.host [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.507 2 DEBUG nova.virt.libvirt.host [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.508 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.508 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.509 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.509 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.509 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.509 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.509 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.510 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.510 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.510 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.510 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.511 2 DEBUG nova.virt.hardware [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.513 2 DEBUG nova.virt.libvirt.vif [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-2109753286',display_name='tempest-ServerStableDeviceRescueTest-server-2109753286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-2109753286',id=106,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-6inq7tbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempes
t-ServerStableDeviceRescueTest-1939201844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:04Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.513 2 DEBUG nova.network.os_vif_util [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.514 2 DEBUG nova.network.os_vif_util [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.515 2 DEBUG nova.objects.instance [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'pci_devices' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.532 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <uuid>72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca</uuid>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <name>instance-0000006a</name>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-2109753286</nova:name>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:36:09</nova:creationTime>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:user uuid="8b1ebef014c145cbbe1e367bfd2c2ba3">tempest-ServerStableDeviceRescueTest-1939201844-project-member</nova:user>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:project uuid="8978d2df88a5434c8794b659033cca5e">tempest-ServerStableDeviceRescueTest-1939201844</nova:project>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         <nova:port uuid="203ac748-c872-4282-8098-aa626889ec81">
Sep 30 21:36:09 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <system>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <entry name="serial">72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca</entry>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <entry name="uuid">72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca</entry>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </system>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <os>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   </os>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <features>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   </features>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:74:15:a0"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <target dev="tap203ac748-c8"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/console.log" append="off"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <video>
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </video>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:36:09 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:36:09 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:36:09 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:36:09 compute-1 nova_compute[192795]: </domain>
Sep 30 21:36:09 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.533 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Preparing to wait for external event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.534 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.534 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.534 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.535 2 DEBUG nova.virt.libvirt.vif [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-2109753286',display_name='tempest-ServerStableDeviceRescueTest-server-2109753286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-2109753286',id=106,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-6inq7tbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:04Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.535 2 DEBUG nova.network.os_vif_util [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.536 2 DEBUG nova.network.os_vif_util [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.536 2 DEBUG os_vif [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.537 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.537 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.540 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap203ac748-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.540 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap203ac748-c8, col_values=(('external_ids', {'iface-id': '203ac748-c872-4282-8098-aa626889ec81', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:15:a0', 'vm-uuid': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:09 compute-1 NetworkManager[51724]: <info>  [1759268169.5426] manager: (tap203ac748-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.550 2 INFO os_vif [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8')
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.568 2 INFO nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Creating config drive at /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.config
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.573 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvq54oe5r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.642 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.643 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.643 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No VIF found with MAC fa:16:3e:74:15:a0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.643 2 INFO nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Using config drive
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.699 2 DEBUG oslo_concurrency.processutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvq54oe5r" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:09 compute-1 kernel: tap53d72042-40: entered promiscuous mode
Sep 30 21:36:09 compute-1 NetworkManager[51724]: <info>  [1759268169.7685] manager: (tap53d72042-40): new Tun device (/org/freedesktop/NetworkManager/Devices/209)
Sep 30 21:36:09 compute-1 ovn_controller[94902]: 2025-09-30T21:36:09Z|00404|binding|INFO|Claiming lport 53d72042-40b3-4719-8611-1684c83c15ea for this chassis.
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:09 compute-1 ovn_controller[94902]: 2025-09-30T21:36:09Z|00405|binding|INFO|53d72042-40b3-4719-8611-1684c83c15ea: Claiming fa:16:3e:fc:6f:ef 10.100.0.3
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.784 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:6f:ef 10.100.0.3'], port_security=['fa:16:3e:fc:6f:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20111f98-bf7f-4696-b726-3e06c68cfed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6a3ecea0-9346-40cc-9dbe-25cd68fe08ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4af2c14-351e-4037-bacd-dca3cfee62e9, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=53d72042-40b3-4719-8611-1684c83c15ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.785 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 53d72042-40b3-4719-8611-1684c83c15ea in datapath 20111f98-bf7f-4696-b726-3e06c68cfed2 bound to our chassis
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.786 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.801 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e43393-a542-4edf-a2e2-1e2e53f13dfa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.802 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20111f98-b1 in ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:36:09 compute-1 systemd-udevd[237517]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.804 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20111f98-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.804 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4522c4-7d13-4627-a057-71ae032caa95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.805 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[df63c728-3ed9-4c6f-a27e-041e88a0764e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:09 compute-1 systemd-machined[152783]: New machine qemu-50-instance-00000069.
Sep 30 21:36:09 compute-1 NetworkManager[51724]: <info>  [1759268169.8212] device (tap53d72042-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:36:09 compute-1 NetworkManager[51724]: <info>  [1759268169.8223] device (tap53d72042-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:36:09 compute-1 ovn_controller[94902]: 2025-09-30T21:36:09Z|00406|binding|INFO|Setting lport 53d72042-40b3-4719-8611-1684c83c15ea ovn-installed in OVS
Sep 30 21:36:09 compute-1 ovn_controller[94902]: 2025-09-30T21:36:09Z|00407|binding|INFO|Setting lport 53d72042-40b3-4719-8611-1684c83c15ea up in Southbound
Sep 30 21:36:09 compute-1 nova_compute[192795]: 2025-09-30 21:36:09.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.824 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a558d6db-1792-45df-800a-a3cff8b396b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 systemd[1]: Started Virtual Machine qemu-50-instance-00000069.
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.854 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[51458ba0-ec22-4910-895d-11d21180d587]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.889 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[22d4ca4d-df8d-47ba-82e0-82f0d401b5af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.895 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dca5361b-299f-4159-89c3-37cecd4c50b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 NetworkManager[51724]: <info>  [1759268169.8967] manager: (tap20111f98-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/210)
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.934 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa31e31-ff07-41b3-8d1c-b053473d16fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.938 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f2fd2d18-2c8c-458f-b707-3a39813e46f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:09 compute-1 NetworkManager[51724]: <info>  [1759268169.9735] device (tap20111f98-b0): carrier: link connected
Sep 30 21:36:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:09.982 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[55d49ee1-329c-4e1d-ae3d-870e07376447]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.008 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e593bbc0-d5b4-4ca7-a9a0-a16c038f8a29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20111f98-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:ef:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486564, 'reachable_time': 37735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237555, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.025 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[df9fdbef-6a56-4324-8d0b-196d7f3f0cb6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:efd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486564, 'tstamp': 486564}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237556, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.052 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[26ac5b19-4357-421f-80f6-87488c708b54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20111f98-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:ef:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486564, 'reachable_time': 37735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237557, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.087 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d2c6cbec-0103-403f-9bfd-f1ad5ab5933c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.178 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbb3ea5-5b80-4e01-86e9-c8bf74ccc3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.180 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20111f98-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.181 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.181 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20111f98-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:10 compute-1 kernel: tap20111f98-b0: entered promiscuous mode
Sep 30 21:36:10 compute-1 NetworkManager[51724]: <info>  [1759268170.1842] manager: (tap20111f98-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.189 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20111f98-b0, col_values=(('external_ids', {'iface-id': '36b3f8fd-6b0e-45c8-9c31-56cd0812c366'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 ovn_controller[94902]: 2025-09-30T21:36:10Z|00408|binding|INFO|Releasing lport 36b3f8fd-6b0e-45c8-9c31-56cd0812c366 from this chassis (sb_readonly=0)
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.207 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.209 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[aebd7162-d927-469d-8b4c-631e0cddfdd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.209 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.212 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'env', 'PROCESS_TAG=haproxy-20111f98-bf7f-4696-b726-3e06c68cfed2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20111f98-bf7f-4696-b726-3e06c68cfed2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.461 2 DEBUG nova.compute.manager [req-83ebb852-50fd-4d75-bbe6-dfc19170876c req-4aaf8518-732a-4f5d-a462-48b56ce9110c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.461 2 DEBUG oslo_concurrency.lockutils [req-83ebb852-50fd-4d75-bbe6-dfc19170876c req-4aaf8518-732a-4f5d-a462-48b56ce9110c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.461 2 DEBUG oslo_concurrency.lockutils [req-83ebb852-50fd-4d75-bbe6-dfc19170876c req-4aaf8518-732a-4f5d-a462-48b56ce9110c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.462 2 DEBUG oslo_concurrency.lockutils [req-83ebb852-50fd-4d75-bbe6-dfc19170876c req-4aaf8518-732a-4f5d-a462-48b56ce9110c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.462 2 DEBUG nova.compute.manager [req-83ebb852-50fd-4d75-bbe6-dfc19170876c req-4aaf8518-732a-4f5d-a462-48b56ce9110c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Processing event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:36:10 compute-1 podman[237601]: 2025-09-30 21:36:10.606528272 +0000 UTC m=+0.056900860 container create e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.616 2 INFO nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Creating config drive at /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.622 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkd2xrksd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:10 compute-1 systemd[1]: Started libpod-conmon-e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a.scope.
Sep 30 21:36:10 compute-1 podman[237601]: 2025-09-30 21:36:10.575184649 +0000 UTC m=+0.025557287 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:36:10 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:36:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf0eeeb1cfbff88c5af2c2b7715585f6d2d69cc6f9b867dd6558f4cb69192927/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:36:10 compute-1 podman[237601]: 2025-09-30 21:36:10.748551999 +0000 UTC m=+0.198924607 container init e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:36:10 compute-1 podman[237601]: 2025-09-30 21:36:10.755401396 +0000 UTC m=+0.205773974 container start e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.770 2 DEBUG oslo_concurrency.processutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkd2xrksd" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:10 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[237617]: [NOTICE]   (237623) : New worker (237626) forked
Sep 30 21:36:10 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[237617]: [NOTICE]   (237623) : Loading success.
Sep 30 21:36:10 compute-1 kernel: tap203ac748-c8: entered promiscuous mode
Sep 30 21:36:10 compute-1 NetworkManager[51724]: <info>  [1759268170.8400] manager: (tap203ac748-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Sep 30 21:36:10 compute-1 ovn_controller[94902]: 2025-09-30T21:36:10Z|00409|binding|INFO|Claiming lport 203ac748-c872-4282-8098-aa626889ec81 for this chassis.
Sep 30 21:36:10 compute-1 ovn_controller[94902]: 2025-09-30T21:36:10Z|00410|binding|INFO|203ac748-c872-4282-8098-aa626889ec81: Claiming fa:16:3e:74:15:a0 10.100.0.5
Sep 30 21:36:10 compute-1 systemd-udevd[237546]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 NetworkManager[51724]: <info>  [1759268170.8612] device (tap203ac748-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:36:10 compute-1 NetworkManager[51724]: <info>  [1759268170.8624] device (tap203ac748-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.861 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.863 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.865 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.883 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9885f0b4-87f3-40f4-a2b7-2f6765e13724]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.884 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5a6396a-b1 in ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.887 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5a6396a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.887 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4c285522-be54-4d4c-b697-6bb185b1f875]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.888 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[97f867c4-b008-44a7-969a-38914cacc7da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 ovn_controller[94902]: 2025-09-30T21:36:10Z|00411|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 ovn-installed in OVS
Sep 30 21:36:10 compute-1 ovn_controller[94902]: 2025-09-30T21:36:10Z|00412|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 up in Southbound
Sep 30 21:36:10 compute-1 nova_compute[192795]: 2025-09-30 21:36:10.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:10 compute-1 systemd-machined[152783]: New machine qemu-51-instance-0000006a.
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.904 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[5e162acd-604d-4fcc-bf9a-752108e3b8e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 systemd[1]: Started Virtual Machine qemu-51-instance-0000006a.
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.936 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6a61dc-7195-46f8-a607-1cc7ef12cc0d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.976 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[53541082-011a-48eb-878a-638bab970b5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:10.988 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[062b5d8c-57e1-4680-8610-72a67e861411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:10 compute-1 NetworkManager[51724]: <info>  [1759268170.9895] manager: (tapf5a6396a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.025 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f6642515-8092-4933-a682-67a8d2175e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.030 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf54053-fe77-48f6-8240-323090d4437f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 NetworkManager[51724]: <info>  [1759268171.0582] device (tapf5a6396a-b0): carrier: link connected
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.062 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ab801be0-c1b5-4c6b-8635-ef4c16cb58cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.081 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[848da36c-160a-45e3-846b-c99c07ea06c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486673, 'reachable_time': 23768, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237666, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.112 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.113 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268171.11145, 103b7a79-7ea6-47fa-bd6f-bcc87a96369b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.113 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] VM Started (Lifecycle Event)
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.114 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5d31ed-069b-4380-b3a4-19d05827575b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:66d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486673, 'tstamp': 486673}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237669, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.124 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.131 2 INFO nova.virt.libvirt.driver [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance spawned successfully.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.132 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.136 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c83c0519-0ac0-4d3a-abbd-1c5b612b401d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486673, 'reachable_time': 23768, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237674, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.153 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.163 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.164 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.165 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.165 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.166 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.167 2 DEBUG nova.virt.libvirt.driver [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.171 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.190 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb5d3ae-c797-413e-9de0-a508d0c34239]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.212 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.213 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268171.1117294, 103b7a79-7ea6-47fa-bd6f-bcc87a96369b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.213 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] VM Paused (Lifecycle Event)
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.232 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.236 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268171.1214857, 103b7a79-7ea6-47fa-bd6f-bcc87a96369b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.236 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] VM Resumed (Lifecycle Event)
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.260 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.261 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e2965e60-6b5b-4d6c-ac32-dda76bc6c629]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.262 2 INFO nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Took 7.70 seconds to spawn the instance on the hypervisor.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.262 2 DEBUG nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.263 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.264 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.264 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.266 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:11 compute-1 NetworkManager[51724]: <info>  [1759268171.2669] manager: (tapf5a6396a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Sep 30 21:36:11 compute-1 kernel: tapf5a6396a-b0: entered promiscuous mode
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.269 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-1 ovn_controller[94902]: 2025-09-30T21:36:11Z|00413|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.272 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.273 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d7052a2d-e8c0-462a-925e-f9490b66c893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.273 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:36:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:11.274 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'env', 'PROCESS_TAG=haproxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5a6396a-b7b7-4ff1-a2af-27477fea2815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.297 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.376 2 INFO nova.compute.manager [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Took 8.46 seconds to build instance.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.395 2 DEBUG oslo_concurrency.lockutils [None req-68fb2f05-38db-4667-974c-8874ed669255 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.477 2 DEBUG nova.compute.manager [req-bf3eb013-b0e8-43d6-8c7d-84a6417b6801 req-862f1075-7724-4a4d-96d2-365dea9861b1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.478 2 DEBUG oslo_concurrency.lockutils [req-bf3eb013-b0e8-43d6-8c7d-84a6417b6801 req-862f1075-7724-4a4d-96d2-365dea9861b1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.479 2 DEBUG oslo_concurrency.lockutils [req-bf3eb013-b0e8-43d6-8c7d-84a6417b6801 req-862f1075-7724-4a4d-96d2-365dea9861b1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.479 2 DEBUG oslo_concurrency.lockutils [req-bf3eb013-b0e8-43d6-8c7d-84a6417b6801 req-862f1075-7724-4a4d-96d2-365dea9861b1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.480 2 DEBUG nova.compute.manager [req-bf3eb013-b0e8-43d6-8c7d-84a6417b6801 req-862f1075-7724-4a4d-96d2-365dea9861b1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Processing event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.637 2 DEBUG nova.network.neutron [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Updated VIF entry in instance network info cache for port 53d72042-40b3-4719-8611-1684c83c15ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.639 2 DEBUG nova.network.neutron [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Updating instance_info_cache with network_info: [{"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.641 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268171.6415985, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.642 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Started (Lifecycle Event)
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.644 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.648 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.653 2 INFO nova.virt.libvirt.driver [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance spawned successfully.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.654 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.682 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.685 2 DEBUG oslo_concurrency.lockutils [req-0699b2b1-241c-4a5a-9532-c6a6c383c176 req-cbf55610-24cb-4baa-bf4f-8a7f3b96e056 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.688 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.692 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.692 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.693 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.693 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.694 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.696 2 DEBUG nova.virt.libvirt.driver [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:11 compute-1 podman[237705]: 2025-09-30 21:36:11.705008053 +0000 UTC m=+0.066725419 container create e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:36:11 compute-1 systemd[1]: Started libpod-conmon-e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45.scope.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.741 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.743 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268171.6416914, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.743 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Paused (Lifecycle Event)
Sep 30 21:36:11 compute-1 podman[237705]: 2025-09-30 21:36:11.673730252 +0000 UTC m=+0.035447618 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:36:11 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.771 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.774 2 INFO nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Took 7.48 seconds to spawn the instance on the hypervisor.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.775 2 DEBUG nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.779 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268171.6475756, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.779 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Resumed (Lifecycle Event)
Sep 30 21:36:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bda18473b03545dde93b67a3773e7b4af8c0dfa81d51020b4400e64fad9b4117/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:36:11 compute-1 podman[237705]: 2025-09-30 21:36:11.798962562 +0000 UTC m=+0.160679918 container init e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:36:11 compute-1 podman[237705]: 2025-09-30 21:36:11.806328093 +0000 UTC m=+0.168045469 container start e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:11 compute-1 podman[237718]: 2025-09-30 21:36:11.822698709 +0000 UTC m=+0.074253004 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:36:11 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[237721]: [NOTICE]   (237740) : New worker (237743) forked
Sep 30 21:36:11 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[237721]: [NOTICE]   (237740) : Loading success.
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.928 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:11 compute-1 nova_compute[192795]: 2025-09-30 21:36:11.934 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:12 compute-1 nova_compute[192795]: 2025-09-30 21:36:12.015 2 INFO nova.compute.manager [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Took 8.46 seconds to build instance.
Sep 30 21:36:12 compute-1 nova_compute[192795]: 2025-09-30 21:36:12.076 2 DEBUG oslo_concurrency.lockutils [None req-e09ebf18-9413-41ea-b097-a14772a42d15 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:12 compute-1 nova_compute[192795]: 2025-09-30 21:36:12.665 2 DEBUG nova.network.neutron [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Updated VIF entry in instance network info cache for port 203ac748-c872-4282-8098-aa626889ec81. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:36:12 compute-1 nova_compute[192795]: 2025-09-30 21:36:12.667 2 DEBUG nova.network.neutron [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Updating instance_info_cache with network_info: [{"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:12 compute-1 nova_compute[192795]: 2025-09-30 21:36:12.687 2 DEBUG oslo_concurrency.lockutils [req-86c61d6a-b05e-4966-a3a4-6c3bc4c43d72 req-43943432-b6d8-4866-9e9b-8053df0c9340 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.326 2 DEBUG nova.compute.manager [req-6558cc35-df98-47b7-ac36-a8e1cd5f9d25 req-4a1bb96d-43d1-4c08-9073-aca789d7e3df dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.328 2 DEBUG oslo_concurrency.lockutils [req-6558cc35-df98-47b7-ac36-a8e1cd5f9d25 req-4a1bb96d-43d1-4c08-9073-aca789d7e3df dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.328 2 DEBUG oslo_concurrency.lockutils [req-6558cc35-df98-47b7-ac36-a8e1cd5f9d25 req-4a1bb96d-43d1-4c08-9073-aca789d7e3df dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.329 2 DEBUG oslo_concurrency.lockutils [req-6558cc35-df98-47b7-ac36-a8e1cd5f9d25 req-4a1bb96d-43d1-4c08-9073-aca789d7e3df dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.330 2 DEBUG nova.compute.manager [req-6558cc35-df98-47b7-ac36-a8e1cd5f9d25 req-4a1bb96d-43d1-4c08-9073-aca789d7e3df dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] No waiting events found dispatching network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.330 2 WARNING nova.compute.manager [req-6558cc35-df98-47b7-ac36-a8e1cd5f9d25 req-4a1bb96d-43d1-4c08-9073-aca789d7e3df dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received unexpected event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea for instance with vm_state active and task_state None.
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.667 2 DEBUG nova.compute.manager [req-1a9fe1cf-916c-482b-b1d0-54d640be23e0 req-7593ef8f-2ae1-4e55-adb2-5925719dfa98 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.668 2 DEBUG oslo_concurrency.lockutils [req-1a9fe1cf-916c-482b-b1d0-54d640be23e0 req-7593ef8f-2ae1-4e55-adb2-5925719dfa98 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.668 2 DEBUG oslo_concurrency.lockutils [req-1a9fe1cf-916c-482b-b1d0-54d640be23e0 req-7593ef8f-2ae1-4e55-adb2-5925719dfa98 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.669 2 DEBUG oslo_concurrency.lockutils [req-1a9fe1cf-916c-482b-b1d0-54d640be23e0 req-7593ef8f-2ae1-4e55-adb2-5925719dfa98 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.669 2 DEBUG nova.compute.manager [req-1a9fe1cf-916c-482b-b1d0-54d640be23e0 req-7593ef8f-2ae1-4e55-adb2-5925719dfa98 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:13 compute-1 nova_compute[192795]: 2025-09-30 21:36:13.669 2 WARNING nova.compute.manager [req-1a9fe1cf-916c-482b-b1d0-54d640be23e0 req-7593ef8f-2ae1-4e55-adb2-5925719dfa98 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state active and task_state None.
Sep 30 21:36:14 compute-1 nova_compute[192795]: 2025-09-30 21:36:14.247 2 DEBUG nova.compute.manager [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:14 compute-1 nova_compute[192795]: 2025-09-30 21:36:14.322 2 INFO nova.compute.manager [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] instance snapshotting
Sep 30 21:36:14 compute-1 nova_compute[192795]: 2025-09-30 21:36:14.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:14 compute-1 nova_compute[192795]: 2025-09-30 21:36:14.937 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268159.936037, 613d38da-47eb-49a5-a0db-2c7565b04273 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:14 compute-1 nova_compute[192795]: 2025-09-30 21:36:14.938 2 INFO nova.compute.manager [-] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] VM Stopped (Lifecycle Event)
Sep 30 21:36:14 compute-1 nova_compute[192795]: 2025-09-30 21:36:14.995 2 DEBUG nova.compute.manager [None req-d2c0cdc1-858f-4bf4-991f-fbed9ee6d83c - - - - - -] [instance: 613d38da-47eb-49a5-a0db-2c7565b04273] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.057 2 INFO nova.virt.libvirt.driver [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Beginning live snapshot process
Sep 30 21:36:15 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.296 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.379 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk --force-share --output=json -f qcow2" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.383 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.457 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk --force-share --output=json -f qcow2" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.495 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.559 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.562 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnom_yqd2/80ad2a607139463f8b1cbafd642eead7.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.608 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnom_yqd2/80ad2a607139463f8b1cbafd642eead7.delta 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.610 2 INFO nova.virt.libvirt.driver [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.676 2 DEBUG nova.virt.libvirt.guest [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.682 2 INFO nova.virt.libvirt.driver [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.723 2 DEBUG nova.privsep.utils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.725 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnom_yqd2/80ad2a607139463f8b1cbafd642eead7.delta /var/lib/nova/instances/snapshots/tmpnom_yqd2/80ad2a607139463f8b1cbafd642eead7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.903 2 DEBUG oslo_concurrency.processutils [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnom_yqd2/80ad2a607139463f8b1cbafd642eead7.delta /var/lib/nova/instances/snapshots/tmpnom_yqd2/80ad2a607139463f8b1cbafd642eead7" returned: 0 in 0.179s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:15 compute-1 nova_compute[192795]: 2025-09-30 21:36:15.906 2 INFO nova.virt.libvirt.driver [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Snapshot extracted, beginning image upload
Sep 30 21:36:18 compute-1 nova_compute[192795]: 2025-09-30 21:36:18.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:19 compute-1 nova_compute[192795]: 2025-09-30 21:36:19.113 2 INFO nova.virt.libvirt.driver [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Snapshot image upload complete
Sep 30 21:36:19 compute-1 nova_compute[192795]: 2025-09-30 21:36:19.115 2 INFO nova.compute.manager [None req-0325f245-3750-4485-bd56-8e5ba322ce90 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Took 4.78 seconds to snapshot the instance on the hypervisor.
Sep 30 21:36:19 compute-1 nova_compute[192795]: 2025-09-30 21:36:19.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:20 compute-1 podman[237779]: 2025-09-30 21:36:20.23881253 +0000 UTC m=+0.078601901 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 21:36:20 compute-1 podman[237781]: 2025-09-30 21:36:20.259874564 +0000 UTC m=+0.091302918 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:36:20 compute-1 podman[237780]: 2025-09-30 21:36:20.262749352 +0000 UTC m=+0.098800462 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:36:22 compute-1 nova_compute[192795]: 2025-09-30 21:36:22.174 2 INFO nova.compute.manager [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Rescuing
Sep 30 21:36:22 compute-1 nova_compute[192795]: 2025-09-30 21:36:22.175 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:22 compute-1 nova_compute[192795]: 2025-09-30 21:36:22.175 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:22 compute-1 nova_compute[192795]: 2025-09-30 21:36:22.175 2 DEBUG nova.network.neutron [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:22.581 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:22 compute-1 nova_compute[192795]: 2025-09-30 21:36:22.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:22.587 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:36:23 compute-1 nova_compute[192795]: 2025-09-30 21:36:23.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:24 compute-1 nova_compute[192795]: 2025-09-30 21:36:24.067 2 DEBUG nova.network.neutron [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Updating instance_info_cache with network_info: [{"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:24 compute-1 nova_compute[192795]: 2025-09-30 21:36:24.091 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:24 compute-1 nova_compute[192795]: 2025-09-30 21:36:24.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:24 compute-1 ovn_controller[94902]: 2025-09-30T21:36:24Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fc:6f:ef 10.100.0.3
Sep 30 21:36:24 compute-1 ovn_controller[94902]: 2025-09-30T21:36:24Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fc:6f:ef 10.100.0.3
Sep 30 21:36:24 compute-1 nova_compute[192795]: 2025-09-30 21:36:24.691 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:36:26 compute-1 podman[237883]: 2025-09-30 21:36:26.247902279 +0000 UTC m=+0.081881572 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:36:26 compute-1 kernel: tap203ac748-c8 (unregistering): left promiscuous mode
Sep 30 21:36:26 compute-1 NetworkManager[51724]: <info>  [1759268186.8735] device (tap203ac748-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:26 compute-1 nova_compute[192795]: 2025-09-30 21:36:26.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:26 compute-1 ovn_controller[94902]: 2025-09-30T21:36:26Z|00414|binding|INFO|Releasing lport 203ac748-c872-4282-8098-aa626889ec81 from this chassis (sb_readonly=0)
Sep 30 21:36:26 compute-1 ovn_controller[94902]: 2025-09-30T21:36:26Z|00415|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 down in Southbound
Sep 30 21:36:26 compute-1 ovn_controller[94902]: 2025-09-30T21:36:26Z|00416|binding|INFO|Removing iface tap203ac748-c8 ovn-installed in OVS
Sep 30 21:36:26 compute-1 nova_compute[192795]: 2025-09-30 21:36:26.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:26.895 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:26.896 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:26.898 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:26.899 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e23292b4-86a6-4f5a-a123-46aec6b33ff6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:26.900 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace which is not needed anymore
Sep 30 21:36:26 compute-1 nova_compute[192795]: 2025-09-30 21:36:26.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:26 compute-1 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Sep 30 21:36:26 compute-1 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006a.scope: Consumed 13.333s CPU time.
Sep 30 21:36:26 compute-1 systemd-machined[152783]: Machine qemu-51-instance-0000006a terminated.
Sep 30 21:36:27 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[237721]: [NOTICE]   (237740) : haproxy version is 2.8.14-c23fe91
Sep 30 21:36:27 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[237721]: [NOTICE]   (237740) : path to executable is /usr/sbin/haproxy
Sep 30 21:36:27 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[237721]: [WARNING]  (237740) : Exiting Master process...
Sep 30 21:36:27 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[237721]: [ALERT]    (237740) : Current worker (237743) exited with code 143 (Terminated)
Sep 30 21:36:27 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[237721]: [WARNING]  (237740) : All workers exited. Exiting... (0)
Sep 30 21:36:27 compute-1 systemd[1]: libpod-e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45.scope: Deactivated successfully.
Sep 30 21:36:27 compute-1 podman[237929]: 2025-09-30 21:36:27.077949057 +0000 UTC m=+0.053819806 container died e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:36:27 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45-userdata-shm.mount: Deactivated successfully.
Sep 30 21:36:27 compute-1 systemd[1]: var-lib-containers-storage-overlay-bda18473b03545dde93b67a3773e7b4af8c0dfa81d51020b4400e64fad9b4117-merged.mount: Deactivated successfully.
Sep 30 21:36:27 compute-1 podman[237929]: 2025-09-30 21:36:27.123677263 +0000 UTC m=+0.099547992 container cleanup e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:36:27 compute-1 systemd[1]: libpod-conmon-e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45.scope: Deactivated successfully.
Sep 30 21:36:27 compute-1 podman[237965]: 2025-09-30 21:36:27.209560472 +0000 UTC m=+0.055322388 container remove e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.217 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e9cd6382-9808-4ca2-bf9a-950dd9d571f3]: (4, ('Tue Sep 30 09:36:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45)\ne02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45\nTue Sep 30 09:36:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (e02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45)\ne02277c9572810e129f2d3a222977c8586e746173ab5f89da494d9a53dc46d45\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.220 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[586da2d0-8033-488c-bb5e-a2fc55fe3b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.222 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.226 2 DEBUG nova.compute.manager [req-7e6669c8-305f-444b-9350-53a8ab943325 req-80153523-66fe-42a6-a106-ad03704ee1bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:27 compute-1 kernel: tapf5a6396a-b0: left promiscuous mode
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.227 2 DEBUG oslo_concurrency.lockutils [req-7e6669c8-305f-444b-9350-53a8ab943325 req-80153523-66fe-42a6-a106-ad03704ee1bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.228 2 DEBUG oslo_concurrency.lockutils [req-7e6669c8-305f-444b-9350-53a8ab943325 req-80153523-66fe-42a6-a106-ad03704ee1bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.228 2 DEBUG oslo_concurrency.lockutils [req-7e6669c8-305f-444b-9350-53a8ab943325 req-80153523-66fe-42a6-a106-ad03704ee1bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.228 2 DEBUG nova.compute.manager [req-7e6669c8-305f-444b-9350-53a8ab943325 req-80153523-66fe-42a6-a106-ad03704ee1bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.229 2 WARNING nova.compute.manager [req-7e6669c8-305f-444b-9350-53a8ab943325 req-80153523-66fe-42a6-a106-ad03704ee1bb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state active and task_state rescuing.
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.260 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[61f4202d-816c-4640-9dd5-3224f2460c06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.288 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ae497475-b37a-4266-8cd7-84c633ff48d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.291 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[65286a94-f858-409b-91c5-fbdb109f368d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.311 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dbde25aa-0b7c-4660-a5b1-cb89a0d6e78f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486664, 'reachable_time': 43148, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237990, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.315 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.315 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[20940a26-0452-4812-b2d4-ae86eacdd6a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:27 compute-1 systemd[1]: run-netns-ovnmeta\x2df5a6396a\x2db7b7\x2d4ff1\x2da2af\x2d27477fea2815.mount: Deactivated successfully.
Sep 30 21:36:27 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:27.590 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.715 2 INFO nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance shutdown successfully after 3 seconds.
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.725 2 INFO nova.virt.libvirt.driver [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance destroyed successfully.
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.725 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'numa_topology' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.751 2 INFO nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Attempting a stable device rescue
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.784 2 DEBUG oslo_concurrency.lockutils [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.784 2 DEBUG oslo_concurrency.lockutils [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.785 2 DEBUG nova.compute.manager [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.789 2 DEBUG nova.compute.manager [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.790 2 DEBUG nova.objects.instance [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'flavor' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.824 2 DEBUG nova.objects.instance [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'info_cache' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.861 2 DEBUG nova.virt.libvirt.driver [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.975 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.982 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.984 2 INFO nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Creating image(s)
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.985 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.986 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.988 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:27 compute-1 nova_compute[192795]: 2025-09-30 21:36:27.988 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:28 compute-1 nova_compute[192795]: 2025-09-30 21:36:28.011 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "f24a4f4d3b6fa9761e8135d14a901b2ab183c59d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:28 compute-1 nova_compute[192795]: 2025-09-30 21:36:28.012 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "f24a4f4d3b6fa9761e8135d14a901b2ab183c59d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:28 compute-1 nova_compute[192795]: 2025-09-30 21:36:28.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.338 2 DEBUG nova.compute.manager [req-f3f09117-66ca-4747-bf66-0f05c7db04ff req-7c9beb43-23cc-4fdb-99e4-69139acb9ed6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.338 2 DEBUG oslo_concurrency.lockutils [req-f3f09117-66ca-4747-bf66-0f05c7db04ff req-7c9beb43-23cc-4fdb-99e4-69139acb9ed6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.339 2 DEBUG oslo_concurrency.lockutils [req-f3f09117-66ca-4747-bf66-0f05c7db04ff req-7c9beb43-23cc-4fdb-99e4-69139acb9ed6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.339 2 DEBUG oslo_concurrency.lockutils [req-f3f09117-66ca-4747-bf66-0f05c7db04ff req-7c9beb43-23cc-4fdb-99e4-69139acb9ed6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.340 2 DEBUG nova.compute.manager [req-f3f09117-66ca-4747-bf66-0f05c7db04ff req-7c9beb43-23cc-4fdb-99e4-69139acb9ed6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.340 2 WARNING nova.compute.manager [req-f3f09117-66ca-4747-bf66-0f05c7db04ff req-7c9beb43-23cc-4fdb-99e4-69139acb9ed6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state active and task_state rescuing.
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.507 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.576 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.577 2 DEBUG nova.virt.images [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] 11bcffb5-c7ee-47b8-9ed7-5a3d59ab02ac was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.579 2 DEBUG nova.privsep.utils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.580 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.part /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.711 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.part /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.converted" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.721 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.786 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.790 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "f24a4f4d3b6fa9761e8135d14a901b2ab183c59d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.818 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "f24a4f4d3b6fa9761e8135d14a901b2ab183c59d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.820 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "f24a4f4d3b6fa9761e8135d14a901b2ab183c59d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.842 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.903 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.905 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d,backing_fmt=raw /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.953 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d,backing_fmt=raw /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.rescue" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.955 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "f24a4f4d3b6fa9761e8135d14a901b2ab183c59d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.956 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'migration_context' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.981 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.985 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Start _get_guest_xml network_info=[{"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:74:15:a0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '11bcffb5-c7ee-47b8-9ed7-5a3d59ab02ac', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:36:29 compute-1 nova_compute[192795]: 2025-09-30 21:36:29.986 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'resources' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.020 2 WARNING nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.027 2 DEBUG nova.virt.libvirt.host [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.028 2 DEBUG nova.virt.libvirt.host [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.031 2 DEBUG nova.virt.libvirt.host [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.031 2 DEBUG nova.virt.libvirt.host [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.033 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.033 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.035 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.035 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.036 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.036 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.036 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.036 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.036 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.037 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.037 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.037 2 DEBUG nova.virt.hardware [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.037 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.059 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.155 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.156 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.157 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.157 2 DEBUG oslo_concurrency.lockutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.159 2 DEBUG nova.virt.libvirt.vif [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-2109753286',display_name='tempest-ServerStableDeviceRescueTest-server-2109753286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-2109753286',id=106,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-6inq7tbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:19Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:74:15:a0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.159 2 DEBUG nova.network.os_vif_util [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:74:15:a0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.160 2 DEBUG nova.network.os_vif_util [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.160 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'pci_devices' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.190 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <uuid>72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca</uuid>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <name>instance-0000006a</name>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-2109753286</nova:name>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:36:30</nova:creationTime>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:user uuid="8b1ebef014c145cbbe1e367bfd2c2ba3">tempest-ServerStableDeviceRescueTest-1939201844-project-member</nova:user>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:project uuid="8978d2df88a5434c8794b659033cca5e">tempest-ServerStableDeviceRescueTest-1939201844</nova:project>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         <nova:port uuid="203ac748-c872-4282-8098-aa626889ec81">
Sep 30 21:36:30 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <system>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <entry name="serial">72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca</entry>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <entry name="uuid">72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca</entry>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </system>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <os>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   </os>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <features>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   </features>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.rescue"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <target dev="vdb" bus="virtio"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <boot order="1"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:74:15:a0"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <target dev="tap203ac748-c8"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/console.log" append="off"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <video>
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </video>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:36:30 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:36:30 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:36:30 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:36:30 compute-1 nova_compute[192795]: </domain>
Sep 30 21:36:30 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.198 2 INFO nova.virt.libvirt.driver [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance destroyed successfully.
Sep 30 21:36:30 compute-1 kernel: tap53d72042-40 (unregistering): left promiscuous mode
Sep 30 21:36:30 compute-1 NetworkManager[51724]: <info>  [1759268190.2121] device (tap53d72042-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:30 compute-1 ovn_controller[94902]: 2025-09-30T21:36:30Z|00417|binding|INFO|Releasing lport 53d72042-40b3-4719-8611-1684c83c15ea from this chassis (sb_readonly=0)
Sep 30 21:36:30 compute-1 ovn_controller[94902]: 2025-09-30T21:36:30Z|00418|binding|INFO|Setting lport 53d72042-40b3-4719-8611-1684c83c15ea down in Southbound
Sep 30 21:36:30 compute-1 ovn_controller[94902]: 2025-09-30T21:36:30Z|00419|binding|INFO|Removing iface tap53d72042-40 ovn-installed in OVS
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.249 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:6f:ef 10.100.0.3'], port_security=['fa:16:3e:fc:6f:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20111f98-bf7f-4696-b726-3e06c68cfed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6a3ecea0-9346-40cc-9dbe-25cd68fe08ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4af2c14-351e-4037-bacd-dca3cfee62e9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=53d72042-40b3-4719-8611-1684c83c15ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.250 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 53d72042-40b3-4719-8611-1684c83c15ea in datapath 20111f98-bf7f-4696-b726-3e06c68cfed2 unbound from our chassis
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.252 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20111f98-bf7f-4696-b726-3e06c68cfed2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.253 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3a83d4a3-3b25-4c5e-957c-b40617668bc1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.254 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 namespace which is not needed anymore
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.263 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.264 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.264 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.264 2 DEBUG nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No VIF found with MAC fa:16:3e:74:15:a0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.264 2 INFO nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Using config drive
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.280 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:30 compute-1 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000069.scope: Deactivated successfully.
Sep 30 21:36:30 compute-1 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000069.scope: Consumed 14.182s CPU time.
Sep 30 21:36:30 compute-1 systemd-machined[152783]: Machine qemu-50-instance-00000069 terminated.
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.324 2 DEBUG nova.objects.instance [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'keypairs' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:30 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[237617]: [NOTICE]   (237623) : haproxy version is 2.8.14-c23fe91
Sep 30 21:36:30 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[237617]: [NOTICE]   (237623) : path to executable is /usr/sbin/haproxy
Sep 30 21:36:30 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[237617]: [WARNING]  (237623) : Exiting Master process...
Sep 30 21:36:30 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[237617]: [ALERT]    (237623) : Current worker (237626) exited with code 143 (Terminated)
Sep 30 21:36:30 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[237617]: [WARNING]  (237623) : All workers exited. Exiting... (0)
Sep 30 21:36:30 compute-1 systemd[1]: libpod-e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a.scope: Deactivated successfully.
Sep 30 21:36:30 compute-1 podman[238042]: 2025-09-30 21:36:30.398975337 +0000 UTC m=+0.050013412 container died e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:30 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a-userdata-shm.mount: Deactivated successfully.
Sep 30 21:36:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-cf0eeeb1cfbff88c5af2c2b7715585f6d2d69cc6f9b867dd6558f4cb69192927-merged.mount: Deactivated successfully.
Sep 30 21:36:30 compute-1 podman[238042]: 2025-09-30 21:36:30.440773786 +0000 UTC m=+0.091811861 container cleanup e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:36:30 compute-1 systemd[1]: libpod-conmon-e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a.scope: Deactivated successfully.
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:30 compute-1 podman[238075]: 2025-09-30 21:36:30.519626963 +0000 UTC m=+0.046564009 container remove e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.526 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8ebe30-6a4f-4b59-a9a9-357d9e7a9ffc]: (4, ('Tue Sep 30 09:36:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 (e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a)\ne5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a\nTue Sep 30 09:36:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 (e5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a)\ne5bb856d1b1a46bcee320484bbef0baceb12740eed631ff6cbd259165d224d9a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.528 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ed977e73-5825-4650-9237-98f6fc36e07f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.528 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20111f98-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:30 compute-1 kernel: tap20111f98-b0: left promiscuous mode
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.552 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[69f0b507-3efb-4f5e-b2e0-0d928e967352]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.591 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[48874831-7681-435e-b865-297a41db776f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.593 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5a1a9dce-3f8d-425a-bd70-6678010b8e1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.610 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bf75cb37-49ee-4829-a5c6-295b126e211b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486555, 'reachable_time': 20256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238114, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.613 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:30.614 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdfbf04-2c08-4c54-b0ca-99193c127017]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:30 compute-1 systemd[1]: run-netns-ovnmeta\x2d20111f98\x2dbf7f\x2d4696\x2db726\x2d3e06c68cfed2.mount: Deactivated successfully.
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.882 2 INFO nova.virt.libvirt.driver [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance shutdown successfully after 3 seconds.
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.887 2 INFO nova.virt.libvirt.driver [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance destroyed successfully.
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.888 2 DEBUG nova.objects.instance [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'numa_topology' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.905 2 DEBUG nova.compute.manager [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.992 2 INFO nova.virt.libvirt.driver [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Creating config drive at /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config.rescue
Sep 30 21:36:30 compute-1 nova_compute[192795]: 2025-09-30 21:36:30.997 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzf74uha1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.040 2 DEBUG oslo_concurrency.lockutils [None req-5cc22c0d-2232-49e7-9895-45758bc17067 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.123 2 DEBUG oslo_concurrency.processutils [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzf74uha1" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:31 compute-1 systemd-udevd[238021]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:31 compute-1 NetworkManager[51724]: <info>  [1759268191.2434] manager: (tap203ac748-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Sep 30 21:36:31 compute-1 kernel: tap203ac748-c8: entered promiscuous mode
Sep 30 21:36:31 compute-1 ovn_controller[94902]: 2025-09-30T21:36:31Z|00420|binding|INFO|Claiming lport 203ac748-c872-4282-8098-aa626889ec81 for this chassis.
Sep 30 21:36:31 compute-1 ovn_controller[94902]: 2025-09-30T21:36:31Z|00421|binding|INFO|203ac748-c872-4282-8098-aa626889ec81: Claiming fa:16:3e:74:15:a0 10.100.0.5
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 NetworkManager[51724]: <info>  [1759268191.2610] device (tap203ac748-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:36:31 compute-1 NetworkManager[51724]: <info>  [1759268191.2618] device (tap203ac748-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.268 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.269 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.272 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:31 compute-1 ovn_controller[94902]: 2025-09-30T21:36:31Z|00422|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 ovn-installed in OVS
Sep 30 21:36:31 compute-1 ovn_controller[94902]: 2025-09-30T21:36:31Z|00423|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 up in Southbound
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.285 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d3abb877-4b28-4f4e-b629-99ec28446561]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.287 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5a6396a-b1 in ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.288 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5a6396a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.289 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6c878ed2-ad69-47a1-b29b-096f3de834b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.290 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[12df2449-a9e6-451e-b04e-bc82a93a46d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.312 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[5281f370-9006-42a8-be92-bb8c6ae7e40e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 systemd-machined[152783]: New machine qemu-52-instance-0000006a.
Sep 30 21:36:31 compute-1 systemd[1]: Started Virtual Machine qemu-52-instance-0000006a.
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.342 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dde99174-e841-4d1c-b3d2-d0e0af37b272]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.377 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0391334e-9e49-42f9-bc96-a38fe3002e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.385 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[843c6100-0523-47f8-8a33-fca93bb1a3a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 NetworkManager[51724]: <info>  [1759268191.3872] manager: (tapf5a6396a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/216)
Sep 30 21:36:31 compute-1 podman[238138]: 2025-09-30 21:36:31.38626426 +0000 UTC m=+0.081484591 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:31 compute-1 podman[238139]: 2025-09-30 21:36:31.422426455 +0000 UTC m=+0.116196046 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:36:31 compute-1 podman[238135]: 2025-09-30 21:36:31.425652324 +0000 UTC m=+0.122334204 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.428 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5391c624-77f0-4967-af8a-68dc67638c9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.432 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4c68a9f6-db8a-4807-bfcb-44fc21b56582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 NetworkManager[51724]: <info>  [1759268191.4614] device (tapf5a6396a-b0): carrier: link connected
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.468 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2fddaacc-eff6-4a12-b970-a4d09f3ef85b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.490 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[65d8a812-1c3c-4c20-9b11-ee417750bd97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488713, 'reachable_time': 43690, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238234, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.514 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[25a122c6-122c-4589-9a11-36397eaaeb0a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:66d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488713, 'tstamp': 488713}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238242, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.538 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed7e38b-376b-4bb8-8806-e77ab312d4c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488713, 'reachable_time': 43690, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238243, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.571 2 DEBUG nova.compute.manager [req-2820882c-8b5b-4cf1-9f43-d8e1ef397f79 req-c71db632-7818-4ea6-9fa5-22fa529e30e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.572 2 DEBUG oslo_concurrency.lockutils [req-2820882c-8b5b-4cf1-9f43-d8e1ef397f79 req-c71db632-7818-4ea6-9fa5-22fa529e30e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.572 2 DEBUG oslo_concurrency.lockutils [req-2820882c-8b5b-4cf1-9f43-d8e1ef397f79 req-c71db632-7818-4ea6-9fa5-22fa529e30e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.572 2 DEBUG oslo_concurrency.lockutils [req-2820882c-8b5b-4cf1-9f43-d8e1ef397f79 req-c71db632-7818-4ea6-9fa5-22fa529e30e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.572 2 DEBUG nova.compute.manager [req-2820882c-8b5b-4cf1-9f43-d8e1ef397f79 req-c71db632-7818-4ea6-9fa5-22fa529e30e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.572 2 WARNING nova.compute.manager [req-2820882c-8b5b-4cf1-9f43-d8e1ef397f79 req-c71db632-7818-4ea6-9fa5-22fa529e30e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state active and task_state rescuing.
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.579 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5949f162-2823-49c4-bd9a-dd5d6269d117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.658 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[00c8a331-4eed-482c-971d-0c5cde496ec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.660 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.661 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.662 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 NetworkManager[51724]: <info>  [1759268191.6661] manager: (tapf5a6396a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Sep 30 21:36:31 compute-1 kernel: tapf5a6396a-b0: entered promiscuous mode
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.672 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 ovn_controller[94902]: 2025-09-30T21:36:31Z|00424|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 nova_compute[192795]: 2025-09-30 21:36:31.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.699 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.700 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f4f15e-7f93-4c3e-8f8a-e418381cbff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.701 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:36:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:31.702 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'env', 'PROCESS_TAG=haproxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5a6396a-b7b7-4ff1-a2af-27477fea2815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.033 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.034 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268192.0326493, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.034 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Resumed (Lifecycle Event)
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.049 2 DEBUG nova.compute.manager [None req-1b09c4e1-1853-4cff-969f-3069e6b74b7a 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.076 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.081 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.121 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] During sync_power_state the instance has a pending task (rescuing). Skip.
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.122 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268192.032932, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.122 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Started (Lifecycle Event)
Sep 30 21:36:32 compute-1 podman[238279]: 2025-09-30 21:36:32.133384351 +0000 UTC m=+0.064534279 container create cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.146 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.150 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:32 compute-1 systemd[1]: Started libpod-conmon-cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290.scope.
Sep 30 21:36:32 compute-1 podman[238279]: 2025-09-30 21:36:32.099637021 +0000 UTC m=+0.030786979 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:36:32 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:36:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe72b746813a1a39d86bfba79c95ddfbb25ee72abcbc33b1ebdc5d81729950/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:36:32 compute-1 podman[238279]: 2025-09-30 21:36:32.242877313 +0000 UTC m=+0.174027261 container init cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:32 compute-1 podman[238279]: 2025-09-30 21:36:32.254966702 +0000 UTC m=+0.186116630 container start cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:36:32 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [NOTICE]   (238299) : New worker (238301) forked
Sep 30 21:36:32 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [NOTICE]   (238299) : Loading success.
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.373 2 DEBUG nova.compute.manager [req-99ada325-e97a-499e-907b-c78d8d92a906 req-2df33533-9365-4d4e-82c8-f3a19736b27f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-unplugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.373 2 DEBUG oslo_concurrency.lockutils [req-99ada325-e97a-499e-907b-c78d8d92a906 req-2df33533-9365-4d4e-82c8-f3a19736b27f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.374 2 DEBUG oslo_concurrency.lockutils [req-99ada325-e97a-499e-907b-c78d8d92a906 req-2df33533-9365-4d4e-82c8-f3a19736b27f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.374 2 DEBUG oslo_concurrency.lockutils [req-99ada325-e97a-499e-907b-c78d8d92a906 req-2df33533-9365-4d4e-82c8-f3a19736b27f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.374 2 DEBUG nova.compute.manager [req-99ada325-e97a-499e-907b-c78d8d92a906 req-2df33533-9365-4d4e-82c8-f3a19736b27f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] No waiting events found dispatching network-vif-unplugged-53d72042-40b3-4719-8611-1684c83c15ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.375 2 WARNING nova.compute.manager [req-99ada325-e97a-499e-907b-c78d8d92a906 req-2df33533-9365-4d4e-82c8-f3a19736b27f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received unexpected event network-vif-unplugged-53d72042-40b3-4719-8611-1684c83c15ea for instance with vm_state stopped and task_state powering-on.
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.464 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'flavor' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.510 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'info_cache' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.542 2 DEBUG oslo_concurrency.lockutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.543 2 DEBUG oslo_concurrency.lockutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquired lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:32 compute-1 nova_compute[192795]: 2025-09-30 21:36:32.543 2 DEBUG nova.network.neutron [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:33 compute-1 nova_compute[192795]: 2025-09-30 21:36:33.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:33 compute-1 nova_compute[192795]: 2025-09-30 21:36:33.693 2 DEBUG nova.compute.manager [req-650c1002-11be-4152-82c8-d2e4d7a5d5b6 req-7229548c-6ee8-4202-8e72-545bc3c15677 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:33 compute-1 nova_compute[192795]: 2025-09-30 21:36:33.693 2 DEBUG oslo_concurrency.lockutils [req-650c1002-11be-4152-82c8-d2e4d7a5d5b6 req-7229548c-6ee8-4202-8e72-545bc3c15677 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:33 compute-1 nova_compute[192795]: 2025-09-30 21:36:33.693 2 DEBUG oslo_concurrency.lockutils [req-650c1002-11be-4152-82c8-d2e4d7a5d5b6 req-7229548c-6ee8-4202-8e72-545bc3c15677 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:33 compute-1 nova_compute[192795]: 2025-09-30 21:36:33.694 2 DEBUG oslo_concurrency.lockutils [req-650c1002-11be-4152-82c8-d2e4d7a5d5b6 req-7229548c-6ee8-4202-8e72-545bc3c15677 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:33 compute-1 nova_compute[192795]: 2025-09-30 21:36:33.694 2 DEBUG nova.compute.manager [req-650c1002-11be-4152-82c8-d2e4d7a5d5b6 req-7229548c-6ee8-4202-8e72-545bc3c15677 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:33 compute-1 nova_compute[192795]: 2025-09-30 21:36:33.694 2 WARNING nova.compute.manager [req-650c1002-11be-4152-82c8-d2e4d7a5d5b6 req-7229548c-6ee8-4202-8e72-545bc3c15677 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state rescued and task_state None.
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.204 2 INFO nova.compute.manager [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Unrescuing
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.205 2 DEBUG oslo_concurrency.lockutils [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.205 2 DEBUG oslo_concurrency.lockutils [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.205 2 DEBUG nova.network.neutron [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.463 2 DEBUG nova.network.neutron [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Updating instance_info_cache with network_info: [{"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.487 2 DEBUG nova.compute.manager [req-98204058-804b-4a96-9922-0f75887ebddd req-eab84d38-0fd0-4f3e-aa49-20347ec6f18e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.488 2 DEBUG oslo_concurrency.lockutils [req-98204058-804b-4a96-9922-0f75887ebddd req-eab84d38-0fd0-4f3e-aa49-20347ec6f18e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.488 2 DEBUG oslo_concurrency.lockutils [req-98204058-804b-4a96-9922-0f75887ebddd req-eab84d38-0fd0-4f3e-aa49-20347ec6f18e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.488 2 DEBUG oslo_concurrency.lockutils [req-98204058-804b-4a96-9922-0f75887ebddd req-eab84d38-0fd0-4f3e-aa49-20347ec6f18e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.488 2 DEBUG nova.compute.manager [req-98204058-804b-4a96-9922-0f75887ebddd req-eab84d38-0fd0-4f3e-aa49-20347ec6f18e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] No waiting events found dispatching network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.489 2 WARNING nova.compute.manager [req-98204058-804b-4a96-9922-0f75887ebddd req-eab84d38-0fd0-4f3e-aa49-20347ec6f18e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received unexpected event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea for instance with vm_state stopped and task_state powering-on.
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.490 2 DEBUG oslo_concurrency.lockutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Releasing lock "refresh_cache-103b7a79-7ea6-47fa-bd6f-bcc87a96369b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.522 2 INFO nova.virt.libvirt.driver [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance destroyed successfully.
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.522 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'numa_topology' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.535 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'resources' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.546 2 DEBUG nova.virt.libvirt.vif [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1141087200',display_name='tempest-ListServerFiltersTestJSON-instance-1141087200',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1141087200',id=105,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-rydrv4y4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-ListServerFiltersTestJSON-1322408077-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:36:30Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=103b7a79-7ea6-47fa-bd6f-bcc87a96369b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.546 2 DEBUG nova.network.os_vif_util [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.547 2 DEBUG nova.network.os_vif_util [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.547 2 DEBUG os_vif [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.550 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53d72042-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.605 2 INFO os_vif [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40')
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.611 2 DEBUG nova.virt.libvirt.driver [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Start _get_guest_xml network_info=[{"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.615 2 WARNING nova.virt.libvirt.driver [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.623 2 DEBUG nova.virt.libvirt.host [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.623 2 DEBUG nova.virt.libvirt.host [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.626 2 DEBUG nova.virt.libvirt.host [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.627 2 DEBUG nova.virt.libvirt.host [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.628 2 DEBUG nova.virt.libvirt.driver [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.628 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.629 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.629 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.629 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.629 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.629 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.630 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.630 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.630 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.630 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.631 2 DEBUG nova.virt.hardware [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.631 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.662 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.721 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.722 2 DEBUG oslo_concurrency.lockutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.722 2 DEBUG oslo_concurrency.lockutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.723 2 DEBUG oslo_concurrency.lockutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.724 2 DEBUG nova.virt.libvirt.vif [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1141087200',display_name='tempest-ListServerFiltersTestJSON-instance-1141087200',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1141087200',id=105,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-rydrv4y4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-ListServerFiltersTestJSON-1322408077-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:36:30Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=103b7a79-7ea6-47fa-bd6f-bcc87a96369b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.724 2 DEBUG nova.network.os_vif_util [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.725 2 DEBUG nova.network.os_vif_util [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.726 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.740 2 DEBUG nova.virt.libvirt.driver [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <uuid>103b7a79-7ea6-47fa-bd6f-bcc87a96369b</uuid>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <name>instance-00000069</name>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1141087200</nova:name>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:36:34</nova:creationTime>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:user uuid="3746d13787f042a1bfad4de0c42015eb">tempest-ListServerFiltersTestJSON-1322408077-project-member</nova:user>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:project uuid="17bd9c2628a94a0b83c4cae3f51b3f7c">tempest-ListServerFiltersTestJSON-1322408077</nova:project>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         <nova:port uuid="53d72042-40b3-4719-8611-1684c83c15ea">
Sep 30 21:36:34 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <system>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <entry name="serial">103b7a79-7ea6-47fa-bd6f-bcc87a96369b</entry>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <entry name="uuid">103b7a79-7ea6-47fa-bd6f-bcc87a96369b</entry>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </system>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <os>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   </os>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <features>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   </features>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.config"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:fc:6f:ef"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <target dev="tap53d72042-40"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/console.log" append="off"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <video>
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </video>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:36:34 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:36:34 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:36:34 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:36:34 compute-1 nova_compute[192795]: </domain>
Sep 30 21:36:34 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.742 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.802 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.804 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.868 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.870 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.887 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.947 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.948 2 DEBUG nova.virt.disk.api [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Checking if we can resize image /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:36:34 compute-1 nova_compute[192795]: 2025-09-30 21:36:34.949 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.014 2 DEBUG oslo_concurrency.processutils [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.015 2 DEBUG nova.virt.disk.api [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Cannot resize image /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.015 2 DEBUG nova.objects.instance [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'migration_context' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.035 2 DEBUG nova.virt.libvirt.vif [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1141087200',display_name='tempest-ListServerFiltersTestJSON-instance-1141087200',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1141087200',id=105,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-rydrv4y4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-ListServerFiltersTestJSON-1322408077-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:30Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=103b7a79-7ea6-47fa-bd6f-bcc87a96369b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.035 2 DEBUG nova.network.os_vif_util [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.036 2 DEBUG nova.network.os_vif_util [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.037 2 DEBUG os_vif [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.038 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.039 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.043 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53d72042-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.043 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap53d72042-40, col_values=(('external_ids', {'iface-id': '53d72042-40b3-4719-8611-1684c83c15ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:6f:ef', 'vm-uuid': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:35 compute-1 NetworkManager[51724]: <info>  [1759268195.0460] manager: (tap53d72042-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.054 2 INFO os_vif [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40')
Sep 30 21:36:35 compute-1 kernel: tap53d72042-40: entered promiscuous mode
Sep 30 21:36:35 compute-1 ovn_controller[94902]: 2025-09-30T21:36:35Z|00425|binding|INFO|Claiming lport 53d72042-40b3-4719-8611-1684c83c15ea for this chassis.
Sep 30 21:36:35 compute-1 ovn_controller[94902]: 2025-09-30T21:36:35Z|00426|binding|INFO|53d72042-40b3-4719-8611-1684c83c15ea: Claiming fa:16:3e:fc:6f:ef 10.100.0.3
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:35 compute-1 NetworkManager[51724]: <info>  [1759268195.1612] manager: (tap53d72042-40): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Sep 30 21:36:35 compute-1 ovn_controller[94902]: 2025-09-30T21:36:35Z|00427|binding|INFO|Setting lport 53d72042-40b3-4719-8611-1684c83c15ea ovn-installed in OVS
Sep 30 21:36:35 compute-1 ovn_controller[94902]: 2025-09-30T21:36:35Z|00428|binding|INFO|Setting lport 53d72042-40b3-4719-8611-1684c83c15ea up in Southbound
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.171 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:6f:ef 10.100.0.3'], port_security=['fa:16:3e:fc:6f:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20111f98-bf7f-4696-b726-3e06c68cfed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6a3ecea0-9346-40cc-9dbe-25cd68fe08ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4af2c14-351e-4037-bacd-dca3cfee62e9, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=53d72042-40b3-4719-8611-1684c83c15ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.174 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 53d72042-40b3-4719-8611-1684c83c15ea in datapath 20111f98-bf7f-4696-b726-3e06c68cfed2 bound to our chassis
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.176 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.190 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[64c0d415-5f30-4570-bc76-17b482f7e3b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.193 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20111f98-b1 in ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.195 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20111f98-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.195 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ac227692-f013-4ed1-bd1f-2d2ba2455422]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.197 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb44a43-4add-414e-98bf-8946147dccab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 systemd-udevd[238344]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.209 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[18bb0e65-d571-49e8-9d1f-bec31532756b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 systemd-machined[152783]: New machine qemu-53-instance-00000069.
Sep 30 21:36:35 compute-1 NetworkManager[51724]: <info>  [1759268195.2238] device (tap53d72042-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:36:35 compute-1 NetworkManager[51724]: <info>  [1759268195.2245] device (tap53d72042-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:36:35 compute-1 systemd[1]: Started Virtual Machine qemu-53-instance-00000069.
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.236 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6d977b-46d7-47d8-8f26-a700ea7e95de]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.271 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7f308bd5-344a-4499-af02-f3dd3a01a0e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.279 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6794739e-f1a8-470e-af6e-3571640d553d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 NetworkManager[51724]: <info>  [1759268195.2818] manager: (tap20111f98-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/220)
Sep 30 21:36:35 compute-1 systemd-udevd[238347]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.322 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a28436d6-94dd-40c3-a596-390a38a86c4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.325 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ea934a-c759-41b4-b634-f659f5a4bde3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 NetworkManager[51724]: <info>  [1759268195.3598] device (tap20111f98-b0): carrier: link connected
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.368 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[364f677d-8455-485e-835f-22b84883bf33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.390 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2fbf38-7e86-4147-927c-cf09f18d1742]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20111f98-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:ef:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489103, 'reachable_time': 23562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238375, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.411 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9c97563f-a420-48a6-82fb-0744f9d7174f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:efd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489103, 'tstamp': 489103}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238376, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.431 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[13f48502-a1e1-478e-888b-e27463998544]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20111f98-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:ef:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489103, 'reachable_time': 23562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238377, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.488 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f50a4a3b-dfd2-4469-bca4-32e081623107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.584 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa3dbe0-4376-4c07-9d6c-5608503501b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.586 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20111f98-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.587 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.587 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20111f98-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:35 compute-1 NetworkManager[51724]: <info>  [1759268195.6144] manager: (tap20111f98-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Sep 30 21:36:35 compute-1 kernel: tap20111f98-b0: entered promiscuous mode
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.619 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20111f98-b0, col_values=(('external_ids', {'iface-id': '36b3f8fd-6b0e-45c8-9c31-56cd0812c366'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:35 compute-1 ovn_controller[94902]: 2025-09-30T21:36:35Z|00429|binding|INFO|Releasing lport 36b3f8fd-6b0e-45c8-9c31-56cd0812c366 from this chassis (sb_readonly=0)
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.623 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.624 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[858683af-5ab0-4eba-a0d2-4b1f3c2b5ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.625 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:36:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:35.626 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'env', 'PROCESS_TAG=haproxy-20111f98-bf7f-4696-b726-3e06c68cfed2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20111f98-bf7f-4696-b726-3e06c68cfed2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:36:35 compute-1 nova_compute[192795]: 2025-09-30 21:36:35.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:36 compute-1 podman[238417]: 2025-09-30 21:36:36.077681757 +0000 UTC m=+0.082593511 container create 92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:36:36 compute-1 podman[238417]: 2025-09-30 21:36:36.020477509 +0000 UTC m=+0.025389283 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:36:36 compute-1 systemd[1]: Started libpod-conmon-92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104.scope.
Sep 30 21:36:36 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:36:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/290a0218553f45a7718ce8043ea7f1610cbe6d2f25f976a4013cf029518e47ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:36:36 compute-1 podman[238417]: 2025-09-30 21:36:36.196066632 +0000 UTC m=+0.200978386 container init 92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:36:36 compute-1 podman[238417]: 2025-09-30 21:36:36.206925437 +0000 UTC m=+0.211837191 container start 92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [NOTICE]   (238436) : New worker (238438) forked
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [NOTICE]   (238436) : Loading success.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.288 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 103b7a79-7ea6-47fa-bd6f-bcc87a96369b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.289 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268196.2878606, 103b7a79-7ea6-47fa-bd6f-bcc87a96369b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.289 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] VM Resumed (Lifecycle Event)
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.291 2 DEBUG nova.compute.manager [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.296 2 INFO nova.virt.libvirt.driver [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance rebooted successfully.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.297 2 DEBUG nova.compute.manager [None req-85d0f3aa-398b-4824-985b-31852427f6b7 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.436 2 DEBUG nova.network.neutron [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Updating instance_info_cache with network_info: [{"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.464 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.469 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.493 2 DEBUG oslo_concurrency.lockutils [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.494 2 DEBUG nova.objects.instance [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'flavor' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.496 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] During sync_power_state the instance has a pending task (powering-on). Skip.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.497 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268196.2894635, 103b7a79-7ea6-47fa-bd6f-bcc87a96369b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.497 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] VM Started (Lifecycle Event)
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.560 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.572 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:36 compute-1 kernel: tap203ac748-c8 (unregistering): left promiscuous mode
Sep 30 21:36:36 compute-1 NetworkManager[51724]: <info>  [1759268196.6126] device (tap203ac748-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:36 compute-1 ovn_controller[94902]: 2025-09-30T21:36:36Z|00430|binding|INFO|Releasing lport 203ac748-c872-4282-8098-aa626889ec81 from this chassis (sb_readonly=0)
Sep 30 21:36:36 compute-1 ovn_controller[94902]: 2025-09-30T21:36:36Z|00431|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 down in Southbound
Sep 30 21:36:36 compute-1 ovn_controller[94902]: 2025-09-30T21:36:36Z|00432|binding|INFO|Removing iface tap203ac748-c8 ovn-installed in OVS
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.629 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.631 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.634 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.635 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab22665-193a-4cf6-ad1c-1c5be2f19389]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.635 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace which is not needed anymore
Sep 30 21:36:36 compute-1 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Sep 30 21:36:36 compute-1 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006a.scope: Consumed 5.215s CPU time.
Sep 30 21:36:36 compute-1 systemd-machined[152783]: Machine qemu-52-instance-0000006a terminated.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.684 2 DEBUG nova.compute.manager [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.685 2 DEBUG oslo_concurrency.lockutils [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.686 2 DEBUG oslo_concurrency.lockutils [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.686 2 DEBUG oslo_concurrency.lockutils [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.687 2 DEBUG nova.compute.manager [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] No waiting events found dispatching network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.687 2 WARNING nova.compute.manager [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received unexpected event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea for instance with vm_state active and task_state None.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.687 2 DEBUG nova.compute.manager [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.688 2 DEBUG oslo_concurrency.lockutils [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.688 2 DEBUG oslo_concurrency.lockutils [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.688 2 DEBUG oslo_concurrency.lockutils [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.689 2 DEBUG nova.compute.manager [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] No waiting events found dispatching network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.689 2 WARNING nova.compute.manager [req-5c910cbe-4440-4926-bbc0-e6753428b5a6 req-e9eca009-b7fc-40ca-b69b-4d6c3f506491 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received unexpected event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea for instance with vm_state active and task_state None.
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [NOTICE]   (238299) : haproxy version is 2.8.14-c23fe91
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [NOTICE]   (238299) : path to executable is /usr/sbin/haproxy
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [WARNING]  (238299) : Exiting Master process...
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [WARNING]  (238299) : Exiting Master process...
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [ALERT]    (238299) : Current worker (238301) exited with code 143 (Terminated)
Sep 30 21:36:36 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238295]: [WARNING]  (238299) : All workers exited. Exiting... (0)
Sep 30 21:36:36 compute-1 systemd[1]: libpod-cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290.scope: Deactivated successfully.
Sep 30 21:36:36 compute-1 podman[238467]: 2025-09-30 21:36:36.778677541 +0000 UTC m=+0.044448072 container died cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:36:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290-userdata-shm.mount: Deactivated successfully.
Sep 30 21:36:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-6ffe72b746813a1a39d86bfba79c95ddfbb25ee72abcbc33b1ebdc5d81729950-merged.mount: Deactivated successfully.
Sep 30 21:36:36 compute-1 podman[238467]: 2025-09-30 21:36:36.840873145 +0000 UTC m=+0.106643666 container cleanup cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:36 compute-1 systemd[1]: libpod-conmon-cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290.scope: Deactivated successfully.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.862 2 INFO nova.virt.libvirt.driver [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance destroyed successfully.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.863 2 DEBUG nova.objects.instance [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'numa_topology' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:36 compute-1 podman[238518]: 2025-09-30 21:36:36.923511636 +0000 UTC m=+0.052444069 container remove cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.929 2 DEBUG nova.compute.manager [req-bfe06dcf-f634-4e81-bac7-020aafd605eb req-92c81b7f-8a2d-4f0f-8339-a8363e18f03a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.930 2 DEBUG oslo_concurrency.lockutils [req-bfe06dcf-f634-4e81-bac7-020aafd605eb req-92c81b7f-8a2d-4f0f-8339-a8363e18f03a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.930 2 DEBUG oslo_concurrency.lockutils [req-bfe06dcf-f634-4e81-bac7-020aafd605eb req-92c81b7f-8a2d-4f0f-8339-a8363e18f03a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.931 2 DEBUG oslo_concurrency.lockutils [req-bfe06dcf-f634-4e81-bac7-020aafd605eb req-92c81b7f-8a2d-4f0f-8339-a8363e18f03a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.931 2 DEBUG nova.compute.manager [req-bfe06dcf-f634-4e81-bac7-020aafd605eb req-92c81b7f-8a2d-4f0f-8339-a8363e18f03a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.931 2 WARNING nova.compute.manager [req-bfe06dcf-f634-4e81-bac7-020aafd605eb req-92c81b7f-8a2d-4f0f-8339-a8363e18f03a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.932 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[63137055-7d0e-44d3-a15c-9723941edad9]: (4, ('Tue Sep 30 09:36:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290)\ncde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290\nTue Sep 30 09:36:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (cde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290)\ncde9953a3eb3bad04e56f5a80da0d0ec0dda6c97aee13729b2c7ccb7179a6290\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.935 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8c53eb3a-1e02-43f1-8e2a-fe60b0738acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.936 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:36 compute-1 kernel: tapf5a6396a-b0: left promiscuous mode
Sep 30 21:36:36 compute-1 NetworkManager[51724]: <info>  [1759268196.9677] manager: (tap203ac748-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Sep 30 21:36:36 compute-1 systemd-udevd[238366]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:36 compute-1 kernel: tap203ac748-c8: entered promiscuous mode
Sep 30 21:36:36 compute-1 nova_compute[192795]: 2025-09-30 21:36:36.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.975 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[17219c04-1b7e-4e5d-8164-4f700be16f73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:36 compute-1 ovn_controller[94902]: 2025-09-30T21:36:36Z|00433|binding|INFO|Claiming lport 203ac748-c872-4282-8098-aa626889ec81 for this chassis.
Sep 30 21:36:36 compute-1 ovn_controller[94902]: 2025-09-30T21:36:36Z|00434|binding|INFO|203ac748-c872-4282-8098-aa626889ec81: Claiming fa:16:3e:74:15:a0 10.100.0.5
Sep 30 21:36:36 compute-1 NetworkManager[51724]: <info>  [1759268196.9879] device (tap203ac748-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:36:36 compute-1 NetworkManager[51724]: <info>  [1759268196.9919] device (tap203ac748-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:36:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:36.991 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '7', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:37 compute-1 ovn_controller[94902]: 2025-09-30T21:36:37Z|00435|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 ovn-installed in OVS
Sep 30 21:36:37 compute-1 ovn_controller[94902]: 2025-09-30T21:36:37Z|00436|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 up in Southbound
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.014 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cf79bd97-b7f4-4227-9591-9b37a87b90e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.016 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc6d377-05f6-4fdb-a7bb-7aef17f544f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 systemd-machined[152783]: New machine qemu-54-instance-0000006a.
Sep 30 21:36:37 compute-1 systemd[1]: Started Virtual Machine qemu-54-instance-0000006a.
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.043 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb9e50c-4119-4ec2-ade1-97e6f66c4274]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488704, 'reachable_time': 19296, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238550, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 systemd[1]: run-netns-ovnmeta\x2df5a6396a\x2db7b7\x2d4ff1\x2da2af\x2d27477fea2815.mount: Deactivated successfully.
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.048 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.049 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[795248e9-72e2-476a-8264-1950e44a6a29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.050 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.053 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.069 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[938ed4ba-2e8a-4162-b442-a8b7f301968e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.070 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5a6396a-b1 in ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.073 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5a6396a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.073 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7dc17c-0cbf-4e8a-9cc0-e7af57a631b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.075 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0c924d98-44ef-45e6-b31d-518b5a0c19ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.096 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb334c8-ca54-4edc-9f14-47ffbe4cbb53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.117 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9db12d5b-2e2c-4d8c-b14a-25b5426d8b7a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.161 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0880202d-a364-4758-9c12-a52614a9c2f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 NetworkManager[51724]: <info>  [1759268197.1707] manager: (tapf5a6396a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/223)
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.169 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3fecfb44-b8e7-4feb-8678-487659402977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.216 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[8c48cc37-9a9f-4fcd-bbc3-898c2b565f2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.228 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c2435595-c85d-4e07-8d9f-a2ca226172e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 NetworkManager[51724]: <info>  [1759268197.2646] device (tapf5a6396a-b0): carrier: link connected
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.276 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3d42bd-2cad-4dd9-9b3e-be051fcdb00d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.310 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[264aaa16-0079-4e0c-a83e-dec2625b6a99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489293, 'reachable_time': 28484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238568, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.337 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[aa49a7b9-f20d-461d-ac15-ee215dc97398]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:66d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489293, 'tstamp': 489293}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238569, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.361 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ced8f931-f8f2-45c6-bf3c-0c6d9a5b1786]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489293, 'reachable_time': 28484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238572, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.405 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b93a7c24-3b75-4a7e-99cd-20ca12a3ee53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.483 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[762a9f17-90b0-4788-bcb2-eafc5177ae4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.485 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.485 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.486 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-1 NetworkManager[51724]: <info>  [1759268197.4892] manager: (tapf5a6396a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Sep 30 21:36:37 compute-1 kernel: tapf5a6396a-b0: entered promiscuous mode
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.493 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.496 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:36:37 compute-1 ovn_controller[94902]: 2025-09-30T21:36:37Z|00437|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.497 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5006f400-b770-4bba-8292-777295da1ce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.498 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:36:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:37.499 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'env', 'PROCESS_TAG=haproxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5a6396a-b7b7-4ff1-a2af-27477fea2815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-1 podman[238609]: 2025-09-30 21:36:37.898834163 +0000 UTC m=+0.069920606 container create 9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.924 2 DEBUG nova.compute.manager [None req-ab1306ed-e778-4ac7-bdfe-af51e56c3c8b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.928 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.929 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268197.9227214, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.929 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Resumed (Lifecycle Event)
Sep 30 21:36:37 compute-1 systemd[1]: Started libpod-conmon-9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d.scope.
Sep 30 21:36:37 compute-1 podman[238609]: 2025-09-30 21:36:37.863629794 +0000 UTC m=+0.034716257 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.963 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.971 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:37 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:36:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01ba7ae15d968f00edbb28a9a01904fae81e0a05e4360f09eec3e7aca0aa0268/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.993 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] During sync_power_state the instance has a pending task (unrescuing). Skip.
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.994 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268197.923439, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:37 compute-1 nova_compute[192795]: 2025-09-30 21:36:37.994 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Started (Lifecycle Event)
Sep 30 21:36:38 compute-1 podman[238609]: 2025-09-30 21:36:38.003522784 +0000 UTC m=+0.174609217 container init 9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:36:38 compute-1 podman[238609]: 2025-09-30 21:36:38.009511917 +0000 UTC m=+0.180598330 container start 9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:38 compute-1 nova_compute[192795]: 2025-09-30 21:36:38.015 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:38 compute-1 nova_compute[192795]: 2025-09-30 21:36:38.019 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:38 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [NOTICE]   (238628) : New worker (238630) forked
Sep 30 21:36:38 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [NOTICE]   (238628) : Loading success.
Sep 30 21:36:38 compute-1 nova_compute[192795]: 2025-09-30 21:36:38.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:38.695 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:38.696 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:38.697 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.031 2 DEBUG nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.033 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.034 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.035 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.035 2 DEBUG nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.036 2 WARNING nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state active and task_state None.
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.037 2 DEBUG nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.037 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.038 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.039 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.039 2 DEBUG nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.040 2 WARNING nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state active and task_state None.
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.040 2 DEBUG nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.041 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.042 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.042 2 DEBUG oslo_concurrency.lockutils [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.043 2 DEBUG nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:39 compute-1 nova_compute[192795]: 2025-09-30 21:36:39.044 2 WARNING nova.compute.manager [req-181e7036-a925-4719-a5e0-a53758ef4d01 req-bf544d17-0202-45ad-a3e6-ec39abd0e131 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state active and task_state None.
Sep 30 21:36:40 compute-1 nova_compute[192795]: 2025-09-30 21:36:40.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:42 compute-1 podman[238641]: 2025-09-30 21:36:42.233331408 +0000 UTC m=+0.071580042 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.044 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.046 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.046 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.046 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.047 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.059 2 INFO nova.compute.manager [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Terminating instance
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.070 2 DEBUG nova.compute.manager [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:36:43 compute-1 kernel: tap203ac748-c8 (unregistering): left promiscuous mode
Sep 30 21:36:43 compute-1 NetworkManager[51724]: <info>  [1759268203.0984] device (tap203ac748-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00438|binding|INFO|Releasing lport 203ac748-c872-4282-8098-aa626889ec81 from this chassis (sb_readonly=0)
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00439|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 down in Southbound
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00440|binding|INFO|Removing iface tap203ac748-c8 ovn-installed in OVS
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.128 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.129 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.131 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.132 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f949b9e1-ab3f-48f9-abc9-2e4d19c8f69c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.134 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace which is not needed anymore
Sep 30 21:36:43 compute-1 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Sep 30 21:36:43 compute-1 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006a.scope: Consumed 5.975s CPU time.
Sep 30 21:36:43 compute-1 systemd-machined[152783]: Machine qemu-54-instance-0000006a terminated.
Sep 30 21:36:43 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [NOTICE]   (238628) : haproxy version is 2.8.14-c23fe91
Sep 30 21:36:43 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [NOTICE]   (238628) : path to executable is /usr/sbin/haproxy
Sep 30 21:36:43 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [WARNING]  (238628) : Exiting Master process...
Sep 30 21:36:43 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [WARNING]  (238628) : Exiting Master process...
Sep 30 21:36:43 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [ALERT]    (238628) : Current worker (238630) exited with code 143 (Terminated)
Sep 30 21:36:43 compute-1 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[238624]: [WARNING]  (238628) : All workers exited. Exiting... (0)
Sep 30 21:36:43 compute-1 systemd[1]: libpod-9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d.scope: Deactivated successfully.
Sep 30 21:36:43 compute-1 podman[238684]: 2025-09-30 21:36:43.275834743 +0000 UTC m=+0.051298319 container died 9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:36:43 compute-1 kernel: tap203ac748-c8: entered promiscuous mode
Sep 30 21:36:43 compute-1 NetworkManager[51724]: <info>  [1759268203.2969] manager: (tap203ac748-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/225)
Sep 30 21:36:43 compute-1 systemd-udevd[238664]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:43 compute-1 kernel: tap203ac748-c8 (unregistering): left promiscuous mode
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00441|binding|INFO|Claiming lport 203ac748-c872-4282-8098-aa626889ec81 for this chassis.
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00442|binding|INFO|203ac748-c872-4282-8098-aa626889ec81: Claiming fa:16:3e:74:15:a0 10.100.0.5
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.312 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00443|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 ovn-installed in OVS
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00444|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 up in Southbound
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00445|binding|INFO|Releasing lport 203ac748-c872-4282-8098-aa626889ec81 from this chassis (sb_readonly=1)
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00446|if_status|INFO|Dropped 2 log messages in last 1212 seconds (most recently, 1212 seconds ago) due to excessive rate
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00447|if_status|INFO|Not setting lport 203ac748-c872-4282-8098-aa626889ec81 down as sb is readonly
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00448|binding|INFO|Removing iface tap203ac748-c8 ovn-installed in OVS
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00449|binding|INFO|Releasing lport 203ac748-c872-4282-8098-aa626889ec81 from this chassis (sb_readonly=0)
Sep 30 21:36:43 compute-1 ovn_controller[94902]: 2025-09-30T21:36:43Z|00450|binding|INFO|Setting lport 203ac748-c872-4282-8098-aa626889ec81 down in Southbound
Sep 30 21:36:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-01ba7ae15d968f00edbb28a9a01904fae81e0a05e4360f09eec3e7aca0aa0268-merged.mount: Deactivated successfully.
Sep 30 21:36:43 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d-userdata-shm.mount: Deactivated successfully.
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 podman[238684]: 2025-09-30 21:36:43.354870486 +0000 UTC m=+0.130334052 container cleanup 9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.355 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:15:a0 10.100.0.5'], port_security=['fa:16:3e:74:15:a0 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=203ac748-c872-4282-8098-aa626889ec81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:43 compute-1 systemd[1]: libpod-conmon-9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d.scope: Deactivated successfully.
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.374 2 INFO nova.virt.libvirt.driver [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Instance destroyed successfully.
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.375 2 DEBUG nova.objects.instance [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'resources' on Instance uuid 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.389 2 DEBUG nova.virt.libvirt.vif [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-2109753286',display_name='tempest-ServerStableDeviceRescueTest-server-2109753286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-2109753286',id=106,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-6inq7tbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:36:37Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.390 2 DEBUG nova.network.os_vif_util [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "203ac748-c872-4282-8098-aa626889ec81", "address": "fa:16:3e:74:15:a0", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap203ac748-c8", "ovs_interfaceid": "203ac748-c872-4282-8098-aa626889ec81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.391 2 DEBUG nova.network.os_vif_util [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.391 2 DEBUG os_vif [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.393 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap203ac748-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.398 2 INFO os_vif [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:15:a0,bridge_name='br-int',has_traffic_filtering=True,id=203ac748-c872-4282-8098-aa626889ec81,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203ac748-c8')
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.398 2 INFO nova.virt.libvirt.driver [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Deleting instance files /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca_del
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.399 2 INFO nova.virt.libvirt.driver [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Deletion of /var/lib/nova/instances/72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca_del complete
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.445 2 DEBUG nova.compute.manager [req-9645f132-555a-462f-8f2d-e0308e4213c1 req-5bd1e0c6-79ce-4d8c-927c-9cc4e9111efc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.446 2 DEBUG oslo_concurrency.lockutils [req-9645f132-555a-462f-8f2d-e0308e4213c1 req-5bd1e0c6-79ce-4d8c-927c-9cc4e9111efc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.446 2 DEBUG oslo_concurrency.lockutils [req-9645f132-555a-462f-8f2d-e0308e4213c1 req-5bd1e0c6-79ce-4d8c-927c-9cc4e9111efc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.447 2 DEBUG oslo_concurrency.lockutils [req-9645f132-555a-462f-8f2d-e0308e4213c1 req-5bd1e0c6-79ce-4d8c-927c-9cc4e9111efc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.447 2 DEBUG nova.compute.manager [req-9645f132-555a-462f-8f2d-e0308e4213c1 req-5bd1e0c6-79ce-4d8c-927c-9cc4e9111efc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.447 2 DEBUG nova.compute.manager [req-9645f132-555a-462f-8f2d-e0308e4213c1 req-5bd1e0c6-79ce-4d8c-927c-9cc4e9111efc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-unplugged-203ac748-c872-4282-8098-aa626889ec81 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:36:43 compute-1 podman[238719]: 2025-09-30 21:36:43.459563347 +0000 UTC m=+0.073328758 container remove 9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.466 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[50501306-bc9c-4da4-8ee4-758679af9112]: (4, ('Tue Sep 30 09:36:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d)\n9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d\nTue Sep 30 09:36:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d)\n9529470f04ee73b70611fb2a13d94375c07f8c408385dc4fe7e63fb74d004c3d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.467 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0cee3f82-ea9d-4201-8cbc-cb9b58c1e68e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.469 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 kernel: tapf5a6396a-b0: left promiscuous mode
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.476 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e42f62ac-f93e-430d-ab9f-24c44fe624e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.493 2 INFO nova.compute.manager [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.494 2 DEBUG oslo.service.loopingcall [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.494 2 DEBUG nova.compute.manager [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:36:43 compute-1 nova_compute[192795]: 2025-09-30 21:36:43.494 2 DEBUG nova.network.neutron [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.513 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[91504f19-6791-459d-a1d3-9b7770787de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.514 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e05b99b0-c3fe-4e17-a33e-66c58437d690]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.543 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[edfba179-d0a6-424a-94e2-d16ad9fb7077]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489282, 'reachable_time': 44644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238735, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.548 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:43 compute-1 systemd[1]: run-netns-ovnmeta\x2df5a6396a\x2db7b7\x2d4ff1\x2da2af\x2d27477fea2815.mount: Deactivated successfully.
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.549 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea91bf1-558d-4ee1-bbcd-933014bf46c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.550 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.552 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.559 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8eab7685-1918-404c-b33d-67fb3f112e75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.560 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 203ac748-c872-4282-8098-aa626889ec81 in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.562 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:43.562 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[071436ab-5de2-4ace-bc7e-7ccae2cc8fcf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.023 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000069', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'hostId': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.025 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.025 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>]
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.025 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.044 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.045 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40bd84e1-8b32-400a-bd6c-1814d1be872d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.025900', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2c2102-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': 'd60b79704744412377c4735d5c0f72783e54561798bbbf86d4bc89b264afd36f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 
'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.025900', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2c30f2-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': '6363f7f2a1900aef699d0f7400af29966bf29aaaa5520e4beacd61e61fbb6d3f'}]}, 'timestamp': '2025-09-30 21:36:44.045660', '_unique_id': '4b984a09820f40aca1b9fa272998287a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.050 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 103b7a79-7ea6-47fa-bd6f-bcc87a96369b / tap53d72042-40 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.050 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9eee52a-89fa-4914-842d-f6b7a22533df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.048222', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f2d0c2a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': 'f1c31f89a644c5c99a34841a294ff6b0543f6720fc341ba03e54f500b02e1326'}]}, 'timestamp': '2025-09-30 21:36:44.051271', '_unique_id': '822a1c9128134ed9b414c4ccd9eeca09'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.052 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40a0b784-2f35-4762-ba4f-cf1a841ab7c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.052866', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2d552c-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': 'ea8d6a90bd6d7bd5414cb5e8ae39c3767deb42fd8978446d29e21de0be0a4cd2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 
'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.052866', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2d6080-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': 'b5c2df98eb4f799cd6d045d4f9486f8e049a1dd8c1609c6fb84436160393f04b'}]}, 'timestamp': '2025-09-30 21:36:44.053413', '_unique_id': '4f0d9f9914bc40668258234d47a8e85d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.054 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fc44cfa-6ca0-425d-8d12-99ab471bc61b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.054893', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2da45a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': '8e5943f14c51ddc0bcea0d5ddfd7e1a22b7271558dfc36fa9253ef91bd5036a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 
'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.054893', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2dae82-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': '35fb1d4fd942b9fe57445a5a3aa02576f8a05d14d5d61176d5f70a403fa4f863'}]}, 'timestamp': '2025-09-30 21:36:44.055407', '_unique_id': '94acbeacfc134930bbf327a2e2548b36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.055 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.056 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac737f02-4652-4c4e-a7da-bda1ed071b50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.056907', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f2df360-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': '7eabbba2e7cdd7a18d557426125cc050870d53c1cdccfd567840a256e2d25d0b'}]}, 'timestamp': '2025-09-30 21:36:44.057203', '_unique_id': '5b697f61b37046bc96b19bc76ee2d0e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.069 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.069 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d9da8f5-4ac3-449e-915a-5e50a571ee67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.059258', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2fd64e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.792044361, 'message_signature': 'c7d6a8288fdc1a41ab4e45001fb8383960cac01a63067bcdb3e8c0598166e83c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.059258', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2fe256-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.792044361, 'message_signature': '3649eeb61a0c46298cf75e787c5067ca2895f37ee37ee8b9fd63ba65c36ed986'}]}, 'timestamp': '2025-09-30 21:36:44.069848', '_unique_id': '7ebdc97bab32428aa4e62500156588b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.070 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.071 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.071 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2ac71ec-10d1-42a7-bef5-248be7508360', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.071694', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3034b8-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': 'b5e3e5870939a92d66126db683579e04a300e63a54003073f56477f305ff9426'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.071694', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f303e0e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': '73c2dddb1db1dc0999cba8917bb4d3357a6d22661390e48db20636d6e7619de4'}]}, 'timestamp': '2025-09-30 21:36:44.072187', '_unique_id': '3bd101ad6a6c41e2acc4939d08d1714e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.072 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.073 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89107893-df40-415b-a7d4-fc94a1df44e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.073711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3084fe-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.792044361, 'message_signature': 'a371c45dc203256dbc448841850d8d092ef12aa9f8066105806443eb302d9851'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 
'103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.073711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f308f58-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.792044361, 'message_signature': 'b66a539383a4253221188e8e2c32d3cc52944e5fec41a6d982d8a132f83fb790'}]}, 'timestamp': '2025-09-30 21:36:44.074271', '_unique_id': 'b182947dd41949cc8405ace5aacb07da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.074 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.075 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.read.latency volume: 506868453 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.read.latency volume: 692799 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95b7c3a1-35cb-40b5-8886-f34c9b6277be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 506868453, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.075859', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f30d788-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': 'e75e7e1cd3463e78e045f1d26fff63d2600cda5ad6cd0573612791f8312edecd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 692799, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': 
None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.075859', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f30e0fc-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': 'ccbce84d67cf00cb26d6a29dd830d41832b09cd876555e6f0e6d4b088afc0ecb'}]}, 'timestamp': '2025-09-30 21:36:44.076379', '_unique_id': '7c8b2f8b84654fa094ecac6c67e3e8f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.076 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ce5b97f-e525-4816-809e-769c707ba2d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.078114', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f3130d4-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': 'a32b3e411e046d412420fddace4d013434dbbf77c394dcfcdab75b43ea781c93'}]}, 'timestamp': '2025-09-30 21:36:44.078423', '_unique_id': 'd57e20cbff824b16ab97f8168da4982e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.078 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.079 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.094 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.094 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 103b7a79-7ea6-47fa-bd6f-bcc87a96369b: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.095 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.095 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/cpu volume: 7460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e2ad543-9b68-4496-8a2b-36d435d6536e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7460000000, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'timestamp': '2025-09-30T21:36:44.095206', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8f33cd76-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.827410824, 'message_signature': '6d611d364f82bb037e2bf1f665564607ae7d6bdee25cf5b10a6f7421ca4516cd'}]}, 'timestamp': '2025-09-30 21:36:44.095563', '_unique_id': 'ae10e6b899ac42b692b11dd03bd27b10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.096 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.097 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.097 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>]
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.097 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.097 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>]
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.098 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86fdb562-a430-40ff-85e8-8f3e9e0f3937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.098207', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f34415c-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': '38e426b7a5dc5039777e1593c40a4a68d117b6f90fb35d329762cd927273e60c'}]}, 'timestamp': '2025-09-30 21:36:44.098536', '_unique_id': '62bff23559c747d7ae6d8041895ca4ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.099 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.100 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.100 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c00b9334-7407-48e8-a982-c9011a6bfd35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.100267', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3491b6-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': '11ea13a1411723c2bcbd8dc964ebc1d3b6c66af911760c5a83aedf1ba5d36ada'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.100267', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f349bca-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.758649461, 'message_signature': '57ac696924de734aa9558ccc6ea1b2559d7cc21ec02467c523f136e04f62dd20'}]}, 'timestamp': '2025-09-30 21:36:44.100806', '_unique_id': '5005862681714dcaadb0d2bc040ff4d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.102 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57f37739-063f-497f-8fee-aec4fe95707b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.102354', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f34e288-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': '7e5d780d3c08e82dd1b0fbafdfbeb4c9af6a47615f79f453c91336a7986f6cda'}]}, 'timestamp': '2025-09-30 21:36:44.102627', '_unique_id': '3e5601f6c1bd49bfb2abbd8cf9ab24ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.103 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.104 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff133397-6cd8-452e-8852-6eb4d7bae27d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.104238', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f352f18-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': 'f456844722fba6d3521e1cd70afd5d5b8cddd323cbe0acf90ef318a1769cfdcd'}]}, 'timestamp': '2025-09-30 21:36:44.104647', '_unique_id': '7ee6a1e0ccab4be2ad14f5bcc4c7f450'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.106 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75e5e5ce-c991-4e8f-8025-000dd4e84d3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.106577', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f35894a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': '970dcb1038899a5114d49c5dc3f51e540ce3cf24f2ebf3fe8437638dc701d99e'}]}, 'timestamp': '2025-09-30 21:36:44.106904', '_unique_id': '5df9a74055d64fff9c5df7507392ebcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.108 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '214c202b-5d16-49d8-ba01-867a9d294cd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.108467', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f35d1fc-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': 'ee6ffcb4de73eadf0c38ea3c01c8e70676579ace7ebd99b0224010cca1e287b4'}]}, 'timestamp': '2025-09-30 21:36:44.108755', '_unique_id': '5ecf0ed8f55d4ea0a4df70666813c899'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.110 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a762b36c-a944-484b-81de-54227fefed71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.110568', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f36238c-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': '61ff64d60bd1bdb5bbcd5ee8e9082e19d671838c7581f419262cbd7c3adc090f'}]}, 'timestamp': '2025-09-30 21:36:44.110848', '_unique_id': '07392fd47043477989e17d8d60322c05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.112 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.112 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1141087200>]
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.112 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd371566c-bf09-4eab-8493-57a36079fbea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-vda', 'timestamp': '2025-09-30T21:36:44.112810', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f367aa8-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.792044361, 'message_signature': '41565864f47873806b12f43cb88c8c83c8b07be2a387b132e3f6fa6723589cc9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b-sda', 'timestamp': '2025-09-30T21:36:44.112810', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'instance-00000069', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3684b2-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.792044361, 'message_signature': '9a849d1d102df9e23d04e54eadf08695ee3b86cfd066fdd6b9bf71cdc2a775db'}]}, 'timestamp': '2025-09-30 21:36:44.113351', '_unique_id': '5d8ac42cce8a4de5b92ae8b5de8d0241'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.114 12 DEBUG ceilometer.compute.pollsters [-] 103b7a79-7ea6-47fa-bd6f-bcc87a96369b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9ee4d8f-f425-4e79-80b1-e9f499339ed6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-00000069-103b7a79-7ea6-47fa-bd6f-bcc87a96369b-tap53d72042-40', 'timestamp': '2025-09-30T21:36:44.114859', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1141087200', 'name': 'tap53d72042-40', 'instance_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'instance_type': 'm1.nano', 'host': '58a7b3b1c7f03c4702c5e4a1e59aa0fc15083a515a8d91319e852006', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fc:6f:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap53d72042-40'}, 'message_id': '8f36caee-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 4899.78099538, 'message_signature': '538542828e361a4d785f3ec98bedafd0a471c055c30c52a21f2092a9506e4e35'}]}, 'timestamp': '2025-09-30 21:36:44.115154', '_unique_id': '6c332f21b8614b6aa1a703ff295bd20f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:36:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-1 nova_compute[192795]: 2025-09-30 21:36:44.988 2 DEBUG nova.network.neutron [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.014 2 INFO nova.compute.manager [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Took 1.52 seconds to deallocate network for instance.
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.175 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.176 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.274 2 DEBUG nova.compute.provider_tree [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.291 2 DEBUG nova.scheduler.client.report [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.320 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.328 2 DEBUG nova.compute.manager [req-4c7e5320-96d6-445e-88bc-0da180528188 req-edecad75-93a0-4411-b330-46bf3211272c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-deleted-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.378 2 INFO nova.scheduler.client.report [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Deleted allocations for instance 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.496 2 DEBUG oslo_concurrency.lockutils [None req-ec3e706a-8a02-4b4a-a3ee-54d82e949bf5 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.772 2 DEBUG nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.773 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.773 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.773 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.774 2 DEBUG nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.774 2 WARNING nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state deleted and task_state None.
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.775 2 DEBUG nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.776 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.777 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.777 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.777 2 DEBUG nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.778 2 WARNING nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state deleted and task_state None.
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.778 2 DEBUG nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.778 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.779 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.779 2 DEBUG oslo_concurrency.lockutils [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.779 2 DEBUG nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] No waiting events found dispatching network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:45 compute-1 nova_compute[192795]: 2025-09-30 21:36:45.779 2 WARNING nova.compute.manager [req-c674cc24-c869-4305-bcfe-305b91f6bc25 req-286270dd-3aa0-4915-8bc7-5bab02d2c5dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Received unexpected event network-vif-plugged-203ac748-c872-4282-8098-aa626889ec81 for instance with vm_state deleted and task_state None.
Sep 30 21:36:48 compute-1 nova_compute[192795]: 2025-09-30 21:36:48.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:48 compute-1 nova_compute[192795]: 2025-09-30 21:36:48.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:48 compute-1 ovn_controller[94902]: 2025-09-30T21:36:48Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fc:6f:ef 10.100.0.3
Sep 30 21:36:48 compute-1 ovn_controller[94902]: 2025-09-30T21:36:48Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fc:6f:ef 10.100.0.3
Sep 30 21:36:49 compute-1 nova_compute[192795]: 2025-09-30 21:36:49.732 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:51 compute-1 podman[238745]: 2025-09-30 21:36:51.22297659 +0000 UTC m=+0.065703591 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:36:51 compute-1 podman[238747]: 2025-09-30 21:36:51.257097799 +0000 UTC m=+0.091519333 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:36:51 compute-1 podman[238746]: 2025-09-30 21:36:51.30774467 +0000 UTC m=+0.135467741 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.725 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.725 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.726 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.726 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.808 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.891 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.892 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:51 compute-1 nova_compute[192795]: 2025-09-30 21:36:51.970 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.124 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.126 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5482MB free_disk=73.28837966918945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.126 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.126 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.219 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 103b7a79-7ea6-47fa-bd6f-bcc87a96369b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.219 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.220 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.236 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.269 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.270 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.305 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.337 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.408 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.429 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.456 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:36:52 compute-1 nova_compute[192795]: 2025-09-30 21:36:52.457 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:53 compute-1 nova_compute[192795]: 2025-09-30 21:36:53.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:53 compute-1 nova_compute[192795]: 2025-09-30 21:36:53.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:53 compute-1 nova_compute[192795]: 2025-09-30 21:36:53.458 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:53 compute-1 nova_compute[192795]: 2025-09-30 21:36:53.459 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:36:53 compute-1 nova_compute[192795]: 2025-09-30 21:36:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.434 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.435 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.436 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.436 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.437 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.457 2 INFO nova.compute.manager [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Terminating instance
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.477 2 DEBUG nova.compute.manager [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:36:54 compute-1 kernel: tap53d72042-40 (unregistering): left promiscuous mode
Sep 30 21:36:54 compute-1 NetworkManager[51724]: <info>  [1759268214.5069] device (tap53d72042-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:54 compute-1 ovn_controller[94902]: 2025-09-30T21:36:54Z|00451|binding|INFO|Releasing lport 53d72042-40b3-4719-8611-1684c83c15ea from this chassis (sb_readonly=0)
Sep 30 21:36:54 compute-1 ovn_controller[94902]: 2025-09-30T21:36:54Z|00452|binding|INFO|Setting lport 53d72042-40b3-4719-8611-1684c83c15ea down in Southbound
Sep 30 21:36:54 compute-1 ovn_controller[94902]: 2025-09-30T21:36:54Z|00453|binding|INFO|Removing iface tap53d72042-40 ovn-installed in OVS
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.522 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:6f:ef 10.100.0.3'], port_security=['fa:16:3e:fc:6f:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '103b7a79-7ea6-47fa-bd6f-bcc87a96369b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20111f98-bf7f-4696-b726-3e06c68cfed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6a3ecea0-9346-40cc-9dbe-25cd68fe08ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4af2c14-351e-4037-bacd-dca3cfee62e9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=53d72042-40b3-4719-8611-1684c83c15ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.524 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 53d72042-40b3-4719-8611-1684c83c15ea in datapath 20111f98-bf7f-4696-b726-3e06c68cfed2 unbound from our chassis
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.525 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20111f98-bf7f-4696-b726-3e06c68cfed2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.526 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[adf98154-d0b8-426a-ae42-5e45e6655764]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.527 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 namespace which is not needed anymore
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000069.scope: Deactivated successfully.
Sep 30 21:36:54 compute-1 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000069.scope: Consumed 14.168s CPU time.
Sep 30 21:36:54 compute-1 systemd-machined[152783]: Machine qemu-53-instance-00000069 terminated.
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [NOTICE]   (238436) : haproxy version is 2.8.14-c23fe91
Sep 30 21:36:54 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [NOTICE]   (238436) : path to executable is /usr/sbin/haproxy
Sep 30 21:36:54 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [WARNING]  (238436) : Exiting Master process...
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [WARNING]  (238436) : Exiting Master process...
Sep 30 21:36:54 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [ALERT]    (238436) : Current worker (238438) exited with code 143 (Terminated)
Sep 30 21:36:54 compute-1 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[238432]: [WARNING]  (238436) : All workers exited. Exiting... (0)
Sep 30 21:36:54 compute-1 systemd[1]: libpod-92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104.scope: Deactivated successfully.
Sep 30 21:36:54 compute-1 podman[238843]: 2025-09-30 21:36:54.720775774 +0000 UTC m=+0.066784229 container died 92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.763 2 INFO nova.virt.libvirt.driver [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Instance destroyed successfully.
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.763 2 DEBUG nova.objects.instance [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'resources' on Instance uuid 103b7a79-7ea6-47fa-bd6f-bcc87a96369b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:54 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104-userdata-shm.mount: Deactivated successfully.
Sep 30 21:36:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-290a0218553f45a7718ce8043ea7f1610cbe6d2f25f976a4013cf029518e47ae-merged.mount: Deactivated successfully.
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.782 2 DEBUG nova.virt.libvirt.vif [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1141087200',display_name='tempest-ListServerFiltersTestJSON-instance-1141087200',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1141087200',id=105,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-rydrv4y4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-ListServerFiltersTestJSON-1322408077-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:36:36Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=103b7a79-7ea6-47fa-bd6f-bcc87a96369b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.783 2 DEBUG nova.network.os_vif_util [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "53d72042-40b3-4719-8611-1684c83c15ea", "address": "fa:16:3e:fc:6f:ef", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap53d72042-40", "ovs_interfaceid": "53d72042-40b3-4719-8611-1684c83c15ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.784 2 DEBUG nova.network.os_vif_util [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.784 2 DEBUG os_vif [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.787 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53d72042-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 podman[238843]: 2025-09-30 21:36:54.792275852 +0000 UTC m=+0.138284307 container cleanup 92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.793 2 INFO os_vif [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:ef,bridge_name='br-int',has_traffic_filtering=True,id=53d72042-40b3-4719-8611-1684c83c15ea,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap53d72042-40')
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.793 2 INFO nova.virt.libvirt.driver [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Deleting instance files /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b_del
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.794 2 INFO nova.virt.libvirt.driver [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Deletion of /var/lib/nova/instances/103b7a79-7ea6-47fa-bd6f-bcc87a96369b_del complete
Sep 30 21:36:54 compute-1 systemd[1]: libpod-conmon-92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104.scope: Deactivated successfully.
Sep 30 21:36:54 compute-1 podman[238886]: 2025-09-30 21:36:54.867814709 +0000 UTC m=+0.047927496 container remove 92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.874 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6c2e2fd0-ae1e-43af-8079-44c80d22bb50]: (4, ('Tue Sep 30 09:36:54 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 (92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104)\n92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104\nTue Sep 30 09:36:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 (92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104)\n92c3dc64f069b76dbc109d5ca45bf98d6b66018376433d07005bbcad0da7b104\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.876 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[368bcdf9-1240-420c-9a77-645c2a057dd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.877 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20111f98-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 kernel: tap20111f98-b0: left promiscuous mode
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.886 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4aac4ef6-d94d-48cd-9c39-363f620a3cf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.902 2 INFO nova.compute.manager [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.903 2 DEBUG oslo.service.loopingcall [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.903 2 DEBUG nova.compute.manager [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:36:54 compute-1 nova_compute[192795]: 2025-09-30 21:36:54.903 2 DEBUG nova.network.neutron [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.921 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9978bd41-8139-474c-9e86-294acd4f25dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.926 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8723c765-99fe-4238-9cde-94d27729c05c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.958 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6777f9a7-013a-47e6-9294-2eb84c031c6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489093, 'reachable_time': 31097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238901, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:54 compute-1 systemd[1]: run-netns-ovnmeta\x2d20111f98\x2dbf7f\x2d4696\x2db726\x2d3e06c68cfed2.mount: Deactivated successfully.
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.965 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:36:54.965 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[af3ddfda-cedd-45f3-b92c-e8bde8bf2473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.438 2 DEBUG nova.compute.manager [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-unplugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.439 2 DEBUG oslo_concurrency.lockutils [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.440 2 DEBUG oslo_concurrency.lockutils [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.441 2 DEBUG oslo_concurrency.lockutils [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.441 2 DEBUG nova.compute.manager [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] No waiting events found dispatching network-vif-unplugged-53d72042-40b3-4719-8611-1684c83c15ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.442 2 DEBUG nova.compute.manager [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-unplugged-53d72042-40b3-4719-8611-1684c83c15ea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.442 2 DEBUG nova.compute.manager [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.443 2 DEBUG oslo_concurrency.lockutils [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.443 2 DEBUG oslo_concurrency.lockutils [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.443 2 DEBUG oslo_concurrency.lockutils [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.444 2 DEBUG nova.compute.manager [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] No waiting events found dispatching network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.444 2 WARNING nova.compute.manager [req-07ca632d-17ce-400a-8954-bfbe49d41191 req-5949d778-6da7-472a-8f0b-c6acc6e0f395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received unexpected event network-vif-plugged-53d72042-40b3-4719-8611-1684c83c15ea for instance with vm_state active and task_state deleting.
Sep 30 21:36:55 compute-1 nova_compute[192795]: 2025-09-30 21:36:55.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:56 compute-1 nova_compute[192795]: 2025-09-30 21:36:56.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:57 compute-1 podman[238902]: 2025-09-30 21:36:57.247487399 +0000 UTC m=+0.076060763 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:36:57 compute-1 nova_compute[192795]: 2025-09-30 21:36:57.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.235 2 DEBUG nova.network.neutron [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.260 2 INFO nova.compute.manager [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Took 3.36 seconds to deallocate network for instance.
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.302 2 DEBUG nova.compute.manager [req-3c176fe9-decc-4c30-8317-e6e5dbdd86be req-b3756dfa-086f-462a-bed7-fa54875a40ca dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Received event network-vif-deleted-53d72042-40b3-4719-8611-1684c83c15ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.330 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.330 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.372 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268203.3714018, 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.373 2 INFO nova.compute.manager [-] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] VM Stopped (Lifecycle Event)
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.393 2 DEBUG nova.compute.manager [None req-2c0b7fcd-c44e-44f8-aabe-67ccb1af2cde - - - - - -] [instance: 72ec9496-2587-4dac-bc9e-9ca1b8f5e6ca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.411 2 DEBUG nova.compute.provider_tree [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.428 2 DEBUG nova.scheduler.client.report [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.452 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.482 2 INFO nova.scheduler.client.report [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Deleted allocations for instance 103b7a79-7ea6-47fa-bd6f-bcc87a96369b
Sep 30 21:36:58 compute-1 nova_compute[192795]: 2025-09-30 21:36:58.616 2 DEBUG oslo_concurrency.lockutils [None req-9fa5f82b-6509-4aff-bb51-f57cfa078725 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "103b7a79-7ea6-47fa-bd6f-bcc87a96369b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:59 compute-1 nova_compute[192795]: 2025-09-30 21:36:59.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:00 compute-1 nova_compute[192795]: 2025-09-30 21:37:00.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:02 compute-1 podman[238923]: 2025-09-30 21:37:02.233798267 +0000 UTC m=+0.070830249 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, vcs-type=git)
Sep 30 21:37:02 compute-1 podman[238924]: 2025-09-30 21:37:02.250326878 +0000 UTC m=+0.078129400 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:37:02 compute-1 podman[238925]: 2025-09-30 21:37:02.267941167 +0000 UTC m=+0.078158219 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:37:02 compute-1 nova_compute[192795]: 2025-09-30 21:37:02.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:03 compute-1 nova_compute[192795]: 2025-09-30 21:37:03.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:03 compute-1 nova_compute[192795]: 2025-09-30 21:37:03.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:03 compute-1 nova_compute[192795]: 2025-09-30 21:37:03.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:37:03 compute-1 nova_compute[192795]: 2025-09-30 21:37:03.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:37:03 compute-1 nova_compute[192795]: 2025-09-30 21:37:03.728 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:37:04 compute-1 nova_compute[192795]: 2025-09-30 21:37:04.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:08 compute-1 nova_compute[192795]: 2025-09-30 21:37:08.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.254 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.255 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.279 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.434 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.435 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.443 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.444 2 INFO nova.compute.claims [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.600 2 DEBUG nova.compute.provider_tree [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.614 2 DEBUG nova.scheduler.client.report [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.661 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.662 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.745 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.745 2 DEBUG nova.network.neutron [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.760 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268214.7588897, 103b7a79-7ea6-47fa-bd6f-bcc87a96369b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.760 2 INFO nova.compute.manager [-] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] VM Stopped (Lifecycle Event)
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.786 2 INFO nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.828 2 DEBUG nova.compute.manager [None req-a3859fea-8102-4d1b-bf1a-e612c1e75d1b - - - - - -] [instance: 103b7a79-7ea6-47fa-bd6f-bcc87a96369b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:09 compute-1 nova_compute[192795]: 2025-09-30 21:37:09.853 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.004 2 DEBUG nova.policy [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3fb2d00c8e3a4805a2f71cad4ff37678', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0d411f374dcb4367a0cbd1966b53998b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.010 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.011 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.011 2 INFO nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Creating image(s)
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.012 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "/var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.012 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "/var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.013 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "/var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.026 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.084 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.087 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.089 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.115 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.197 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.199 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.236 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.237 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.237 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.293 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.294 2 DEBUG nova.virt.disk.api [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Checking if we can resize image /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.295 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.358 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.358 2 DEBUG nova.virt.disk.api [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Cannot resize image /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.359 2 DEBUG nova.objects.instance [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lazy-loading 'migration_context' on Instance uuid d652090c-364f-4b7f-a5d7-6c6d568afaf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.514 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.515 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Ensure instance console log exists: /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.515 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.516 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:10 compute-1 nova_compute[192795]: 2025-09-30 21:37:10.516 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:13 compute-1 nova_compute[192795]: 2025-09-30 21:37:13.033 2 DEBUG nova.network.neutron [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Successfully created port: af2b2485-3452-44f6-8b63-368cdd756a08 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:37:13 compute-1 nova_compute[192795]: 2025-09-30 21:37:13.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:13 compute-1 podman[238998]: 2025-09-30 21:37:13.283374001 +0000 UTC m=+0.120096002 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS)
Sep 30 21:37:14 compute-1 nova_compute[192795]: 2025-09-30 21:37:14.438 2 DEBUG nova.network.neutron [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Successfully updated port: af2b2485-3452-44f6-8b63-368cdd756a08 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:37:14 compute-1 nova_compute[192795]: 2025-09-30 21:37:14.473 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "refresh_cache-d652090c-364f-4b7f-a5d7-6c6d568afaf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:14 compute-1 nova_compute[192795]: 2025-09-30 21:37:14.473 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquired lock "refresh_cache-d652090c-364f-4b7f-a5d7-6c6d568afaf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:14 compute-1 nova_compute[192795]: 2025-09-30 21:37:14.474 2 DEBUG nova.network.neutron [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:37:14 compute-1 nova_compute[192795]: 2025-09-30 21:37:14.649 2 DEBUG nova.network.neutron [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:37:14 compute-1 nova_compute[192795]: 2025-09-30 21:37:14.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.137 2 DEBUG nova.compute.manager [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received event network-changed-af2b2485-3452-44f6-8b63-368cdd756a08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.138 2 DEBUG nova.compute.manager [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Refreshing instance network info cache due to event network-changed-af2b2485-3452-44f6-8b63-368cdd756a08. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.138 2 DEBUG oslo_concurrency.lockutils [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-d652090c-364f-4b7f-a5d7-6c6d568afaf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.895 2 DEBUG nova.network.neutron [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Updating instance_info_cache with network_info: [{"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.934 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Releasing lock "refresh_cache-d652090c-364f-4b7f-a5d7-6c6d568afaf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.934 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Instance network_info: |[{"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.934 2 DEBUG oslo_concurrency.lockutils [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-d652090c-364f-4b7f-a5d7-6c6d568afaf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.935 2 DEBUG nova.network.neutron [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Refreshing network info cache for port af2b2485-3452-44f6-8b63-368cdd756a08 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.937 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Start _get_guest_xml network_info=[{"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.946 2 WARNING nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.954 2 DEBUG nova.virt.libvirt.host [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.954 2 DEBUG nova.virt.libvirt.host [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.964 2 DEBUG nova.virt.libvirt.host [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.965 2 DEBUG nova.virt.libvirt.host [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.966 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.966 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.967 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.967 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.967 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.967 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.967 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.968 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.968 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.968 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.969 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.969 2 DEBUG nova.virt.hardware [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.973 2 DEBUG nova.virt.libvirt.vif [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:37:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-1033636295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servertagstestjson-server-1033636295',id=112,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d411f374dcb4367a0cbd1966b53998b',ramdisk_id='',reservation_id='r-l7gm5r6k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1932675786',owner_user_name='tempest-ServerTagsTestJSON-1932675786-project-member'},tags=TagList,task_state='spawning
',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:37:09Z,user_data=None,user_id='3fb2d00c8e3a4805a2f71cad4ff37678',uuid=d652090c-364f-4b7f-a5d7-6c6d568afaf8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.973 2 DEBUG nova.network.os_vif_util [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Converting VIF {"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.974 2 DEBUG nova.network.os_vif_util [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:af:1d,bridge_name='br-int',has_traffic_filtering=True,id=af2b2485-3452-44f6-8b63-368cdd756a08,network=Network(4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf2b2485-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.975 2 DEBUG nova.objects.instance [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lazy-loading 'pci_devices' on Instance uuid d652090c-364f-4b7f-a5d7-6c6d568afaf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.987 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <uuid>d652090c-364f-4b7f-a5d7-6c6d568afaf8</uuid>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <name>instance-00000070</name>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerTagsTestJSON-server-1033636295</nova:name>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:37:15</nova:creationTime>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:user uuid="3fb2d00c8e3a4805a2f71cad4ff37678">tempest-ServerTagsTestJSON-1932675786-project-member</nova:user>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:project uuid="0d411f374dcb4367a0cbd1966b53998b">tempest-ServerTagsTestJSON-1932675786</nova:project>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         <nova:port uuid="af2b2485-3452-44f6-8b63-368cdd756a08">
Sep 30 21:37:15 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <system>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <entry name="serial">d652090c-364f-4b7f-a5d7-6c6d568afaf8</entry>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <entry name="uuid">d652090c-364f-4b7f-a5d7-6c6d568afaf8</entry>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </system>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <os>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   </os>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <features>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   </features>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk.config"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:6f:af:1d"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <target dev="tapaf2b2485-34"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/console.log" append="off"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <video>
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </video>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:37:15 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:37:15 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:37:15 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:37:15 compute-1 nova_compute[192795]: </domain>
Sep 30 21:37:15 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.988 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Preparing to wait for external event network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.989 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.989 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.989 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.990 2 DEBUG nova.virt.libvirt.vif [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:37:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-1033636295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servertagstestjson-server-1033636295',id=112,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d411f374dcb4367a0cbd1966b53998b',ramdisk_id='',reservation_id='r-l7gm5r6k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1932675786',owner_user_name='tempest-ServerTagsTestJSON-1932675786-project-member'},tags=TagList,task_state
='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:37:09Z,user_data=None,user_id='3fb2d00c8e3a4805a2f71cad4ff37678',uuid=d652090c-364f-4b7f-a5d7-6c6d568afaf8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.990 2 DEBUG nova.network.os_vif_util [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Converting VIF {"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.991 2 DEBUG nova.network.os_vif_util [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:af:1d,bridge_name='br-int',has_traffic_filtering=True,id=af2b2485-3452-44f6-8b63-368cdd756a08,network=Network(4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf2b2485-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.991 2 DEBUG os_vif [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:af:1d,bridge_name='br-int',has_traffic_filtering=True,id=af2b2485-3452-44f6-8b63-368cdd756a08,network=Network(4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf2b2485-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.992 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.992 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.995 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf2b2485-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:15 compute-1 nova_compute[192795]: 2025-09-30 21:37:15.996 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf2b2485-34, col_values=(('external_ids', {'iface-id': 'af2b2485-3452-44f6-8b63-368cdd756a08', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:af:1d', 'vm-uuid': 'd652090c-364f-4b7f-a5d7-6c6d568afaf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:16 compute-1 NetworkManager[51724]: <info>  [1759268236.0455] manager: (tapaf2b2485-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.052 2 INFO os_vif [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:af:1d,bridge_name='br-int',has_traffic_filtering=True,id=af2b2485-3452-44f6-8b63-368cdd756a08,network=Network(4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf2b2485-34')
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.116 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.117 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.117 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] No VIF found with MAC fa:16:3e:6f:af:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.117 2 INFO nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Using config drive
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.598 2 INFO nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Creating config drive at /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk.config
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.608 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdwitptlo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.757 2 DEBUG oslo_concurrency.processutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdwitptlo" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:16 compute-1 kernel: tapaf2b2485-34: entered promiscuous mode
Sep 30 21:37:16 compute-1 NetworkManager[51724]: <info>  [1759268236.8594] manager: (tapaf2b2485-34): new Tun device (/org/freedesktop/NetworkManager/Devices/227)
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:16 compute-1 ovn_controller[94902]: 2025-09-30T21:37:16Z|00454|binding|INFO|Claiming lport af2b2485-3452-44f6-8b63-368cdd756a08 for this chassis.
Sep 30 21:37:16 compute-1 ovn_controller[94902]: 2025-09-30T21:37:16Z|00455|binding|INFO|af2b2485-3452-44f6-8b63-368cdd756a08: Claiming fa:16:3e:6f:af:1d 10.100.0.6
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:16 compute-1 systemd-udevd[239035]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:37:16 compute-1 NetworkManager[51724]: <info>  [1759268236.9061] device (tapaf2b2485-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:37:16 compute-1 NetworkManager[51724]: <info>  [1759268236.9068] device (tapaf2b2485-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.917 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:af:1d 10.100.0.6'], port_security=['fa:16:3e:6f:af:1d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd652090c-364f-4b7f-a5d7-6c6d568afaf8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0d411f374dcb4367a0cbd1966b53998b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '640e3817-959e-4aa8-b729-b93f10c192e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4368a674-347e-4a8e-9b8e-9f40d9a37dbd, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=af2b2485-3452-44f6-8b63-368cdd756a08) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.918 103861 INFO neutron.agent.ovn.metadata.agent [-] Port af2b2485-3452-44f6-8b63-368cdd756a08 in datapath 4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c bound to our chassis
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.920 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c
Sep 30 21:37:16 compute-1 systemd-machined[152783]: New machine qemu-55-instance-00000070.
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:16 compute-1 ovn_controller[94902]: 2025-09-30T21:37:16Z|00456|binding|INFO|Setting lport af2b2485-3452-44f6-8b63-368cdd756a08 ovn-installed in OVS
Sep 30 21:37:16 compute-1 ovn_controller[94902]: 2025-09-30T21:37:16Z|00457|binding|INFO|Setting lport af2b2485-3452-44f6-8b63-368cdd756a08 up in Southbound
Sep 30 21:37:16 compute-1 nova_compute[192795]: 2025-09-30 21:37:16.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:16 compute-1 systemd[1]: Started Virtual Machine qemu-55-instance-00000070.
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.937 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[afa0de45-6a84-4099-9a56-d25898ae3e1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.938 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4a1f84a4-51 in ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.940 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4a1f84a4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.940 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd8b615-5c20-4df7-91e7-1496683bfe58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.941 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[235fca88-1269-468e-adb4-2f81aadc3657]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.954 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[4eacd7e7-43c7-43c8-b37e-d8911ec800c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:16.979 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ddcc2c3d-9c1b-4f96-b78b-9f854372f87a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.016 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[df831b15-a0d3-4721-bc3f-d84e36d922ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.021 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[34a36d96-cd5b-487e-90ce-4c6ec5488c9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 NetworkManager[51724]: <info>  [1759268237.0228] manager: (tap4a1f84a4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/228)
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.061 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e163e043-b735-489d-8af0-5d0703c08e5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.064 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[135770e1-2569-4553-9773-50277691ea3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 NetworkManager[51724]: <info>  [1759268237.0913] device (tap4a1f84a4-50): carrier: link connected
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.099 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[41d883fe-b86f-4d61-bbb6-60bd2f3e9b18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.120 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e6508744-dc8d-403a-9560-26d74fc80664]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a1f84a4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493276, 'reachable_time': 20329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239076, 'error': None, 'target': 'ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.145 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[25545004-da93-4351-a2d8-aaaf18bf9f83]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493276, 'tstamp': 493276}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239078, 'error': None, 'target': 'ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.171 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d293eb2c-2e6a-44d6-88f9-e9b270b41a29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a1f84a4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493276, 'reachable_time': 20329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239080, 'error': None, 'target': 'ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.212 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[78463880-feac-4f61-b48d-1d6b7c7c49db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.294 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[19deff08-6a4c-4c3a-8567-8c98ea0eb654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.296 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a1f84a4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.296 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.296 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a1f84a4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-1 NetworkManager[51724]: <info>  [1759268237.2990] manager: (tap4a1f84a4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Sep 30 21:37:17 compute-1 kernel: tap4a1f84a4-50: entered promiscuous mode
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.301 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a1f84a4-50, col_values=(('external_ids', {'iface-id': '29d20de9-b2c4-46ba-8ee9-57b049ba040c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:17 compute-1 ovn_controller[94902]: 2025-09-30T21:37:17Z|00458|binding|INFO|Releasing lport 29d20de9-b2c4-46ba-8ee9-57b049ba040c from this chassis (sb_readonly=0)
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.306 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.307 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[26c9aa64-890a-4e44-8cea-27156aba0b65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.307 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c.pid.haproxy
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:37:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:17.308 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'env', 'PROCESS_TAG=haproxy-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.604 2 DEBUG nova.network.neutron [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Updated VIF entry in instance network info cache for port af2b2485-3452-44f6-8b63-368cdd756a08. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.605 2 DEBUG nova.network.neutron [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Updating instance_info_cache with network_info: [{"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.629 2 DEBUG oslo_concurrency.lockutils [req-6387b565-bfd3-4737-bfc1-8aef7d88a12d req-6363cdc1-d0c2-418c-adaa-9fb64ca16abe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-d652090c-364f-4b7f-a5d7-6c6d568afaf8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.631 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268237.6281195, d652090c-364f-4b7f-a5d7-6c6d568afaf8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.631 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] VM Started (Lifecycle Event)
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.653 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.658 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268237.628945, d652090c-364f-4b7f-a5d7-6c6d568afaf8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.658 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] VM Paused (Lifecycle Event)
Sep 30 21:37:17 compute-1 podman[239112]: 2025-09-30 21:37:17.673732667 +0000 UTC m=+0.052099760 container create 25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.684 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.690 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:17 compute-1 nova_compute[192795]: 2025-09-30 21:37:17.708 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:17 compute-1 systemd[1]: Started libpod-conmon-25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37.scope.
Sep 30 21:37:17 compute-1 podman[239112]: 2025-09-30 21:37:17.645706654 +0000 UTC m=+0.024073767 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:37:17 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:37:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7280da9f0dc94760efc46b88efd43dfeebef82d625b09220458d93aa42248681/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:37:17 compute-1 podman[239112]: 2025-09-30 21:37:17.79567561 +0000 UTC m=+0.174042793 container init 25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:37:17 compute-1 podman[239112]: 2025-09-30 21:37:17.802131955 +0000 UTC m=+0.180499088 container start 25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:37:17 compute-1 neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c[239127]: [NOTICE]   (239131) : New worker (239133) forked
Sep 30 21:37:17 compute-1 neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c[239127]: [NOTICE]   (239131) : Loading success.
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.354 2 DEBUG nova.compute.manager [req-27d6be79-e72d-4eed-8690-aafd19b8152d req-ea83db3e-82e4-47b1-8623-0cc4f30b72a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received event network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.355 2 DEBUG oslo_concurrency.lockutils [req-27d6be79-e72d-4eed-8690-aafd19b8152d req-ea83db3e-82e4-47b1-8623-0cc4f30b72a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.355 2 DEBUG oslo_concurrency.lockutils [req-27d6be79-e72d-4eed-8690-aafd19b8152d req-ea83db3e-82e4-47b1-8623-0cc4f30b72a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.355 2 DEBUG oslo_concurrency.lockutils [req-27d6be79-e72d-4eed-8690-aafd19b8152d req-ea83db3e-82e4-47b1-8623-0cc4f30b72a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.355 2 DEBUG nova.compute.manager [req-27d6be79-e72d-4eed-8690-aafd19b8152d req-ea83db3e-82e4-47b1-8623-0cc4f30b72a1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Processing event network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.356 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.360 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.361 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268238.3611612, d652090c-364f-4b7f-a5d7-6c6d568afaf8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.361 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] VM Resumed (Lifecycle Event)
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.365 2 INFO nova.virt.libvirt.driver [-] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Instance spawned successfully.
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.366 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.392 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.404 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.405 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.406 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.407 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.408 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.408 2 DEBUG nova.virt.libvirt.driver [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.414 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.465 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.491 2 INFO nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Took 8.48 seconds to spawn the instance on the hypervisor.
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.491 2 DEBUG nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.579 2 INFO nova.compute.manager [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Took 9.20 seconds to build instance.
Sep 30 21:37:18 compute-1 nova_compute[192795]: 2025-09-30 21:37:18.599 2 DEBUG oslo_concurrency.lockutils [None req-eb14b373-b5a3-4839-a4c5-3ffe9cc1f8e3 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.233 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.234 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.262 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.417 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.418 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.433 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.433 2 INFO nova.compute.claims [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.735 2 DEBUG nova.compute.provider_tree [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.763 2 DEBUG nova.scheduler.client.report [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.821 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:19 compute-1 nova_compute[192795]: 2025-09-30 21:37:19.822 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.056 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.057 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.078 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.104 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.251 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.252 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.253 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Creating image(s)
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.253 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "/var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.254 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "/var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.255 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "/var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.271 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.300 2 DEBUG nova.policy [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cbebeb1f78b64ee09e2da39a04f9f282', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '27ad5d27d7a44404987f6bf297897a45', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.346 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.347 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.348 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.364 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.430 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.431 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.470 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.471 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.472 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.508 2 DEBUG nova.compute.manager [req-e596a6bc-7b6b-42a8-b571-d5828bb52f11 req-1738609c-8386-4dc7-b6e5-2c76bcac4250 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received event network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.509 2 DEBUG oslo_concurrency.lockutils [req-e596a6bc-7b6b-42a8-b571-d5828bb52f11 req-1738609c-8386-4dc7-b6e5-2c76bcac4250 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.510 2 DEBUG oslo_concurrency.lockutils [req-e596a6bc-7b6b-42a8-b571-d5828bb52f11 req-1738609c-8386-4dc7-b6e5-2c76bcac4250 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.510 2 DEBUG oslo_concurrency.lockutils [req-e596a6bc-7b6b-42a8-b571-d5828bb52f11 req-1738609c-8386-4dc7-b6e5-2c76bcac4250 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.510 2 DEBUG nova.compute.manager [req-e596a6bc-7b6b-42a8-b571-d5828bb52f11 req-1738609c-8386-4dc7-b6e5-2c76bcac4250 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] No waiting events found dispatching network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.511 2 WARNING nova.compute.manager [req-e596a6bc-7b6b-42a8-b571-d5828bb52f11 req-1738609c-8386-4dc7-b6e5-2c76bcac4250 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received unexpected event network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 for instance with vm_state active and task_state None.
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.531 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.531 2 DEBUG nova.virt.disk.api [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Checking if we can resize image /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.532 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.599 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.600 2 DEBUG nova.virt.disk.api [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Cannot resize image /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.601 2 DEBUG nova.objects.instance [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lazy-loading 'migration_context' on Instance uuid 004d1ddf-f040-4c90-97d3-e34edd56c064 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.621 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.621 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Ensure instance console log exists: /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.622 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.622 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.623 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:20 compute-1 nova_compute[192795]: 2025-09-30 21:37:20.996 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Successfully created port: 7eea3181-d9f9-4bb8-97ac-b18f21056b96 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:37:21 compute-1 nova_compute[192795]: 2025-09-30 21:37:21.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.194 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Successfully updated port: 7eea3181-d9f9-4bb8-97ac-b18f21056b96 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.225 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "refresh_cache-004d1ddf-f040-4c90-97d3-e34edd56c064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.226 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquired lock "refresh_cache-004d1ddf-f040-4c90-97d3-e34edd56c064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.226 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:37:22 compute-1 podman[239157]: 2025-09-30 21:37:22.248063166 +0000 UTC m=+0.084211086 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:37:22 compute-1 podman[239159]: 2025-09-30 21:37:22.270246709 +0000 UTC m=+0.088633355 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:37:22 compute-1 podman[239158]: 2025-09-30 21:37:22.281668641 +0000 UTC m=+0.110419619 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.326 2 DEBUG nova.compute.manager [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received event network-changed-7eea3181-d9f9-4bb8-97ac-b18f21056b96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.326 2 DEBUG nova.compute.manager [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Refreshing instance network info cache due to event network-changed-7eea3181-d9f9-4bb8-97ac-b18f21056b96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.326 2 DEBUG oslo_concurrency.lockutils [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-004d1ddf-f040-4c90-97d3-e34edd56c064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.480 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.955 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.955 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.955 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.956 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.956 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.967 2 INFO nova.compute.manager [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Terminating instance
Sep 30 21:37:22 compute-1 nova_compute[192795]: 2025-09-30 21:37:22.987 2 DEBUG nova.compute.manager [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:37:23 compute-1 kernel: tapaf2b2485-34 (unregistering): left promiscuous mode
Sep 30 21:37:23 compute-1 NetworkManager[51724]: <info>  [1759268243.0162] device (tapaf2b2485-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:37:23 compute-1 ovn_controller[94902]: 2025-09-30T21:37:23Z|00459|binding|INFO|Releasing lport af2b2485-3452-44f6-8b63-368cdd756a08 from this chassis (sb_readonly=0)
Sep 30 21:37:23 compute-1 ovn_controller[94902]: 2025-09-30T21:37:23Z|00460|binding|INFO|Setting lport af2b2485-3452-44f6-8b63-368cdd756a08 down in Southbound
Sep 30 21:37:23 compute-1 ovn_controller[94902]: 2025-09-30T21:37:23Z|00461|binding|INFO|Removing iface tapaf2b2485-34 ovn-installed in OVS
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.033 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:af:1d 10.100.0.6'], port_security=['fa:16:3e:6f:af:1d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd652090c-364f-4b7f-a5d7-6c6d568afaf8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0d411f374dcb4367a0cbd1966b53998b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '640e3817-959e-4aa8-b729-b93f10c192e2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4368a674-347e-4a8e-9b8e-9f40d9a37dbd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=af2b2485-3452-44f6-8b63-368cdd756a08) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.036 103861 INFO neutron.agent.ovn.metadata.agent [-] Port af2b2485-3452-44f6-8b63-368cdd756a08 in datapath 4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c unbound from our chassis
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.039 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.041 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[684ab3bf-518c-4044-acc5-cf51a98e1eae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.042 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c namespace which is not needed anymore
Sep 30 21:37:23 compute-1 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000070.scope: Deactivated successfully.
Sep 30 21:37:23 compute-1 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000070.scope: Consumed 5.314s CPU time.
Sep 30 21:37:23 compute-1 systemd-machined[152783]: Machine qemu-55-instance-00000070 terminated.
Sep 30 21:37:23 compute-1 neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c[239127]: [NOTICE]   (239131) : haproxy version is 2.8.14-c23fe91
Sep 30 21:37:23 compute-1 neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c[239127]: [NOTICE]   (239131) : path to executable is /usr/sbin/haproxy
Sep 30 21:37:23 compute-1 neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c[239127]: [WARNING]  (239131) : Exiting Master process...
Sep 30 21:37:23 compute-1 neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c[239127]: [ALERT]    (239131) : Current worker (239133) exited with code 143 (Terminated)
Sep 30 21:37:23 compute-1 neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c[239127]: [WARNING]  (239131) : All workers exited. Exiting... (0)
Sep 30 21:37:23 compute-1 systemd[1]: libpod-25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37.scope: Deactivated successfully.
Sep 30 21:37:23 compute-1 podman[239249]: 2025-09-30 21:37:23.218420666 +0000 UTC m=+0.071439276 container died 25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37-userdata-shm.mount: Deactivated successfully.
Sep 30 21:37:23 compute-1 systemd[1]: var-lib-containers-storage-overlay-7280da9f0dc94760efc46b88efd43dfeebef82d625b09220458d93aa42248681-merged.mount: Deactivated successfully.
Sep 30 21:37:23 compute-1 podman[239249]: 2025-09-30 21:37:23.267941745 +0000 UTC m=+0.120960275 container cleanup 25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:37:23 compute-1 systemd[1]: libpod-conmon-25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37.scope: Deactivated successfully.
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.289 2 INFO nova.virt.libvirt.driver [-] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Instance destroyed successfully.
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.290 2 DEBUG nova.objects.instance [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lazy-loading 'resources' on Instance uuid d652090c-364f-4b7f-a5d7-6c6d568afaf8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.304 2 DEBUG nova.virt.libvirt.vif [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:37:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-1033636295',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servertagstestjson-server-1033636295',id=112,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:37:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0d411f374dcb4367a0cbd1966b53998b',ramdisk_id='',reservation_id='r-l7gm5r6k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerTagsTestJSON-1932675786',owner_user_name='tempest-ServerTagsTestJSON-1932675786-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:37:18Z,user_data=None,user_id='3fb2d00c8e3a4805a2f71cad4ff37678',uuid=d652090c-364f-4b7f-a5d7-6c6d568afaf8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.305 2 DEBUG nova.network.os_vif_util [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Converting VIF {"id": "af2b2485-3452-44f6-8b63-368cdd756a08", "address": "fa:16:3e:6f:af:1d", "network": {"id": "4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-404401089-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d411f374dcb4367a0cbd1966b53998b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf2b2485-34", "ovs_interfaceid": "af2b2485-3452-44f6-8b63-368cdd756a08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.305 2 DEBUG nova.network.os_vif_util [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:af:1d,bridge_name='br-int',has_traffic_filtering=True,id=af2b2485-3452-44f6-8b63-368cdd756a08,network=Network(4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf2b2485-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.306 2 DEBUG os_vif [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:af:1d,bridge_name='br-int',has_traffic_filtering=True,id=af2b2485-3452-44f6-8b63-368cdd756a08,network=Network(4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf2b2485-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.308 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf2b2485-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.315 2 INFO os_vif [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:af:1d,bridge_name='br-int',has_traffic_filtering=True,id=af2b2485-3452-44f6-8b63-368cdd756a08,network=Network(4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf2b2485-34')
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.315 2 INFO nova.virt.libvirt.driver [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Deleting instance files /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8_del
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.316 2 INFO nova.virt.libvirt.driver [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Deletion of /var/lib/nova/instances/d652090c-364f-4b7f-a5d7-6c6d568afaf8_del complete
Sep 30 21:37:23 compute-1 podman[239295]: 2025-09-30 21:37:23.347505772 +0000 UTC m=+0.048225545 container remove 25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.354 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[56b03cb2-755e-4d6d-9185-245a198eb830]: (4, ('Tue Sep 30 09:37:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c (25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37)\n25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37\nTue Sep 30 09:37:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c (25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37)\n25b06a373bc95d1244b341911186aa71cea0014a3920fa9d34b2a587cf483a37\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.356 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[068e89f3-7e4a-42da-81fe-2e2d08f5d6e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.357 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a1f84a4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 kernel: tap4a1f84a4-50: left promiscuous mode
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.367 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca76185-d255-488f-af0f-9e2a73c2d6ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.391 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Updating instance_info_cache with network_info: [{"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.416 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Releasing lock "refresh_cache-004d1ddf-f040-4c90-97d3-e34edd56c064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.417 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Instance network_info: |[{"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.417 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[760818ea-4f5c-447a-8c04-f7ae915d22f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.419 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[19ddf211-9bff-404d-91ea-98fd2e72e559]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.417 2 DEBUG oslo_concurrency.lockutils [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-004d1ddf-f040-4c90-97d3-e34edd56c064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.417 2 DEBUG nova.network.neutron [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Refreshing network info cache for port 7eea3181-d9f9-4bb8-97ac-b18f21056b96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.420 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Start _get_guest_xml network_info=[{"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.425 2 WARNING nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.434 2 INFO nova.compute.manager [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Took 0.45 seconds to destroy the instance on the hypervisor.
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.435 2 DEBUG oslo.service.loopingcall [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.435 2 DEBUG nova.compute.manager [-] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.435 2 DEBUG nova.network.neutron [-] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.437 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.438 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.444 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[52f21408-f946-48e8-84d4-256cb3cadac7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493268, 'reachable_time': 18957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239310, 'error': None, 'target': 'ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.447 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.448 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.448 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4a1f84a4-582f-4bbd-ae7b-051e7a8ab79c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:37:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:23.448 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[223c5579-dd6c-4dfe-801f-fd5b5c464d64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.449 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.449 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.450 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:37:23 compute-1 systemd[1]: run-netns-ovnmeta\x2d4a1f84a4\x2d582f\x2d4bbd\x2dae7b\x2d051e7a8ab79c.mount: Deactivated successfully.
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.450 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.450 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.451 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.451 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.451 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.451 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.451 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.452 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.452 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.457 2 DEBUG nova.virt.libvirt.vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:37:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-2130176929',display_name='tempest-ListServersNegativeTestJSON-server-2130176929-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-2130176929-2',id=114,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27ad5d27d7a44404987f6bf297897a45',ramdisk_id='',reservation_id='r-msrdvef1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1844496645',owner_user_name='tempest-ListServersNegativeTestJSON-1844496645-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:37:20Z,user_data=None,user_id='cbebeb1f78b64ee09e2da39a04f9f282',uuid=004d1ddf-f040-4c90-97d3-e34edd56c064,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.458 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converting VIF {"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.459 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:77:a9,bridge_name='br-int',has_traffic_filtering=True,id=7eea3181-d9f9-4bb8-97ac-b18f21056b96,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea3181-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.461 2 DEBUG nova.objects.instance [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lazy-loading 'pci_devices' on Instance uuid 004d1ddf-f040-4c90-97d3-e34edd56c064 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.645 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <uuid>004d1ddf-f040-4c90-97d3-e34edd56c064</uuid>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <name>instance-00000072</name>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <nova:name>tempest-ListServersNegativeTestJSON-server-2130176929-2</nova:name>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:37:23</nova:creationTime>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:user uuid="cbebeb1f78b64ee09e2da39a04f9f282">tempest-ListServersNegativeTestJSON-1844496645-project-member</nova:user>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:project uuid="27ad5d27d7a44404987f6bf297897a45">tempest-ListServersNegativeTestJSON-1844496645</nova:project>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         <nova:port uuid="7eea3181-d9f9-4bb8-97ac-b18f21056b96">
Sep 30 21:37:23 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <system>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <entry name="serial">004d1ddf-f040-4c90-97d3-e34edd56c064</entry>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <entry name="uuid">004d1ddf-f040-4c90-97d3-e34edd56c064</entry>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </system>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <os>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   </os>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <features>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   </features>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk.config"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:f4:77:a9"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <target dev="tap7eea3181-d9"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/console.log" append="off"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <video>
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </video>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:37:23 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:37:23 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:37:23 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:37:23 compute-1 nova_compute[192795]: </domain>
Sep 30 21:37:23 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.647 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Preparing to wait for external event network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.647 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.647 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.647 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.648 2 DEBUG nova.virt.libvirt.vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:37:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-2130176929',display_name='tempest-ListServersNegativeTestJSON-server-2130176929-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-2130176929-2',id=114,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27ad5d27d7a44404987f6bf297897a45',ramdisk_id='',reservation_id='r-msrdvef1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1844496645',owner_user_name='tempest-ListServersNegativeTestJSON-1844496645-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:37:20Z,user_data=None,user_id='cbebeb1f78b64ee09e2da39a04f9f282',uuid=004d1ddf-f040-4c90-97d3-e34edd56c064,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.648 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converting VIF {"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.649 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:77:a9,bridge_name='br-int',has_traffic_filtering=True,id=7eea3181-d9f9-4bb8-97ac-b18f21056b96,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea3181-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.649 2 DEBUG os_vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:77:a9,bridge_name='br-int',has_traffic_filtering=True,id=7eea3181-d9f9-4bb8-97ac-b18f21056b96,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea3181-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.650 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.650 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.654 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7eea3181-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.655 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7eea3181-d9, col_values=(('external_ids', {'iface-id': '7eea3181-d9f9-4bb8-97ac-b18f21056b96', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:77:a9', 'vm-uuid': '004d1ddf-f040-4c90-97d3-e34edd56c064'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 NetworkManager[51724]: <info>  [1759268243.6606] manager: (tap7eea3181-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.666 2 INFO os_vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:77:a9,bridge_name='br-int',has_traffic_filtering=True,id=7eea3181-d9f9-4bb8-97ac-b18f21056b96,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea3181-d9')
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.725 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.726 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.726 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] No VIF found with MAC fa:16:3e:f4:77:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:37:23 compute-1 nova_compute[192795]: 2025-09-30 21:37:23.727 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Using config drive
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.097 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Creating config drive at /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk.config
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.107 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuufxvbot execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.260 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuufxvbot" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:24 compute-1 kernel: tap7eea3181-d9: entered promiscuous mode
Sep 30 21:37:24 compute-1 NetworkManager[51724]: <info>  [1759268244.3372] manager: (tap7eea3181-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/231)
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-1 ovn_controller[94902]: 2025-09-30T21:37:24Z|00462|binding|INFO|Claiming lport 7eea3181-d9f9-4bb8-97ac-b18f21056b96 for this chassis.
Sep 30 21:37:24 compute-1 ovn_controller[94902]: 2025-09-30T21:37:24Z|00463|binding|INFO|7eea3181-d9f9-4bb8-97ac-b18f21056b96: Claiming fa:16:3e:f4:77:a9 10.100.0.14
Sep 30 21:37:24 compute-1 systemd-udevd[239228]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:37:24 compute-1 NetworkManager[51724]: <info>  [1759268244.3516] device (tap7eea3181-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:37:24 compute-1 NetworkManager[51724]: <info>  [1759268244.3528] device (tap7eea3181-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.355 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:77:a9 10.100.0.14'], port_security=['fa:16:3e:f4:77:a9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '004d1ddf-f040-4c90-97d3-e34edd56c064', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27ad5d27d7a44404987f6bf297897a45', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd191d10a-3559-4b1f-8832-a4135c99c39a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8f74c6-45db-4076-ad90-3fc15f718204, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=7eea3181-d9f9-4bb8-97ac-b18f21056b96) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.358 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 7eea3181-d9f9-4bb8-97ac-b18f21056b96 in datapath 6815a373-206c-4f26-aa16-cc2d72d0f14d bound to our chassis
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.361 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6815a373-206c-4f26-aa16-cc2d72d0f14d
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.376 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[adf986d2-183f-4ea0-9b51-30058b0ab66e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.377 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6815a373-21 in ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:37:24 compute-1 systemd-machined[152783]: New machine qemu-56-instance-00000072.
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.379 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6815a373-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.380 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd143e5-623b-4827-83ff-634cb9a332aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.387 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[97d4bc07-c158-49a9-a88b-0c985fefe9c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.402 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[c848d92f-3756-454a-bfec-6bfdba10413c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.424 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c2f7f943-86c9-4a5e-962f-d950dc3faf26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-1 systemd[1]: Started Virtual Machine qemu-56-instance-00000072.
Sep 30 21:37:24 compute-1 ovn_controller[94902]: 2025-09-30T21:37:24Z|00464|binding|INFO|Setting lport 7eea3181-d9f9-4bb8-97ac-b18f21056b96 ovn-installed in OVS
Sep 30 21:37:24 compute-1 ovn_controller[94902]: 2025-09-30T21:37:24Z|00465|binding|INFO|Setting lport 7eea3181-d9f9-4bb8-97ac-b18f21056b96 up in Southbound
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.461 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7badbacc-9ec4-4d8d-bc69-223ad6481d6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 NetworkManager[51724]: <info>  [1759268244.4697] manager: (tap6815a373-20): new Veth device (/org/freedesktop/NetworkManager/Devices/232)
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.468 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d84f2eac-18a3-41d3-ac65-13cd2e120a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.504 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b952ea17-f3e4-42c4-8220-edebedb45c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.511 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a416b41e-38e9-447b-90d2-3680660865ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 NetworkManager[51724]: <info>  [1759268244.5359] device (tap6815a373-20): carrier: link connected
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.542 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca53275-3e40-4ac1-80a3-f1bb43e6a227]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.564 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[51506173-4fa5-4e03-935f-c884408d96d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6815a373-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:9d:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494020, 'reachable_time': 19977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239363, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.585 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[40885cbb-f17c-42d2-9847-bde76b4341b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:9dfc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494020, 'tstamp': 494020}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239364, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.606 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[11836179-696c-483d-9b3e-9c0fda61b63d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6815a373-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:9d:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494020, 'reachable_time': 19977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239365, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.645 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c4dba33f-50be-438c-9044-ccc08e11736d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.731 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[135a4bcd-94c3-48a0-9247-a7d30cc13389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.733 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6815a373-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.734 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.734 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6815a373-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:24 compute-1 NetworkManager[51724]: <info>  [1759268244.7380] manager: (tap6815a373-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Sep 30 21:37:24 compute-1 kernel: tap6815a373-20: entered promiscuous mode
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.748 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6815a373-20, col_values=(('external_ids', {'iface-id': '54338240-c8d5-4182-8bce-5e3e54cd794d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-1 ovn_controller[94902]: 2025-09-30T21:37:24Z|00466|binding|INFO|Releasing lport 54338240-c8d5-4182-8bce-5e3e54cd794d from this chassis (sb_readonly=0)
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.767 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6815a373-206c-4f26-aa16-cc2d72d0f14d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6815a373-206c-4f26-aa16-cc2d72d0f14d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.768 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dd10c83a-13ed-42c9-8bd2-23fc70b7266e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.769 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-6815a373-206c-4f26-aa16-cc2d72d0f14d
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/6815a373-206c-4f26-aa16-cc2d72d0f14d.pid.haproxy
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 6815a373-206c-4f26-aa16-cc2d72d0f14d
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:37:24 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:24.770 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'env', 'PROCESS_TAG=haproxy-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6815a373-206c-4f26-aa16-cc2d72d0f14d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.850 2 DEBUG nova.network.neutron [-] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:24 compute-1 nova_compute[192795]: 2025-09-30 21:37:24.967 2 INFO nova.compute.manager [-] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Took 1.53 seconds to deallocate network for instance.
Sep 30 21:37:25 compute-1 podman[239397]: 2025-09-30 21:37:25.182368861 +0000 UTC m=+0.077014009 container create 11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:37:25 compute-1 podman[239397]: 2025-09-30 21:37:25.137608701 +0000 UTC m=+0.032253919 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:37:25 compute-1 systemd[1]: Started libpod-conmon-11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9.scope.
Sep 30 21:37:25 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.272 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.273 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76ea02febd7a11107c9ac4ea5d02ca755cfdfbfad8871c3aef08d23ae07b0ca0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:37:25 compute-1 podman[239397]: 2025-09-30 21:37:25.288707187 +0000 UTC m=+0.183352315 container init 11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:37:25 compute-1 podman[239397]: 2025-09-30 21:37:25.293817587 +0000 UTC m=+0.188462695 container start 11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:37:25 compute-1 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[239419]: [NOTICE]   (239423) : New worker (239425) forked
Sep 30 21:37:25 compute-1 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[239419]: [NOTICE]   (239423) : Loading success.
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.361 2 DEBUG nova.compute.provider_tree [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.480 2 DEBUG nova.scheduler.client.report [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.619 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.688 2 INFO nova.scheduler.client.report [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Deleted allocations for instance d652090c-364f-4b7f-a5d7-6c6d568afaf8
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.692 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268245.6924045, 004d1ddf-f040-4c90-97d3-e34edd56c064 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.693 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] VM Started (Lifecycle Event)
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.753 2 DEBUG nova.network.neutron [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Updated VIF entry in instance network info cache for port 7eea3181-d9f9-4bb8-97ac-b18f21056b96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.754 2 DEBUG nova.network.neutron [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Updating instance_info_cache with network_info: [{"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:25.769 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:25.772 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.776 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.781 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268245.693107, 004d1ddf-f040-4c90-97d3-e34edd56c064 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.782 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] VM Paused (Lifecycle Event)
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.801 2 DEBUG oslo_concurrency.lockutils [req-84c378eb-08eb-4c53-8458-4486db5bbbb9 req-63d13a27-07a9-4f62-8f3a-6c701b510b62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-004d1ddf-f040-4c90-97d3-e34edd56c064" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.833 2 DEBUG nova.compute.manager [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received event network-vif-unplugged-af2b2485-3452-44f6-8b63-368cdd756a08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.833 2 DEBUG oslo_concurrency.lockutils [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.833 2 DEBUG oslo_concurrency.lockutils [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.834 2 DEBUG oslo_concurrency.lockutils [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.834 2 DEBUG nova.compute.manager [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] No waiting events found dispatching network-vif-unplugged-af2b2485-3452-44f6-8b63-368cdd756a08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.834 2 WARNING nova.compute.manager [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received unexpected event network-vif-unplugged-af2b2485-3452-44f6-8b63-368cdd756a08 for instance with vm_state deleted and task_state None.
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.834 2 DEBUG nova.compute.manager [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received event network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.834 2 DEBUG oslo_concurrency.lockutils [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.835 2 DEBUG oslo_concurrency.lockutils [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.835 2 DEBUG oslo_concurrency.lockutils [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.835 2 DEBUG nova.compute.manager [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] No waiting events found dispatching network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.835 2 WARNING nova.compute.manager [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received unexpected event network-vif-plugged-af2b2485-3452-44f6-8b63-368cdd756a08 for instance with vm_state deleted and task_state None.
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.836 2 DEBUG nova.compute.manager [req-8baca00b-942f-4c67-b3a2-79bc8bce7166 req-34dab1eb-f76e-4eee-94c5-1bf10c2bc4e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Received event network-vif-deleted-af2b2485-3452-44f6-8b63-368cdd756a08 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.848 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.853 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:25 compute-1 nova_compute[192795]: 2025-09-30 21:37:25.984 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:26 compute-1 nova_compute[192795]: 2025-09-30 21:37:26.163 2 DEBUG oslo_concurrency.lockutils [None req-4369de11-e6c6-409c-b75c-344985d4da42 3fb2d00c8e3a4805a2f71cad4ff37678 0d411f374dcb4367a0cbd1966b53998b - - default default] Lock "d652090c-364f-4b7f-a5d7-6c6d568afaf8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:26 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:26.775 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.140 2 DEBUG nova.compute.manager [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received event network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.141 2 DEBUG oslo_concurrency.lockutils [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.142 2 DEBUG oslo_concurrency.lockutils [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.142 2 DEBUG oslo_concurrency.lockutils [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.143 2 DEBUG nova.compute.manager [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Processing event network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.143 2 DEBUG nova.compute.manager [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received event network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.144 2 DEBUG oslo_concurrency.lockutils [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.144 2 DEBUG oslo_concurrency.lockutils [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.145 2 DEBUG oslo_concurrency.lockutils [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.145 2 DEBUG nova.compute.manager [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] No waiting events found dispatching network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.145 2 WARNING nova.compute.manager [req-b4d06ba5-aebb-45ad-99f8-803523f19f9e req-576822c5-2613-4e1a-8bdd-9e2df21168e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received unexpected event network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 for instance with vm_state building and task_state spawning.
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.147 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.151 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268248.1513991, 004d1ddf-f040-4c90-97d3-e34edd56c064 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.152 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] VM Resumed (Lifecycle Event)
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.158 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.163 2 INFO nova.virt.libvirt.driver [-] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Instance spawned successfully.
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.164 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.205 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.214 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.221 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.222 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.223 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.223 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.224 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.225 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.274 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:28 compute-1 podman[239434]: 2025-09-30 21:37:28.291231402 +0000 UTC m=+0.110543582 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.671 2 INFO nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Took 8.42 seconds to spawn the instance on the hypervisor.
Sep 30 21:37:28 compute-1 nova_compute[192795]: 2025-09-30 21:37:28.671 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:29 compute-1 nova_compute[192795]: 2025-09-30 21:37:29.123 2 INFO nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Took 9.78 seconds to build instance.
Sep 30 21:37:29 compute-1 nova_compute[192795]: 2025-09-30 21:37:29.411 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:33 compute-1 nova_compute[192795]: 2025-09-30 21:37:33.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:33 compute-1 podman[239462]: 2025-09-30 21:37:33.243653748 +0000 UTC m=+0.067997754 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:37:33 compute-1 podman[239455]: 2025-09-30 21:37:33.243675498 +0000 UTC m=+0.083801644 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public)
Sep 30 21:37:33 compute-1 podman[239456]: 2025-09-30 21:37:33.270528309 +0000 UTC m=+0.093547279 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:37:33 compute-1 nova_compute[192795]: 2025-09-30 21:37:33.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:37 compute-1 ovn_controller[94902]: 2025-09-30T21:37:37Z|00467|binding|INFO|Releasing lport 54338240-c8d5-4182-8bce-5e3e54cd794d from this chassis (sb_readonly=0)
Sep 30 21:37:37 compute-1 nova_compute[192795]: 2025-09-30 21:37:37.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:38 compute-1 nova_compute[192795]: 2025-09-30 21:37:38.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:38 compute-1 nova_compute[192795]: 2025-09-30 21:37:38.288 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268243.286693, d652090c-364f-4b7f-a5d7-6c6d568afaf8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:38 compute-1 nova_compute[192795]: 2025-09-30 21:37:38.289 2 INFO nova.compute.manager [-] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] VM Stopped (Lifecycle Event)
Sep 30 21:37:38 compute-1 nova_compute[192795]: 2025-09-30 21:37:38.365 2 DEBUG nova.compute.manager [None req-5b1cdd84-f49e-4e7e-b329-df860c519c75 - - - - - -] [instance: d652090c-364f-4b7f-a5d7-6c6d568afaf8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:38 compute-1 nova_compute[192795]: 2025-09-30 21:37:38.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:38.697 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:38.698 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:38.698 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:42 compute-1 ovn_controller[94902]: 2025-09-30T21:37:42Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:77:a9 10.100.0.14
Sep 30 21:37:42 compute-1 ovn_controller[94902]: 2025-09-30T21:37:42Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:77:a9 10.100.0.14
Sep 30 21:37:43 compute-1 nova_compute[192795]: 2025-09-30 21:37:43.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:43 compute-1 nova_compute[192795]: 2025-09-30 21:37:43.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 podman[239526]: 2025-09-30 21:37:44.215196966 +0000 UTC m=+0.055318103 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.330 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.330 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.331 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.331 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.332 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.345 2 INFO nova.compute.manager [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Terminating instance
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.365 2 DEBUG nova.compute.manager [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:37:44 compute-1 kernel: tap7eea3181-d9 (unregistering): left promiscuous mode
Sep 30 21:37:44 compute-1 NetworkManager[51724]: <info>  [1759268264.3898] device (tap7eea3181-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:37:44 compute-1 ovn_controller[94902]: 2025-09-30T21:37:44Z|00468|binding|INFO|Releasing lport 7eea3181-d9f9-4bb8-97ac-b18f21056b96 from this chassis (sb_readonly=0)
Sep 30 21:37:44 compute-1 ovn_controller[94902]: 2025-09-30T21:37:44Z|00469|binding|INFO|Setting lport 7eea3181-d9f9-4bb8-97ac-b18f21056b96 down in Southbound
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 ovn_controller[94902]: 2025-09-30T21:37:44Z|00470|binding|INFO|Removing iface tap7eea3181-d9 ovn-installed in OVS
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000072.scope: Deactivated successfully.
Sep 30 21:37:44 compute-1 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000072.scope: Consumed 14.269s CPU time.
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.468 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:77:a9 10.100.0.14'], port_security=['fa:16:3e:f4:77:a9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '004d1ddf-f040-4c90-97d3-e34edd56c064', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27ad5d27d7a44404987f6bf297897a45', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd191d10a-3559-4b1f-8832-a4135c99c39a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8f74c6-45db-4076-ad90-3fc15f718204, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=7eea3181-d9f9-4bb8-97ac-b18f21056b96) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.470 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 7eea3181-d9f9-4bb8-97ac-b18f21056b96 in datapath 6815a373-206c-4f26-aa16-cc2d72d0f14d unbound from our chassis
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.471 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6815a373-206c-4f26-aa16-cc2d72d0f14d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.472 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0f822d-89e9-466c-858b-dd1ae447b221]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.473 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d namespace which is not needed anymore
Sep 30 21:37:44 compute-1 systemd-machined[152783]: Machine qemu-56-instance-00000072 terminated.
Sep 30 21:37:44 compute-1 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[239419]: [NOTICE]   (239423) : haproxy version is 2.8.14-c23fe91
Sep 30 21:37:44 compute-1 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[239419]: [NOTICE]   (239423) : path to executable is /usr/sbin/haproxy
Sep 30 21:37:44 compute-1 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[239419]: [WARNING]  (239423) : Exiting Master process...
Sep 30 21:37:44 compute-1 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[239419]: [ALERT]    (239423) : Current worker (239425) exited with code 143 (Terminated)
Sep 30 21:37:44 compute-1 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[239419]: [WARNING]  (239423) : All workers exited. Exiting... (0)
Sep 30 21:37:44 compute-1 systemd[1]: libpod-11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9.scope: Deactivated successfully.
Sep 30 21:37:44 compute-1 conmon[239419]: conmon 11d29490cd59d7fc0194 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9.scope/container/memory.events
Sep 30 21:37:44 compute-1 podman[239571]: 2025-09-30 21:37:44.626757386 +0000 UTC m=+0.056845493 container died 11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:37:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9-userdata-shm.mount: Deactivated successfully.
Sep 30 21:37:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-76ea02febd7a11107c9ac4ea5d02ca755cfdfbfad8871c3aef08d23ae07b0ca0-merged.mount: Deactivated successfully.
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.663 2 INFO nova.virt.libvirt.driver [-] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Instance destroyed successfully.
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.664 2 DEBUG nova.objects.instance [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lazy-loading 'resources' on Instance uuid 004d1ddf-f040-4c90-97d3-e34edd56c064 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:44 compute-1 podman[239571]: 2025-09-30 21:37:44.667100156 +0000 UTC m=+0.097188273 container cleanup 11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.680 2 DEBUG nova.virt.libvirt.vif [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:37:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-2130176929',display_name='tempest-ListServersNegativeTestJSON-server-2130176929-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-2130176929-2',id=114,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-09-30T21:37:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='27ad5d27d7a44404987f6bf297897a45',ramdisk_id='',reservation_id='r-msrdvef1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1844496645',owner_user_name='tempest-ListServersNegativeTestJSON-1844496645-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:37:28Z,user_data=None,user_id='cbebeb1f78b64ee09e2da39a04f9f282',uuid=004d1ddf-f040-4c90-97d3-e34edd56c064,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.681 2 DEBUG nova.network.os_vif_util [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converting VIF {"id": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "address": "fa:16:3e:f4:77:a9", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eea3181-d9", "ovs_interfaceid": "7eea3181-d9f9-4bb8-97ac-b18f21056b96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.682 2 DEBUG nova.network.os_vif_util [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:77:a9,bridge_name='br-int',has_traffic_filtering=True,id=7eea3181-d9f9-4bb8-97ac-b18f21056b96,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea3181-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.683 2 DEBUG os_vif [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:77:a9,bridge_name='br-int',has_traffic_filtering=True,id=7eea3181-d9f9-4bb8-97ac-b18f21056b96,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea3181-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.685 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7eea3181-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.715 2 INFO os_vif [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:77:a9,bridge_name='br-int',has_traffic_filtering=True,id=7eea3181-d9f9-4bb8-97ac-b18f21056b96,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eea3181-d9')
Sep 30 21:37:44 compute-1 systemd[1]: libpod-conmon-11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9.scope: Deactivated successfully.
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.716 2 INFO nova.virt.libvirt.driver [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Deleting instance files /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064_del
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.718 2 INFO nova.virt.libvirt.driver [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Deletion of /var/lib/nova/instances/004d1ddf-f040-4c90-97d3-e34edd56c064_del complete
Sep 30 21:37:44 compute-1 podman[239614]: 2025-09-30 21:37:44.748025053 +0000 UTC m=+0.051263113 container remove 11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.753 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ac3dbd58-f258-469a-ab9e-003b91f1ce10]: (4, ('Tue Sep 30 09:37:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d (11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9)\n11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9\nTue Sep 30 09:37:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d (11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9)\n11d29490cd59d7fc0194bb37c636fccd99501c74970e48aa2b63ed68084b4bb9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.755 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[596e3e74-dbd7-40ac-8547-dbd2ad4d8bac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.756 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6815a373-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 kernel: tap6815a373-20: left promiscuous mode
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.772 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[982398b0-c06c-4458-97fc-71538f4e7844]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.802 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c4273c6c-7132-497b-bf7a-ee72587f5fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.804 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6c2469-2860-49de-91d2-898ed2dd6757]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.817 2 INFO nova.compute.manager [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Took 0.45 seconds to destroy the instance on the hypervisor.
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.817 2 DEBUG oslo.service.loopingcall [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.818 2 DEBUG nova.compute.manager [-] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:37:44 compute-1 nova_compute[192795]: 2025-09-30 21:37:44.818 2 DEBUG nova.network.neutron [-] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.823 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[77a6b5c5-e28b-4b68-932a-73224dda5fdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494012, 'reachable_time': 21423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239632, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.826 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:37:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:37:44.826 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a8bef0fc-c7df-4f6e-bf5c-79c0ba3d8549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-1 systemd[1]: run-netns-ovnmeta\x2d6815a373\x2d206c\x2d4f26\x2daa16\x2dcc2d72d0f14d.mount: Deactivated successfully.
Sep 30 21:37:45 compute-1 nova_compute[192795]: 2025-09-30 21:37:45.184 2 DEBUG nova.compute.manager [req-46a533bd-8c1d-4543-aad5-3bc813f2fd81 req-5d3f440f-3612-42c0-9d77-de72da139870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received event network-vif-unplugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:45 compute-1 nova_compute[192795]: 2025-09-30 21:37:45.185 2 DEBUG oslo_concurrency.lockutils [req-46a533bd-8c1d-4543-aad5-3bc813f2fd81 req-5d3f440f-3612-42c0-9d77-de72da139870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:45 compute-1 nova_compute[192795]: 2025-09-30 21:37:45.185 2 DEBUG oslo_concurrency.lockutils [req-46a533bd-8c1d-4543-aad5-3bc813f2fd81 req-5d3f440f-3612-42c0-9d77-de72da139870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:45 compute-1 nova_compute[192795]: 2025-09-30 21:37:45.185 2 DEBUG oslo_concurrency.lockutils [req-46a533bd-8c1d-4543-aad5-3bc813f2fd81 req-5d3f440f-3612-42c0-9d77-de72da139870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:45 compute-1 nova_compute[192795]: 2025-09-30 21:37:45.186 2 DEBUG nova.compute.manager [req-46a533bd-8c1d-4543-aad5-3bc813f2fd81 req-5d3f440f-3612-42c0-9d77-de72da139870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] No waiting events found dispatching network-vif-unplugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:45 compute-1 nova_compute[192795]: 2025-09-30 21:37:45.186 2 DEBUG nova.compute.manager [req-46a533bd-8c1d-4543-aad5-3bc813f2fd81 req-5d3f440f-3612-42c0-9d77-de72da139870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received event network-vif-unplugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.038 2 DEBUG nova.network.neutron [-] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.069 2 INFO nova.compute.manager [-] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Took 1.25 seconds to deallocate network for instance.
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.269 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.270 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.384 2 DEBUG nova.compute.provider_tree [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.431 2 DEBUG nova.scheduler.client.report [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.454 2 DEBUG nova.compute.manager [req-9c09081e-6ab2-4683-bc9b-1d634b4ffb18 req-8b784efc-b0f8-4886-a55e-b3f2b50d8f0b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received event network-vif-deleted-7eea3181-d9f9-4bb8-97ac-b18f21056b96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.484 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.529 2 INFO nova.scheduler.client.report [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Deleted allocations for instance 004d1ddf-f040-4c90-97d3-e34edd56c064
Sep 30 21:37:46 compute-1 nova_compute[192795]: 2025-09-30 21:37:46.640 2 DEBUG oslo_concurrency.lockutils [None req-b3543944-26fe-4ce1-b8ee-c4d458c1f5f6 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:47 compute-1 nova_compute[192795]: 2025-09-30 21:37:47.530 2 DEBUG nova.compute.manager [req-b2ee3991-967a-4dac-9f3c-5848d7054db4 req-e886c6b8-be12-4e2e-9120-566ad0c96525 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received event network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:47 compute-1 nova_compute[192795]: 2025-09-30 21:37:47.530 2 DEBUG oslo_concurrency.lockutils [req-b2ee3991-967a-4dac-9f3c-5848d7054db4 req-e886c6b8-be12-4e2e-9120-566ad0c96525 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:47 compute-1 nova_compute[192795]: 2025-09-30 21:37:47.530 2 DEBUG oslo_concurrency.lockutils [req-b2ee3991-967a-4dac-9f3c-5848d7054db4 req-e886c6b8-be12-4e2e-9120-566ad0c96525 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:47 compute-1 nova_compute[192795]: 2025-09-30 21:37:47.531 2 DEBUG oslo_concurrency.lockutils [req-b2ee3991-967a-4dac-9f3c-5848d7054db4 req-e886c6b8-be12-4e2e-9120-566ad0c96525 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "004d1ddf-f040-4c90-97d3-e34edd56c064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:47 compute-1 nova_compute[192795]: 2025-09-30 21:37:47.531 2 DEBUG nova.compute.manager [req-b2ee3991-967a-4dac-9f3c-5848d7054db4 req-e886c6b8-be12-4e2e-9120-566ad0c96525 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] No waiting events found dispatching network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:47 compute-1 nova_compute[192795]: 2025-09-30 21:37:47.531 2 WARNING nova.compute.manager [req-b2ee3991-967a-4dac-9f3c-5848d7054db4 req-e886c6b8-be12-4e2e-9120-566ad0c96525 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Received unexpected event network-vif-plugged-7eea3181-d9f9-4bb8-97ac-b18f21056b96 for instance with vm_state deleted and task_state None.
Sep 30 21:37:48 compute-1 nova_compute[192795]: 2025-09-30 21:37:48.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:49 compute-1 nova_compute[192795]: 2025-09-30 21:37:49.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:49 compute-1 nova_compute[192795]: 2025-09-30 21:37:49.723 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.728 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:37:52 compute-1 podman[239634]: 2025-09-30 21:37:52.841147464 +0000 UTC m=+0.068071643 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:37:52 compute-1 podman[239639]: 2025-09-30 21:37:52.856749253 +0000 UTC m=+0.068919548 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:37:52 compute-1 podman[239635]: 2025-09-30 21:37:52.866802292 +0000 UTC m=+0.088303956 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.937 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.939 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5700MB free_disk=73.3172836303711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.940 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:52 compute-1 nova_compute[192795]: 2025-09-30 21:37:52.940 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:53 compute-1 nova_compute[192795]: 2025-09-30 21:37:53.017 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:37:53 compute-1 nova_compute[192795]: 2025-09-30 21:37:53.018 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:37:53 compute-1 nova_compute[192795]: 2025-09-30 21:37:53.049 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:53 compute-1 nova_compute[192795]: 2025-09-30 21:37:53.064 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:53 compute-1 nova_compute[192795]: 2025-09-30 21:37:53.124 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:37:53 compute-1 nova_compute[192795]: 2025-09-30 21:37:53.125 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:53 compute-1 nova_compute[192795]: 2025-09-30 21:37:53.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:54 compute-1 nova_compute[192795]: 2025-09-30 21:37:54.126 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:54 compute-1 nova_compute[192795]: 2025-09-30 21:37:54.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:55 compute-1 nova_compute[192795]: 2025-09-30 21:37:55.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:58 compute-1 nova_compute[192795]: 2025-09-30 21:37:58.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:58 compute-1 nova_compute[192795]: 2025-09-30 21:37:58.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:59 compute-1 podman[239708]: 2025-09-30 21:37:59.241100867 +0000 UTC m=+0.075927714 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Sep 30 21:37:59 compute-1 nova_compute[192795]: 2025-09-30 21:37:59.660 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268264.6592188, 004d1ddf-f040-4c90-97d3-e34edd56c064 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:59 compute-1 nova_compute[192795]: 2025-09-30 21:37:59.662 2 INFO nova.compute.manager [-] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] VM Stopped (Lifecycle Event)
Sep 30 21:37:59 compute-1 nova_compute[192795]: 2025-09-30 21:37:59.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:59 compute-1 nova_compute[192795]: 2025-09-30 21:37:59.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:59 compute-1 nova_compute[192795]: 2025-09-30 21:37:59.761 2 DEBUG nova.compute.manager [None req-4f5dfba5-1c5b-4a78-9dcb-63a5fc2809ba - - - - - -] [instance: 004d1ddf-f040-4c90-97d3-e34edd56c064] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.641 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.642 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.660 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.786 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.787 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.797 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.797 2 INFO nova.compute.claims [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.964 2 DEBUG nova.compute.provider_tree [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:01 compute-1 nova_compute[192795]: 2025-09-30 21:38:01.983 2 DEBUG nova.scheduler.client.report [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.006 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.007 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.101 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.102 2 DEBUG nova.network.neutron [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.137 2 INFO nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.171 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.309 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.310 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.310 2 INFO nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Creating image(s)
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.311 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "/var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.311 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.312 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.324 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.385 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.386 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.386 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.398 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.444 2 DEBUG nova.policy [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.455 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.456 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.490 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.491 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.492 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.550 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.551 2 DEBUG nova.virt.disk.api [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Checking if we can resize image /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.551 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.618 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.619 2 DEBUG nova.virt.disk.api [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Cannot resize image /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.620 2 DEBUG nova.objects.instance [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'migration_context' on Instance uuid 628fd442-ed35-482c-91db-4a57f527b6a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.666 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.667 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Ensure instance console log exists: /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.667 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.668 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.668 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:02 compute-1 nova_compute[192795]: 2025-09-30 21:38:02.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:03 compute-1 nova_compute[192795]: 2025-09-30 21:38:03.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:03 compute-1 nova_compute[192795]: 2025-09-30 21:38:03.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:03 compute-1 nova_compute[192795]: 2025-09-30 21:38:03.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:38:03 compute-1 nova_compute[192795]: 2025-09-30 21:38:03.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:38:03 compute-1 nova_compute[192795]: 2025-09-30 21:38:03.721 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:38:03 compute-1 nova_compute[192795]: 2025-09-30 21:38:03.721 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:38:03 compute-1 nova_compute[192795]: 2025-09-30 21:38:03.976 2 DEBUG nova.network.neutron [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Successfully created port: 6be1fbd6-5748-4b3c-af41-2287770931ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:38:04 compute-1 podman[239743]: 2025-09-30 21:38:04.210456244 +0000 UTC m=+0.056042181 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:38:04 compute-1 podman[239744]: 2025-09-30 21:38:04.235765492 +0000 UTC m=+0.076523440 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:38:04 compute-1 podman[239745]: 2025-09-30 21:38:04.235963677 +0000 UTC m=+0.075303377 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:38:04 compute-1 nova_compute[192795]: 2025-09-30 21:38:04.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.301 2 DEBUG nova.network.neutron [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Successfully updated port: 6be1fbd6-5748-4b3c-af41-2287770931ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.317 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.318 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquired lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.318 2 DEBUG nova.network.neutron [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.464 2 DEBUG nova.compute.manager [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received event network-changed-6be1fbd6-5748-4b3c-af41-2287770931ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.465 2 DEBUG nova.compute.manager [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Refreshing instance network info cache due to event network-changed-6be1fbd6-5748-4b3c-af41-2287770931ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.465 2 DEBUG oslo_concurrency.lockutils [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:05 compute-1 nova_compute[192795]: 2025-09-30 21:38:05.914 2 DEBUG nova.network.neutron [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.201 2 DEBUG nova.network.neutron [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Updating instance_info_cache with network_info: [{"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.224 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Releasing lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.224 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Instance network_info: |[{"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.225 2 DEBUG oslo_concurrency.lockutils [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.225 2 DEBUG nova.network.neutron [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Refreshing network info cache for port 6be1fbd6-5748-4b3c-af41-2287770931ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.228 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Start _get_guest_xml network_info=[{"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.233 2 WARNING nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.238 2 DEBUG nova.virt.libvirt.host [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.239 2 DEBUG nova.virt.libvirt.host [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.241 2 DEBUG nova.virt.libvirt.host [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.242 2 DEBUG nova.virt.libvirt.host [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.243 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.243 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.244 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.244 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.244 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.244 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.244 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.245 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.245 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.245 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.245 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.245 2 DEBUG nova.virt.hardware [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.248 2 DEBUG nova.virt.libvirt.vif [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-2040743504',display_name='tempest-₡-2040743504',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--2040743504',id=118,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-deam0ef4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:02Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=628fd442-ed35-482c-91db-4a57f527b6a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.249 2 DEBUG nova.network.os_vif_util [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.249 2 DEBUG nova.network.os_vif_util [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:23:75,bridge_name='br-int',has_traffic_filtering=True,id=6be1fbd6-5748-4b3c-af41-2287770931ef,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6be1fbd6-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.250 2 DEBUG nova.objects.instance [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 628fd442-ed35-482c-91db-4a57f527b6a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.267 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <uuid>628fd442-ed35-482c-91db-4a57f527b6a8</uuid>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <name>instance-00000076</name>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <nova:name>tempest-₡-2040743504</nova:name>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:38:07</nova:creationTime>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:user uuid="30d0a975d78c4d9a8e2201afdc040092">tempest-ServersTestJSON-782690373-project-member</nova:user>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:project uuid="8ad754242d964bb487a2174b2c21bcc5">tempest-ServersTestJSON-782690373</nova:project>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         <nova:port uuid="6be1fbd6-5748-4b3c-af41-2287770931ef">
Sep 30 21:38:07 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <system>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <entry name="serial">628fd442-ed35-482c-91db-4a57f527b6a8</entry>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <entry name="uuid">628fd442-ed35-482c-91db-4a57f527b6a8</entry>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </system>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <os>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   </os>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <features>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   </features>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk.config"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:c6:23:75"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <target dev="tap6be1fbd6-57"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/console.log" append="off"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <video>
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </video>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:38:07 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:38:07 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:38:07 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:38:07 compute-1 nova_compute[192795]: </domain>
Sep 30 21:38:07 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.268 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Preparing to wait for external event network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.269 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.269 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.269 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.270 2 DEBUG nova.virt.libvirt.vif [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-2040743504',display_name='tempest-₡-2040743504',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--2040743504',id=118,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-deam0ef4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:02Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=628fd442-ed35-482c-91db-4a57f527b6a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.271 2 DEBUG nova.network.os_vif_util [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.271 2 DEBUG nova.network.os_vif_util [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:23:75,bridge_name='br-int',has_traffic_filtering=True,id=6be1fbd6-5748-4b3c-af41-2287770931ef,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6be1fbd6-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.272 2 DEBUG os_vif [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:23:75,bridge_name='br-int',has_traffic_filtering=True,id=6be1fbd6-5748-4b3c-af41-2287770931ef,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6be1fbd6-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.273 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.273 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.277 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6be1fbd6-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.277 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6be1fbd6-57, col_values=(('external_ids', {'iface-id': '6be1fbd6-5748-4b3c-af41-2287770931ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:23:75', 'vm-uuid': '628fd442-ed35-482c-91db-4a57f527b6a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:07 compute-1 NetworkManager[51724]: <info>  [1759268287.3365] manager: (tap6be1fbd6-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.344 2 INFO os_vif [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:23:75,bridge_name='br-int',has_traffic_filtering=True,id=6be1fbd6-5748-4b3c-af41-2287770931ef,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6be1fbd6-57')
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.412 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.412 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.413 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No VIF found with MAC fa:16:3e:c6:23:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:38:07 compute-1 nova_compute[192795]: 2025-09-30 21:38:07.413 2 INFO nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Using config drive
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.054 2 INFO nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Creating config drive at /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk.config
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.060 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa41ucumh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.191 2 DEBUG oslo_concurrency.processutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa41ucumh" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 kernel: tap6be1fbd6-57: entered promiscuous mode
Sep 30 21:38:08 compute-1 NetworkManager[51724]: <info>  [1759268288.2697] manager: (tap6be1fbd6-57): new Tun device (/org/freedesktop/NetworkManager/Devices/235)
Sep 30 21:38:08 compute-1 ovn_controller[94902]: 2025-09-30T21:38:08Z|00471|binding|INFO|Claiming lport 6be1fbd6-5748-4b3c-af41-2287770931ef for this chassis.
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 ovn_controller[94902]: 2025-09-30T21:38:08Z|00472|binding|INFO|6be1fbd6-5748-4b3c-af41-2287770931ef: Claiming fa:16:3e:c6:23:75 10.100.0.5
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.291 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:23:75 10.100.0.5'], port_security=['fa:16:3e:c6:23:75 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=6be1fbd6-5748-4b3c-af41-2287770931ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.292 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 6be1fbd6-5748-4b3c-af41-2287770931ef in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c bound to our chassis
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.294 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.312 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5788f843-acb8-4b57-b142-ab280d30a783]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.316 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap27086519-61 in ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.318 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap27086519-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.318 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6e2a87c6-63e0-4768-b713-9a998223bb4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.319 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff78aec-996d-48ea-8685-b4bf7cd7f4f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 systemd-udevd[239820]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.333 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa2dec3-4f50-4d22-9d21-9ae66eb9ac1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 systemd-machined[152783]: New machine qemu-57-instance-00000076.
Sep 30 21:38:08 compute-1 ovn_controller[94902]: 2025-09-30T21:38:08Z|00473|binding|INFO|Setting lport 6be1fbd6-5748-4b3c-af41-2287770931ef ovn-installed in OVS
Sep 30 21:38:08 compute-1 ovn_controller[94902]: 2025-09-30T21:38:08Z|00474|binding|INFO|Setting lport 6be1fbd6-5748-4b3c-af41-2287770931ef up in Southbound
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 NetworkManager[51724]: <info>  [1759268288.3443] device (tap6be1fbd6-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:38:08 compute-1 NetworkManager[51724]: <info>  [1759268288.3451] device (tap6be1fbd6-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.349 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[57e90438-b57e-4be9-95b7-be75721607fe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 systemd[1]: Started Virtual Machine qemu-57-instance-00000076.
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.381 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff5acd3-c4af-45c2-9a11-94412f85cbb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.387 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d68d2b5d-c848-4091-a808-a2c71c1606bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 NetworkManager[51724]: <info>  [1759268288.3889] manager: (tap27086519-60): new Veth device (/org/freedesktop/NetworkManager/Devices/236)
Sep 30 21:38:08 compute-1 systemd-udevd[239825]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.423 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ec91ce-30c9-4359-bc1c-a4d7c2e343f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.426 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e99baacb-50b1-42fd-8b2b-2fad0e010cbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 NetworkManager[51724]: <info>  [1759268288.4572] device (tap27086519-60): carrier: link connected
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.466 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[99657a14-3f7f-42f8-a0b6-435c5fde3e0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.487 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[83bbb544-aeb7-4abf-ba2b-cc8cbafc7bc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498413, 'reachable_time': 26194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239853, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.504 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2b693417-5f6d-48dd-8d74-1f5f4f7f7dc0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:b9e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498413, 'tstamp': 498413}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239854, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.523 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[191f4150-a0c6-4a7a-b476-601a2c3a2ef2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498413, 'reachable_time': 26194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239855, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.571 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef1fb82-c295-421f-964f-fd32d4de6492]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.639 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb4effd-b388-457a-8826-fad976d6f4d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.641 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.641 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.642 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:08 compute-1 NetworkManager[51724]: <info>  [1759268288.6445] manager: (tap27086519-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 kernel: tap27086519-60: entered promiscuous mode
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.646 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 ovn_controller[94902]: 2025-09-30T21:38:08Z|00475|binding|INFO|Releasing lport f2abb4ad-797b-4767-b8bc-377990516394 from this chassis (sb_readonly=0)
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.660 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.661 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfdeddb-e12d-4666-bf3e-f01e619c1c02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.662 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:38:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:08.663 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'env', 'PROCESS_TAG=haproxy-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/27086519-6f4c-45f9-8e5b-5b321cd6871c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:38:08 compute-1 nova_compute[192795]: 2025-09-30 21:38:08.716 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:09 compute-1 podman[239895]: 2025-09-30 21:38:09.021594444 +0000 UTC m=+0.060398798 container create 6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.040 2 DEBUG nova.network.neutron [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Updated VIF entry in instance network info cache for port 6be1fbd6-5748-4b3c-af41-2287770931ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.041 2 DEBUG nova.network.neutron [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Updating instance_info_cache with network_info: [{"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:09 compute-1 systemd[1]: Started libpod-conmon-6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b.scope.
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.063 2 DEBUG oslo_concurrency.lockutils [req-5a0cb3e4-ab6b-44dc-81d2-11cdd36e8de4 req-e0336041-fc85-469c-a980-d1efa57e0cf0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:09 compute-1 podman[239895]: 2025-09-30 21:38:08.983560606 +0000 UTC m=+0.022364980 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:38:09 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:38:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69ad8829d099c2530ffca2473d4e69370e3090c06badb8d726547a2d2d2c19c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:38:09 compute-1 podman[239895]: 2025-09-30 21:38:09.097040314 +0000 UTC m=+0.135844688 container init 6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 21:38:09 compute-1 podman[239895]: 2025-09-30 21:38:09.103477566 +0000 UTC m=+0.142281910 container start 6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:38:09 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [NOTICE]   (239914) : New worker (239916) forked
Sep 30 21:38:09 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [NOTICE]   (239914) : Loading success.
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.385 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268289.3844202, 628fd442-ed35-482c-91db-4a57f527b6a8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.385 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] VM Started (Lifecycle Event)
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.422 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.428 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268289.3847854, 628fd442-ed35-482c-91db-4a57f527b6a8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.428 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] VM Paused (Lifecycle Event)
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.450 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.454 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:09 compute-1 nova_compute[192795]: 2025-09-30 21:38:09.487 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:12 compute-1 nova_compute[192795]: 2025-09-30 21:38:12.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.258 2 DEBUG nova.compute.manager [req-88842d60-1058-47fe-9d11-7403399c8afe req-44017e89-acdc-427e-9849-6c5b52d987e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received event network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.259 2 DEBUG oslo_concurrency.lockutils [req-88842d60-1058-47fe-9d11-7403399c8afe req-44017e89-acdc-427e-9849-6c5b52d987e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.259 2 DEBUG oslo_concurrency.lockutils [req-88842d60-1058-47fe-9d11-7403399c8afe req-44017e89-acdc-427e-9849-6c5b52d987e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.259 2 DEBUG oslo_concurrency.lockutils [req-88842d60-1058-47fe-9d11-7403399c8afe req-44017e89-acdc-427e-9849-6c5b52d987e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.259 2 DEBUG nova.compute.manager [req-88842d60-1058-47fe-9d11-7403399c8afe req-44017e89-acdc-427e-9849-6c5b52d987e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Processing event network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.260 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.266 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268293.2657285, 628fd442-ed35-482c-91db-4a57f527b6a8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.268 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] VM Resumed (Lifecycle Event)
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.272 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.276 2 INFO nova.virt.libvirt.driver [-] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Instance spawned successfully.
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.276 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.303 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.313 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.319 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.320 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.321 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.321 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.322 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.323 2 DEBUG nova.virt.libvirt.driver [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.358 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.414 2 INFO nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Took 11.10 seconds to spawn the instance on the hypervisor.
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.415 2 DEBUG nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.514 2 INFO nova.compute.manager [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Took 11.78 seconds to build instance.
Sep 30 21:38:13 compute-1 nova_compute[192795]: 2025-09-30 21:38:13.534 2 DEBUG oslo_concurrency.lockutils [None req-e2d3badf-eaa7-451d-8c8d-9055f7036023 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:15 compute-1 podman[239925]: 2025-09-30 21:38:15.228234051 +0000 UTC m=+0.065364221 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:38:15 compute-1 nova_compute[192795]: 2025-09-30 21:38:15.397 2 DEBUG nova.compute.manager [req-a8ac551b-45f6-476a-8664-28cfceed2096 req-1012ca02-e4ed-46dd-8fab-44c23b9f1e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received event network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:15 compute-1 nova_compute[192795]: 2025-09-30 21:38:15.398 2 DEBUG oslo_concurrency.lockutils [req-a8ac551b-45f6-476a-8664-28cfceed2096 req-1012ca02-e4ed-46dd-8fab-44c23b9f1e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:15 compute-1 nova_compute[192795]: 2025-09-30 21:38:15.398 2 DEBUG oslo_concurrency.lockutils [req-a8ac551b-45f6-476a-8664-28cfceed2096 req-1012ca02-e4ed-46dd-8fab-44c23b9f1e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:15 compute-1 nova_compute[192795]: 2025-09-30 21:38:15.398 2 DEBUG oslo_concurrency.lockutils [req-a8ac551b-45f6-476a-8664-28cfceed2096 req-1012ca02-e4ed-46dd-8fab-44c23b9f1e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:15 compute-1 nova_compute[192795]: 2025-09-30 21:38:15.398 2 DEBUG nova.compute.manager [req-a8ac551b-45f6-476a-8664-28cfceed2096 req-1012ca02-e4ed-46dd-8fab-44c23b9f1e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] No waiting events found dispatching network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:38:15 compute-1 nova_compute[192795]: 2025-09-30 21:38:15.399 2 WARNING nova.compute.manager [req-a8ac551b-45f6-476a-8664-28cfceed2096 req-1012ca02-e4ed-46dd-8fab-44c23b9f1e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received unexpected event network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef for instance with vm_state active and task_state None.
Sep 30 21:38:17 compute-1 nova_compute[192795]: 2025-09-30 21:38:17.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:18 compute-1 nova_compute[192795]: 2025-09-30 21:38:18.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:22 compute-1 nova_compute[192795]: 2025-09-30 21:38:22.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:23 compute-1 podman[239946]: 2025-09-30 21:38:23.239583163 +0000 UTC m=+0.078793551 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Sep 30 21:38:23 compute-1 nova_compute[192795]: 2025-09-30 21:38:23.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:23 compute-1 podman[239948]: 2025-09-30 21:38:23.261239743 +0000 UTC m=+0.084067812 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:38:23 compute-1 podman[239947]: 2025-09-30 21:38:23.292795528 +0000 UTC m=+0.116286415 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 21:38:25 compute-1 ovn_controller[94902]: 2025-09-30T21:38:25Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c6:23:75 10.100.0.5
Sep 30 21:38:25 compute-1 ovn_controller[94902]: 2025-09-30T21:38:25Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c6:23:75 10.100.0.5
Sep 30 21:38:27 compute-1 nova_compute[192795]: 2025-09-30 21:38:27.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:28 compute-1 nova_compute[192795]: 2025-09-30 21:38:28.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:30 compute-1 nova_compute[192795]: 2025-09-30 21:38:30.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:30.241 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:30.244 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:38:30 compute-1 podman[240029]: 2025-09-30 21:38:30.25725215 +0000 UTC m=+0.089224391 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:38:32 compute-1 nova_compute[192795]: 2025-09-30 21:38:32.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:33.247 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:33 compute-1 nova_compute[192795]: 2025-09-30 21:38:33.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:35 compute-1 podman[240050]: 2025-09-30 21:38:35.243248892 +0000 UTC m=+0.079005327 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:38:35 compute-1 podman[240051]: 2025-09-30 21:38:35.276318027 +0000 UTC m=+0.099342532 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:38:35 compute-1 podman[240049]: 2025-09-30 21:38:35.279882152 +0000 UTC m=+0.120542278 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Sep 30 21:38:37 compute-1 nova_compute[192795]: 2025-09-30 21:38:37.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-1 nova_compute[192795]: 2025-09-30 21:38:38.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:38.697 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:38.698 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:38.698 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:42 compute-1 nova_compute[192795]: 2025-09-30 21:38:42.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:43 compute-1 nova_compute[192795]: 2025-09-30 21:38:43.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.022 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'name': 'tempest-₡-2040743504', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000076', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8ad754242d964bb487a2174b2c21bcc5', 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'hostId': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.024 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.057 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/cpu volume: 12140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44772c5f-7e06-482b-a9f8-69dd828f2f67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12140000000, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'timestamp': '2025-09-30T21:38:44.024233', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd6b4a94a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.789762287, 'message_signature': '12858c509fbadf3689aaf9a999159477a96ac89f8ca4dfdaad766a3d8b2eebd0'}]}, 'timestamp': '2025-09-30 21:38:44.058488', '_unique_id': '79bd8a19895149ff9b399b17cf88787e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.060 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.078 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.079 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7330abd-8a50-4932-8e66-85916747fd4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.063025', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6b7da7a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.795833869, 'message_signature': '61ca97845def7a86dbaa8b4c9bf4cb7b936a70c794da550ad5084f793be1d7ca'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.063025', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6b7eb6e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.795833869, 'message_signature': '7d4a522d821f1337f615ec494906d3106f8d52b158740e6d1b16c96ec4bf921f'}]}, 'timestamp': '2025-09-30 21:38:44.079660', '_unique_id': 'b9c754f2d0f244669449f6c32b729a1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-₡-2040743504>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-2040743504>]
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-₡-2040743504>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-2040743504>]
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.103 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.read.requests volume: 1075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.103 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74e38a48-37bb-4dc3-9bf6-a1ac6c8fe40b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1075, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.083017', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6bb9c78-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '1d07173c551c7c224d43c976433fdcfd26349ef8b86068fabf5b2987116ad894'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.083017', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6bba7a4-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '47be345d2931f01d25b98e8c60b3d0ac1b6eeb3f3de403e283255802a767585c'}]}, 'timestamp': '2025-09-30 21:38:44.104093', '_unique_id': 'a4f96adf79844fe1bf5cf20900a4ab4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.106 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.106 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-₡-2040743504>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-2040743504>]
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.109 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 628fd442-ed35-482c-91db-4a57f527b6a8 / tap6be1fbd6-57 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.109 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f22cc2c3-345f-4056-83ad-8e2f3da1b84b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.106344', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bc7cba-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': 'b34f0c52047cf65b628982b5de236d698c885230fd1dcf6e7033fc1e8457d13f'}]}, 'timestamp': '2025-09-30 21:38:44.109545', '_unique_id': '506225b0817a4afda96c28c1020844cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.110 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5bbdf03-d97a-4859-bdcc-f700a88ecc35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.110697', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bcb284-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': '5e88f88f690ae73ff2531ed1157b38c64759def759092d7a1c38cf569190b49e'}]}, 'timestamp': '2025-09-30 21:38:44.110918', '_unique_id': 'd8fafdf8079d4de18b8634f831419afe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.111 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.read.latency volume: 480648305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.read.latency volume: 57821383 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a81a0e1f-c89f-419b-b5a3-e469de63a8c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 480648305, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.111972', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6bce452-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '665f6a272efb6f84cc1ed4a9a41aad884cc2e282945b4dfdc0e9fdb29bedcacb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57821383, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.111972', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6bcebc8-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': 'ed4d697ebb8042ebb4e6e6bc4bd7bab6132088dbd4ca5d3b428717f8d5cfb6c1'}]}, 'timestamp': '2025-09-30 21:38:44.112389', '_unique_id': '019a058b16824e39b059c3a8756645a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.112 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.113 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b2020e4-79c3-40d5-a5e2-4394571f4bc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.113446', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bd1dfa-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': 'a54c1b85f07ee1a7e89318e2bed887ac912de733447a444fa0887eacae95d4aa'}]}, 'timestamp': '2025-09-30 21:38:44.113668', '_unique_id': '2e77ee106239468c8e340163261501c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.114 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '554d3591-0057-4af0-b9a8-fe0f44923ccc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.114728', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6bd4fe6-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.795833869, 'message_signature': 'd497b314b21b31cd5c5099211690d214b3c00928722799680ac095035efae841'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.114728', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6bd5752-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.795833869, 'message_signature': 'b8caf675322e02a693ef6e9caee6c4bf0a239761df954c81a106ba44eed23212'}]}, 'timestamp': '2025-09-30 21:38:44.115120', '_unique_id': 'ed82f7a0a90041d2ae3249aceaeab451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.115 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.incoming.bytes volume: 1652 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2f9885c-73a7-42ea-85eb-3f7a1486fcdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1652, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.116181', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bd897a-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': '1f7e9802c13e7526731a528c182bbc434ae26bf573715eba258e9ea7c2431e7f'}]}, 'timestamp': '2025-09-30 21:38:44.116421', '_unique_id': '8e62a38365344c9ab9714407d72c5389'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.117 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc70c752-b8a1-40e8-8062-37379a2b02e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.117462', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bdbada-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': 'd253e610d0c15b86d03cb95858956b862783887e23f2fa3bf7ebf04eff2992dc'}]}, 'timestamp': '2025-09-30 21:38:44.117683', '_unique_id': '2d23148fad68478eaf4c436b26f190ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.118 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61bb4dbe-ad02-4220-922d-e720be9b68cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.118742', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bdecd0-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': '56b78eb03e2d3b84c41f6b1c06a32ab4e9357de6823c01a156a64c635a4c27ef'}]}, 'timestamp': '2025-09-30 21:38:44.118964', '_unique_id': '736f9e28ff2f4551a76569106ca18582'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.119 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/memory.usage volume: 42.453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a63e434a-7637-489d-9413-ea6d88c63fcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.453125, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'timestamp': '2025-09-30T21:38:44.120005', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd6be1e30-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.789762287, 'message_signature': '789ebe56c4488b1bfef3a86c7bfb6e35037342c34097ff8125f38fe8affa7136'}]}, 'timestamp': '2025-09-30 21:38:44.120218', '_unique_id': 'bea0c130b04d445fa839f06d6ea37f03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.121 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2366ce9-15f8-4329-864e-5a1282ab8c5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.121460', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6be5706-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': 'fabef72e3e198109a25643cc0a930a824a0203e6390cf98f5ba0e1a0cccaeb7f'}]}, 'timestamp': '2025-09-30 21:38:44.121684', '_unique_id': '051388e7e95347ccaafdae03f97eb8e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.122 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.write.latency volume: 1753226051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de20b270-ed7b-4725-9658-8e12143ea966', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1753226051, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.122780', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6be8b86-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': 'c8ddf7fba2f6b32e467cfe7b18efebebcba851322c92366de5bcae7b5fc79147'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.122780', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6be9590-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '636391bd7ed216834e6f919396cdc84c9a577e59b826a4b92a013832f86b3b5c'}]}, 'timestamp': '2025-09-30 21:38:44.123325', '_unique_id': '48030a05dd0b4529bc802176c484cf68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.123 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.124 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28d467a7-8215-4761-8900-d35aa0c78d01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.124490', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6becd6c-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': '55564b5296356a33df25f776def957c162e5bac064e30b8986ea7e218a81bc8d'}]}, 'timestamp': '2025-09-30 21:38:44.124713', '_unique_id': '3bad589168b7461a81b3c7e3a3ffd502'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.125 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.write.bytes volume: 72876032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4365eefc-5d72-4442-9b1b-e3b19c943c3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72876032, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.125792', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6bf003e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '1f77074b0795f9361044b331c90ac846da878ac4587c78b09f2778e356cc18d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.125792', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6bf08e0-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '3851977861a8aa9f2d1aba34d60f2c1d587062c5d51f8bd196604562cfff257e'}]}, 'timestamp': '2025-09-30 21:38:44.126222', '_unique_id': '5ab3e71383e04a9c8c5a7d0bba814835'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.127 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.write.requests volume: 333 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.127 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4735de39-07da-43c6-8fa2-77e018717553', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 333, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.127544', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6bf44a4-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': 'b7beed06163dff05ae211a12cc382f97a6b1f00ae72170f2195491bb0258d39a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.127544', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6bf4d1e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '3984e006335ef89152a77d61a23f9aff2f7dd1ae9d35acf32d8f9574af7d5df8'}]}, 'timestamp': '2025-09-30 21:38:44.127969', '_unique_id': 'e678603e2c99429db9af67b101b0d666'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-₡-2040743504>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-2040743504>]
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '308e19b8-fb5b-4b76-9950-77df47fecc58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.129320', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bf8a04-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': 'b69a8b65356e80a7a756bee74fc05562b73c17f043ebc2e461e93767909b0f2c'}]}, 'timestamp': '2025-09-30 21:38:44.129543', '_unique_id': 'aea7c2efccd64aff84caaf914b40442f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.129 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.130 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec9d62ca-dc18-4d89-a1b8-e39ebb7fa3cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 'instance-00000076-628fd442-ed35-482c-91db-4a57f527b6a8-tap6be1fbd6-57', 'timestamp': '2025-09-30T21:38:44.130599', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'tap6be1fbd6-57', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c6:23:75', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6be1fbd6-57'}, 'message_id': 'd6bfbbdc-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.839117618, 'message_signature': 'd4c32042f6b2298bed1f9ca55666aaa6f45243a12843f702de7985946e2ee074'}]}, 'timestamp': '2025-09-30 21:38:44.130818', '_unique_id': '3a76cdfad0e44246a85ff56f717564bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.131 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.read.bytes volume: 29862400 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6b3a79c-9f1f-4c94-8f94-7bb9ccef8954', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29862400, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.131867', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6bfed78-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': '4f25acca191b2f7aeba2cb24e7e3017e11bbd8848f1c648c127c2c4fe7ac3c92'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.131867', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6bff4f8-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.815809184, 'message_signature': 'b16e830cdedb73bff6e2058abbe663cb78927b18b9ad7272336a78d7ce6ff50f'}]}, 'timestamp': '2025-09-30 21:38:44.132264', '_unique_id': '857c2d747d2445cfac1fc60fe16e7dc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.132 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.133 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.133 12 DEBUG ceilometer.compute.pollsters [-] 628fd442-ed35-482c-91db-4a57f527b6a8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ac0cebe-2e90-46dc-b940-e2faa2f69b68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': '628fd442-ed35-482c-91db-4a57f527b6a8-vda', 'timestamp': '2025-09-30T21:38:44.133400', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6c0296e-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.795833869, 'message_signature': 'fd43359826d88d744547bde6c6a97b465f32abd54893e2e14841bc1a0093a45e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_name': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_name': None, 'resource_id': 
'628fd442-ed35-482c-91db-4a57f527b6a8-sda', 'timestamp': '2025-09-30T21:38:44.133400', 'resource_metadata': {'display_name': 'tempest-₡-2040743504', 'name': 'instance-00000076', 'instance_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'instance_type': 'm1.nano', 'host': 'cf46f37b76dc24561ca97a30053a1668b8441c11b5e4b51586818a05', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6c03166-9e45-11f0-984a-fa163e8033fc', 'monotonic_time': 5019.795833869, 'message_signature': '17ef0f31f34a28afd88e1e84e68a82c5bd2ce609e4ff56b112caa7a7dcc99d62'}]}, 'timestamp': '2025-09-30 21:38:44.133817', '_unique_id': 'b361404542c24014903f92f3035842ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:38:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:38:44.134 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:38:46 compute-1 podman[240113]: 2025-09-30 21:38:46.218644411 +0000 UTC m=+0.065942787 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid)
Sep 30 21:38:47 compute-1 nova_compute[192795]: 2025-09-30 21:38:47.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:48 compute-1 nova_compute[192795]: 2025-09-30 21:38:48.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.595 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.596 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.622 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.733 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.752 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.753 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.764 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.765 2 INFO nova.compute.claims [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.936 2 DEBUG nova.compute.provider_tree [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.951 2 DEBUG nova.scheduler.client.report [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.970 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:49 compute-1 nova_compute[192795]: 2025-09-30 21:38:49.971 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.032 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.032 2 DEBUG nova.network.neutron [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.059 2 INFO nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.093 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.244 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.246 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.246 2 INFO nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Creating image(s)
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.247 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "/var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.247 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "/var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.247 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "/var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.260 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.342 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.343 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.344 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.355 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.403 2 DEBUG nova.policy [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '972edc166c9442b1a83983d15a64e8b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '51c02ace4fff44cca028986381d7c407', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.415 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.415 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.457 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.458 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.459 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.517 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.518 2 DEBUG nova.virt.disk.api [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Checking if we can resize image /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.519 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.579 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.580 2 DEBUG nova.virt.disk.api [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Cannot resize image /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.580 2 DEBUG nova.objects.instance [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'migration_context' on Instance uuid 2c2f6f5b-4955-4915-b620-f377ca649c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.593 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.594 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Ensure instance console log exists: /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.595 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.596 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:50 compute-1 nova_compute[192795]: 2025-09-30 21:38:50.596 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.227 2 DEBUG nova.network.neutron [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Successfully created port: a210664d-c1de-43b3-8844-2a8aedba5ac1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.719 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.720 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.720 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.720 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.800 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.864 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.865 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:52 compute-1 nova_compute[192795]: 2025-09-30 21:38:52.925 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.065 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.067 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5545MB free_disk=73.28805541992188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.067 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.067 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.148 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 628fd442-ed35-482c-91db-4a57f527b6a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.149 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 2c2f6f5b-4955-4915-b620-f377ca649c75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.149 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.149 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.241 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.287 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.357 2 DEBUG nova.network.neutron [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Successfully updated port: a210664d-c1de-43b3-8844-2a8aedba5ac1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.399 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.400 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.401 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.401 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquired lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.402 2 DEBUG nova.network.neutron [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.497 2 DEBUG nova.compute.manager [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received event network-changed-a210664d-c1de-43b3-8844-2a8aedba5ac1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.498 2 DEBUG nova.compute.manager [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Refreshing instance network info cache due to event network-changed-a210664d-c1de-43b3-8844-2a8aedba5ac1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:38:53 compute-1 nova_compute[192795]: 2025-09-30 21:38:53.498 2 DEBUG oslo_concurrency.lockutils [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:54 compute-1 nova_compute[192795]: 2025-09-30 21:38:54.077 2 DEBUG nova.network.neutron [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:38:54 compute-1 podman[240156]: 2025-09-30 21:38:54.238491041 +0000 UTC m=+0.069254515 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:38:54 compute-1 podman[240154]: 2025-09-30 21:38:54.257979073 +0000 UTC m=+0.092232110 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:38:54 compute-1 podman[240155]: 2025-09-30 21:38:54.273263053 +0000 UTC m=+0.113818789 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ovn_controller)
Sep 30 21:38:54 compute-1 nova_compute[192795]: 2025-09-30 21:38:54.401 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:54 compute-1 nova_compute[192795]: 2025-09-30 21:38:54.402 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.401 2 DEBUG nova.network.neutron [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Updating instance_info_cache with network_info: [{"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.419 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Releasing lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.420 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance network_info: |[{"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.420 2 DEBUG oslo_concurrency.lockutils [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.421 2 DEBUG nova.network.neutron [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Refreshing network info cache for port a210664d-c1de-43b3-8844-2a8aedba5ac1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.424 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Start _get_guest_xml network_info=[{"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.429 2 WARNING nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.438 2 DEBUG nova.virt.libvirt.host [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.439 2 DEBUG nova.virt.libvirt.host [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.447 2 DEBUG nova.virt.libvirt.host [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.448 2 DEBUG nova.virt.libvirt.host [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.450 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.450 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.450 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.451 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.451 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.451 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.451 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.452 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.452 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.452 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.452 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.453 2 DEBUG nova.virt.hardware [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.457 2 DEBUG nova.virt.libvirt.vif [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1599075230',display_name='tempest-ServersNegativeTestJSON-server-1599075230',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1599075230',id=123,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='51c02ace4fff44cca028986381d7c407',ramdisk_id='',reservation_id='r-qtczxsxc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-2038911265',owner_user_name='tempest-ServersNegativeTestJSON-2038911265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:50Z,user_data=None,user_id='972edc166c9442b1a83983d15a64e8b6',uuid=2c2f6f5b-4955-4915-b620-f377ca649c75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.457 2 DEBUG nova.network.os_vif_util [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converting VIF {"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.458 2 DEBUG nova.network.os_vif_util [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:0e:d5,bridge_name='br-int',has_traffic_filtering=True,id=a210664d-c1de-43b3-8844-2a8aedba5ac1,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa210664d-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.459 2 DEBUG nova.objects.instance [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2c2f6f5b-4955-4915-b620-f377ca649c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.492 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <uuid>2c2f6f5b-4955-4915-b620-f377ca649c75</uuid>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <name>instance-0000007b</name>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersNegativeTestJSON-server-1599075230</nova:name>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:38:55</nova:creationTime>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:user uuid="972edc166c9442b1a83983d15a64e8b6">tempest-ServersNegativeTestJSON-2038911265-project-member</nova:user>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:project uuid="51c02ace4fff44cca028986381d7c407">tempest-ServersNegativeTestJSON-2038911265</nova:project>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         <nova:port uuid="a210664d-c1de-43b3-8844-2a8aedba5ac1">
Sep 30 21:38:55 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <system>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <entry name="serial">2c2f6f5b-4955-4915-b620-f377ca649c75</entry>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <entry name="uuid">2c2f6f5b-4955-4915-b620-f377ca649c75</entry>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </system>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <os>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   </os>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <features>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   </features>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk.config"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:2b:0e:d5"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <target dev="tapa210664d-c1"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/console.log" append="off"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <video>
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </video>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:38:55 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:38:55 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:38:55 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:38:55 compute-1 nova_compute[192795]: </domain>
Sep 30 21:38:55 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.494 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Preparing to wait for external event network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.494 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.494 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.495 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.496 2 DEBUG nova.virt.libvirt.vif [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1599075230',display_name='tempest-ServersNegativeTestJSON-server-1599075230',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1599075230',id=123,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='51c02ace4fff44cca028986381d7c407',ramdisk_id='',reservation_id='r-qtczxsxc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-2038911265',owner_user_name='tempest-ServersN
egativeTestJSON-2038911265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:50Z,user_data=None,user_id='972edc166c9442b1a83983d15a64e8b6',uuid=2c2f6f5b-4955-4915-b620-f377ca649c75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.496 2 DEBUG nova.network.os_vif_util [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converting VIF {"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.497 2 DEBUG nova.network.os_vif_util [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:0e:d5,bridge_name='br-int',has_traffic_filtering=True,id=a210664d-c1de-43b3-8844-2a8aedba5ac1,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa210664d-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.497 2 DEBUG os_vif [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:0e:d5,bridge_name='br-int',has_traffic_filtering=True,id=a210664d-c1de-43b3-8844-2a8aedba5ac1,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa210664d-c1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.498 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.499 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.503 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa210664d-c1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.503 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa210664d-c1, col_values=(('external_ids', {'iface-id': 'a210664d-c1de-43b3-8844-2a8aedba5ac1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:0e:d5', 'vm-uuid': '2c2f6f5b-4955-4915-b620-f377ca649c75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:38:55 compute-1 NetworkManager[51724]: <info>  [1759268335.5070] manager: (tapa210664d-c1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.517 2 INFO os_vif [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:0e:d5,bridge_name='br-int',has_traffic_filtering=True,id=a210664d-c1de-43b3-8844-2a8aedba5ac1,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa210664d-c1')
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.575 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.576 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.576 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] No VIF found with MAC fa:16:3e:2b:0e:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.577 2 INFO nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Using config drive
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:55 compute-1 nova_compute[192795]: 2025-09-30 21:38:55.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.250 2 INFO nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Creating config drive at /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk.config
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.255 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gl3bp1c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.389 2 DEBUG oslo_concurrency.processutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gl3bp1c" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:56 compute-1 kernel: tapa210664d-c1: entered promiscuous mode
Sep 30 21:38:56 compute-1 NetworkManager[51724]: <info>  [1759268336.4533] manager: (tapa210664d-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 ovn_controller[94902]: 2025-09-30T21:38:56Z|00476|binding|INFO|Claiming lport a210664d-c1de-43b3-8844-2a8aedba5ac1 for this chassis.
Sep 30 21:38:56 compute-1 ovn_controller[94902]: 2025-09-30T21:38:56Z|00477|binding|INFO|a210664d-c1de-43b3-8844-2a8aedba5ac1: Claiming fa:16:3e:2b:0e:d5 10.100.0.4
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.468 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:0e:d5 10.100.0.4'], port_security=['fa:16:3e:2b:0e:d5 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2c2f6f5b-4955-4915-b620-f377ca649c75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51c02ace4fff44cca028986381d7c407', 'neutron:revision_number': '2', 'neutron:security_group_ids': '62973e02-a61c-4061-8a33-46cf1b8a21f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6f1e456-6208-4d0c-8c96-5e3a3d932af6, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=a210664d-c1de-43b3-8844-2a8aedba5ac1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.469 103861 INFO neutron.agent.ovn.metadata.agent [-] Port a210664d-c1de-43b3-8844-2a8aedba5ac1 in datapath ea80450e-b8f2-4af5-a00d-9221e5dd4d97 bound to our chassis
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.471 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ea80450e-b8f2-4af5-a00d-9221e5dd4d97
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.485 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa1165b-7dad-4f4a-9bad-d493ba39ad47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.486 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapea80450e-b1 in ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.488 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapea80450e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.488 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd89920-3ecb-4d85-908e-ccc39ecde6bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.489 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[544d8e0c-eef8-4843-803a-78143c9e248b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.500 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[fb381e8b-39eb-4a87-b5b0-6f6e3fbd914b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 systemd-udevd[240240]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:56 compute-1 systemd-machined[152783]: New machine qemu-58-instance-0000007b.
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 ovn_controller[94902]: 2025-09-30T21:38:56Z|00478|binding|INFO|Setting lport a210664d-c1de-43b3-8844-2a8aedba5ac1 ovn-installed in OVS
Sep 30 21:38:56 compute-1 ovn_controller[94902]: 2025-09-30T21:38:56Z|00479|binding|INFO|Setting lport a210664d-c1de-43b3-8844-2a8aedba5ac1 up in Southbound
Sep 30 21:38:56 compute-1 systemd[1]: Started Virtual Machine qemu-58-instance-0000007b.
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 NetworkManager[51724]: <info>  [1759268336.5217] device (tapa210664d-c1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:38:56 compute-1 NetworkManager[51724]: <info>  [1759268336.5235] device (tapa210664d-c1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.533 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[06f16767-dbb9-49af-a927-83f9c66a896f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.568 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[eab05ce7-5099-4b82-912d-297c4f8bbc4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 NetworkManager[51724]: <info>  [1759268336.5791] manager: (tapea80450e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/240)
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.578 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ebdef880-10a2-4244-b97f-ba7e657fac5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.615 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[eccca86e-20dc-45d3-9f88-224430d45f26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.619 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[dccc2505-fa3a-4126-b43c-78dac7cb79d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 NetworkManager[51724]: <info>  [1759268336.6490] device (tapea80450e-b0): carrier: link connected
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.655 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3e21b4c9-24b4-4521-bc74-239304176d22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.678 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ad57934e-9fad-4352-9cf9-256fefc8ce84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea80450e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:34:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503232, 'reachable_time': 31347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240271, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.700 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[03fb8bab-cf45-423d-a76a-438c7e1c7806]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:340d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503232, 'tstamp': 503232}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240272, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.728 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a340ae-472a-466e-a9a7-691e57d19d4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea80450e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:34:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503232, 'reachable_time': 31347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240273, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.764 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b7cb30-e130-476d-b730-066f58029488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.850 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[354ca241-0aae-4a0e-af13-b0850d52e2db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.852 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea80450e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.852 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.853 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea80450e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 kernel: tapea80450e-b0: entered promiscuous mode
Sep 30 21:38:56 compute-1 NetworkManager[51724]: <info>  [1759268336.8561] manager: (tapea80450e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.858 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapea80450e-b0, col_values=(('external_ids', {'iface-id': '392087bd-c954-490d-9360-3e29d00c1de8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:56 compute-1 ovn_controller[94902]: 2025-09-30T21:38:56Z|00480|binding|INFO|Releasing lport 392087bd-c954-490d-9360-3e29d00c1de8 from this chassis (sb_readonly=0)
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 nova_compute[192795]: 2025-09-30 21:38:56.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.873 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.874 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[51b985ff-cbd2-498c-8eba-6a121f37b3e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.875 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-ea80450e-b8f2-4af5-a00d-9221e5dd4d97
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.pid.haproxy
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID ea80450e-b8f2-4af5-a00d-9221e5dd4d97
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:38:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:38:56.876 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'env', 'PROCESS_TAG=haproxy-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:38:57 compute-1 podman[240312]: 2025-09-30 21:38:57.280390726 +0000 UTC m=+0.069368058 container create cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:38:57 compute-1 podman[240312]: 2025-09-30 21:38:57.24091795 +0000 UTC m=+0.029895312 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:38:57 compute-1 systemd[1]: Started libpod-conmon-cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11.scope.
Sep 30 21:38:57 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:38:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/383daa8f2dee1d7869e27472a7694a20027adff6d6d0aac2fa030b90d3e10e2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:38:57 compute-1 podman[240312]: 2025-09-30 21:38:57.408385103 +0000 UTC m=+0.197362465 container init cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:38:57 compute-1 podman[240312]: 2025-09-30 21:38:57.415094583 +0000 UTC m=+0.204071905 container start cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:38:57 compute-1 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[240327]: [NOTICE]   (240331) : New worker (240333) forked
Sep 30 21:38:57 compute-1 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[240327]: [NOTICE]   (240331) : Loading success.
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.499 2 DEBUG nova.network.neutron [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Updated VIF entry in instance network info cache for port a210664d-c1de-43b3-8844-2a8aedba5ac1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.501 2 DEBUG nova.network.neutron [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Updating instance_info_cache with network_info: [{"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.522 2 DEBUG oslo_concurrency.lockutils [req-75f7b242-50b7-4286-9d2e-c5dbebe32980 req-750f966d-69e7-486c-ae0e-e462c67c8474 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.603 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268337.6028836, 2c2f6f5b-4955-4915-b620-f377ca649c75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.604 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] VM Started (Lifecycle Event)
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.639 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.644 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268337.603246, 2c2f6f5b-4955-4915-b620-f377ca649c75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.644 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] VM Paused (Lifecycle Event)
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.665 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.669 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:57 compute-1 nova_compute[192795]: 2025-09-30 21:38:57.694 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.394 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received event network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.395 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.396 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.396 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.396 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Processing event network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.397 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received event network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.397 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.398 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.398 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.398 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] No waiting events found dispatching network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.399 2 WARNING nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received unexpected event network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 for instance with vm_state building and task_state spawning.
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.400 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.405 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268338.4049776, 2c2f6f5b-4955-4915-b620-f377ca649c75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.405 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] VM Resumed (Lifecycle Event)
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.409 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.414 2 INFO nova.virt.libvirt.driver [-] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance spawned successfully.
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.415 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.439 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.447 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.453 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.454 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.454 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.455 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.456 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.456 2 DEBUG nova.virt.libvirt.driver [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.470 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.536 2 INFO nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Took 8.29 seconds to spawn the instance on the hypervisor.
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.537 2 DEBUG nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.654 2 INFO nova.compute.manager [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Took 8.97 seconds to build instance.
Sep 30 21:38:58 compute-1 nova_compute[192795]: 2025-09-30 21:38:58.672 2 DEBUG oslo_concurrency.lockutils [None req-859ff25d-865d-4ba7-810f-12db4c1de8ce 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:00 compute-1 nova_compute[192795]: 2025-09-30 21:39:00.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:00 compute-1 nova_compute[192795]: 2025-09-30 21:39:00.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:01 compute-1 podman[240343]: 2025-09-30 21:39:01.240931928 +0000 UTC m=+0.073252512 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:39:01 compute-1 nova_compute[192795]: 2025-09-30 21:39:01.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:02 compute-1 nova_compute[192795]: 2025-09-30 21:39:02.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:03 compute-1 nova_compute[192795]: 2025-09-30 21:39:03.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:04 compute-1 nova_compute[192795]: 2025-09-30 21:39:04.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:04 compute-1 nova_compute[192795]: 2025-09-30 21:39:04.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:39:04 compute-1 nova_compute[192795]: 2025-09-30 21:39:04.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:39:04 compute-1 nova_compute[192795]: 2025-09-30 21:39:04.874 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:04 compute-1 nova_compute[192795]: 2025-09-30 21:39:04.875 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:04 compute-1 nova_compute[192795]: 2025-09-30 21:39:04.875 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:39:04 compute-1 nova_compute[192795]: 2025-09-30 21:39:04.875 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 628fd442-ed35-482c-91db-4a57f527b6a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:05 compute-1 nova_compute[192795]: 2025-09-30 21:39:05.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-1 nova_compute[192795]: 2025-09-30 21:39:06.121 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Updating instance_info_cache with network_info: [{"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:06 compute-1 nova_compute[192795]: 2025-09-30 21:39:06.148 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-628fd442-ed35-482c-91db-4a57f527b6a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:06 compute-1 nova_compute[192795]: 2025-09-30 21:39:06.149 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:39:06 compute-1 podman[240366]: 2025-09-30 21:39:06.221112545 +0000 UTC m=+0.051354586 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 21:39:06 compute-1 podman[240365]: 2025-09-30 21:39:06.249195167 +0000 UTC m=+0.083769824 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:39:06 compute-1 podman[240364]: 2025-09-30 21:39:06.254017936 +0000 UTC m=+0.093305490 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, 
url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Sep 30 21:39:08 compute-1 nova_compute[192795]: 2025-09-30 21:39:08.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-1 nova_compute[192795]: 2025-09-30 21:39:10.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:11 compute-1 ovn_controller[94902]: 2025-09-30T21:39:11Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:0e:d5 10.100.0.4
Sep 30 21:39:11 compute-1 ovn_controller[94902]: 2025-09-30T21:39:11Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:0e:d5 10.100.0.4
Sep 30 21:39:13 compute-1 nova_compute[192795]: 2025-09-30 21:39:13.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:15 compute-1 nova_compute[192795]: 2025-09-30 21:39:15.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:17 compute-1 podman[240438]: 2025-09-30 21:39:17.229632744 +0000 UTC m=+0.068231988 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:39:18 compute-1 nova_compute[192795]: 2025-09-30 21:39:18.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:20 compute-1 nova_compute[192795]: 2025-09-30 21:39:20.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:21 compute-1 nova_compute[192795]: 2025-09-30 21:39:21.398 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:21 compute-1 nova_compute[192795]: 2025-09-30 21:39:21.398 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:22 compute-1 nova_compute[192795]: 2025-09-30 21:39:22.018 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:39:23 compute-1 nova_compute[192795]: 2025-09-30 21:39:23.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:23 compute-1 nova_compute[192795]: 2025-09-30 21:39:23.316 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:23 compute-1 nova_compute[192795]: 2025-09-30 21:39:23.317 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:23 compute-1 nova_compute[192795]: 2025-09-30 21:39:23.325 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:39:23 compute-1 nova_compute[192795]: 2025-09-30 21:39:23.326 2 INFO nova.compute.claims [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:39:23 compute-1 nova_compute[192795]: 2025-09-30 21:39:23.795 2 DEBUG nova.compute.provider_tree [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:23 compute-1 nova_compute[192795]: 2025-09-30 21:39:23.871 2 DEBUG nova.scheduler.client.report [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:24 compute-1 nova_compute[192795]: 2025-09-30 21:39:24.002 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:24 compute-1 nova_compute[192795]: 2025-09-30 21:39:24.003 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:39:24 compute-1 nova_compute[192795]: 2025-09-30 21:39:24.238 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:39:24 compute-1 nova_compute[192795]: 2025-09-30 21:39:24.239 2 DEBUG nova.network.neutron [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:39:24 compute-1 nova_compute[192795]: 2025-09-30 21:39:24.323 2 INFO nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:39:24 compute-1 nova_compute[192795]: 2025-09-30 21:39:24.419 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:39:24 compute-1 nova_compute[192795]: 2025-09-30 21:39:24.482 2 DEBUG nova.policy [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.160 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.162 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.162 2 INFO nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Creating image(s)
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.163 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "/var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.163 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.164 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.181 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:25 compute-1 podman[240459]: 2025-09-30 21:39:25.238317904 +0000 UTC m=+0.067946661 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.251 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.252 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.253 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:25 compute-1 podman[240458]: 2025-09-30 21:39:25.255684379 +0000 UTC m=+0.091132412 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_id=ovn_controller)
Sep 30 21:39:25 compute-1 podman[240457]: 2025-09-30 21:39:25.255834732 +0000 UTC m=+0.096998208 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.266 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.327 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.328 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.354 2 DEBUG nova.network.neutron [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Successfully created port: de8ff6be-8523-4d3e-b188-b4ca08b411a2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.365 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.366 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.367 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.441 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.443 2 DEBUG nova.virt.disk.api [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Checking if we can resize image /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.444 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.502 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.503 2 DEBUG nova.virt.disk.api [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Cannot resize image /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.503 2 DEBUG nova.objects.instance [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'migration_context' on Instance uuid c5e53a18-5bb4-49dc-9ce7-6619957edd4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.574 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.574 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Ensure instance console log exists: /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.575 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.575 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:25 compute-1 nova_compute[192795]: 2025-09-30 21:39:25.575 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:26 compute-1 nova_compute[192795]: 2025-09-30 21:39:26.382 2 DEBUG nova.network.neutron [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Successfully updated port: de8ff6be-8523-4d3e-b188-b4ca08b411a2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:39:26 compute-1 nova_compute[192795]: 2025-09-30 21:39:26.399 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "refresh_cache-c5e53a18-5bb4-49dc-9ce7-6619957edd4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:26 compute-1 nova_compute[192795]: 2025-09-30 21:39:26.400 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquired lock "refresh_cache-c5e53a18-5bb4-49dc-9ce7-6619957edd4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:26 compute-1 nova_compute[192795]: 2025-09-30 21:39:26.400 2 DEBUG nova.network.neutron [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:39:26 compute-1 nova_compute[192795]: 2025-09-30 21:39:26.569 2 DEBUG nova.network.neutron [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.532 2 DEBUG nova.network.neutron [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Updating instance_info_cache with network_info: [{"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.552 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Releasing lock "refresh_cache-c5e53a18-5bb4-49dc-9ce7-6619957edd4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.552 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Instance network_info: |[{"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.555 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Start _get_guest_xml network_info=[{"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.563 2 WARNING nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.568 2 DEBUG nova.virt.libvirt.host [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.569 2 DEBUG nova.virt.libvirt.host [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.574 2 DEBUG nova.virt.libvirt.host [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.575 2 DEBUG nova.virt.libvirt.host [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.577 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.577 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.578 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.578 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.578 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.578 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.578 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.579 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.579 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.579 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.579 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.580 2 DEBUG nova.virt.hardware [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.583 2 DEBUG nova.virt.libvirt.vif [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-262784555',display_name='tempest-ServersTestJSON-server-262784555',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-262784555',id=128,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-gz1y6pve',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:24Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=c5e53a18-5bb4-49dc-9ce7-6619957edd4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.584 2 DEBUG nova.network.os_vif_util [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.585 2 DEBUG nova.network.os_vif_util [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:25:84,bridge_name='br-int',has_traffic_filtering=True,id=de8ff6be-8523-4d3e-b188-b4ca08b411a2,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde8ff6be-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.586 2 DEBUG nova.objects.instance [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c5e53a18-5bb4-49dc-9ce7-6619957edd4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.601 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <uuid>c5e53a18-5bb4-49dc-9ce7-6619957edd4e</uuid>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <name>instance-00000080</name>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersTestJSON-server-262784555</nova:name>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:39:27</nova:creationTime>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:user uuid="30d0a975d78c4d9a8e2201afdc040092">tempest-ServersTestJSON-782690373-project-member</nova:user>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:project uuid="8ad754242d964bb487a2174b2c21bcc5">tempest-ServersTestJSON-782690373</nova:project>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         <nova:port uuid="de8ff6be-8523-4d3e-b188-b4ca08b411a2">
Sep 30 21:39:27 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <system>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <entry name="serial">c5e53a18-5bb4-49dc-9ce7-6619957edd4e</entry>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <entry name="uuid">c5e53a18-5bb4-49dc-9ce7-6619957edd4e</entry>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </system>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <os>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   </os>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <features>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   </features>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk.config"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:4a:25:84"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <target dev="tapde8ff6be-85"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/console.log" append="off"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <video>
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </video>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:39:27 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:39:27 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:39:27 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:39:27 compute-1 nova_compute[192795]: </domain>
Sep 30 21:39:27 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.603 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Preparing to wait for external event network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.603 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.604 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.604 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.605 2 DEBUG nova.virt.libvirt.vif [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-262784555',display_name='tempest-ServersTestJSON-server-262784555',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-262784555',id=128,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-gz1y6pve',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:24Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=c5e53a18-5bb4-49dc-9ce7-6619957edd4e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.605 2 DEBUG nova.network.os_vif_util [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.606 2 DEBUG nova.network.os_vif_util [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:25:84,bridge_name='br-int',has_traffic_filtering=True,id=de8ff6be-8523-4d3e-b188-b4ca08b411a2,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde8ff6be-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.606 2 DEBUG os_vif [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:25:84,bridge_name='br-int',has_traffic_filtering=True,id=de8ff6be-8523-4d3e-b188-b4ca08b411a2,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde8ff6be-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.607 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.608 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.612 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde8ff6be-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.612 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapde8ff6be-85, col_values=(('external_ids', {'iface-id': 'de8ff6be-8523-4d3e-b188-b4ca08b411a2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:25:84', 'vm-uuid': 'c5e53a18-5bb4-49dc-9ce7-6619957edd4e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:27 compute-1 NetworkManager[51724]: <info>  [1759268367.6162] manager: (tapde8ff6be-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.623 2 DEBUG nova.compute.manager [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received event network-changed-de8ff6be-8523-4d3e-b188-b4ca08b411a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.624 2 DEBUG nova.compute.manager [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Refreshing instance network info cache due to event network-changed-de8ff6be-8523-4d3e-b188-b4ca08b411a2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.625 2 DEBUG oslo_concurrency.lockutils [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c5e53a18-5bb4-49dc-9ce7-6619957edd4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.625 2 DEBUG oslo_concurrency.lockutils [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c5e53a18-5bb4-49dc-9ce7-6619957edd4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.625 2 DEBUG nova.network.neutron [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Refreshing network info cache for port de8ff6be-8523-4d3e-b188-b4ca08b411a2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.628 2 INFO os_vif [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:25:84,bridge_name='br-int',has_traffic_filtering=True,id=de8ff6be-8523-4d3e-b188-b4ca08b411a2,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde8ff6be-85')
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.692 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.693 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.693 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No VIF found with MAC fa:16:3e:4a:25:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:39:27 compute-1 nova_compute[192795]: 2025-09-30 21:39:27.694 2 INFO nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Using config drive
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.055 2 INFO nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Creating config drive at /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk.config
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.060 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnmr1oeho execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.190 2 DEBUG oslo_concurrency.processutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnmr1oeho" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:28 compute-1 kernel: tapde8ff6be-85: entered promiscuous mode
Sep 30 21:39:28 compute-1 NetworkManager[51724]: <info>  [1759268368.2622] manager: (tapde8ff6be-85): new Tun device (/org/freedesktop/NetworkManager/Devices/243)
Sep 30 21:39:28 compute-1 ovn_controller[94902]: 2025-09-30T21:39:28Z|00481|binding|INFO|Claiming lport de8ff6be-8523-4d3e-b188-b4ca08b411a2 for this chassis.
Sep 30 21:39:28 compute-1 ovn_controller[94902]: 2025-09-30T21:39:28Z|00482|binding|INFO|de8ff6be-8523-4d3e-b188-b4ca08b411a2: Claiming fa:16:3e:4a:25:84 10.100.0.4
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.271 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:25:84 10.100.0.4'], port_security=['fa:16:3e:4a:25:84 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c5e53a18-5bb4-49dc-9ce7-6619957edd4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=de8ff6be-8523-4d3e-b188-b4ca08b411a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.273 103861 INFO neutron.agent.ovn.metadata.agent [-] Port de8ff6be-8523-4d3e-b188-b4ca08b411a2 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c bound to our chassis
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.274 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:28 compute-1 ovn_controller[94902]: 2025-09-30T21:39:28Z|00483|binding|INFO|Setting lport de8ff6be-8523-4d3e-b188-b4ca08b411a2 ovn-installed in OVS
Sep 30 21:39:28 compute-1 ovn_controller[94902]: 2025-09-30T21:39:28Z|00484|binding|INFO|Setting lport de8ff6be-8523-4d3e-b188-b4ca08b411a2 up in Southbound
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-1 systemd-udevd[240570]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.308 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[437751e3-9863-4fd7-8b4f-86dba60d0530]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-1 NetworkManager[51724]: <info>  [1759268368.3219] device (tapde8ff6be-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:39:28 compute-1 NetworkManager[51724]: <info>  [1759268368.3230] device (tapde8ff6be-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:39:28 compute-1 systemd-machined[152783]: New machine qemu-59-instance-00000080.
Sep 30 21:39:28 compute-1 systemd[1]: Started Virtual Machine qemu-59-instance-00000080.
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.350 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b5afb9ab-0911-4b78-86cd-a46bf3668bb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.356 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2823443a-eb62-4cc7-8841-7da8e60ee40d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.390 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0ca265-84e7-42f0-b19b-45617ab895c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.416 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d60889fa-309b-47d4-9110-a525893cc69a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498413, 'reachable_time': 26194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240584, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.436 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b83c3549-5d8c-4a7c-b6a5-3f16fc732397]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498426, 'tstamp': 498426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240585, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498430, 'tstamp': 498430}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240585, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.438 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.443 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.444 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.444 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:28.444 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.839 2 DEBUG nova.network.neutron [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Updated VIF entry in instance network info cache for port de8ff6be-8523-4d3e-b188-b4ca08b411a2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.840 2 DEBUG nova.network.neutron [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Updating instance_info_cache with network_info: [{"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:28 compute-1 nova_compute[192795]: 2025-09-30 21:39:28.854 2 DEBUG oslo_concurrency.lockutils [req-390d2b54-5544-45a5-950a-fce6e72cce5b req-9ef38df3-c6d4-46cb-9747-9c87cd95f863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c5e53a18-5bb4-49dc-9ce7-6619957edd4e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.114 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268369.1132367, c5e53a18-5bb4-49dc-9ce7-6619957edd4e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.114 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] VM Started (Lifecycle Event)
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.143 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.148 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268369.1135137, c5e53a18-5bb4-49dc-9ce7-6619957edd4e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.148 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] VM Paused (Lifecycle Event)
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.174 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.178 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.202 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.814 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received event network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.815 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.815 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.815 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.816 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Processing event network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.816 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received event network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.816 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.816 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.817 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.817 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] No waiting events found dispatching network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.817 2 WARNING nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received unexpected event network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 for instance with vm_state building and task_state spawning.
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.818 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.822 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268369.8220131, c5e53a18-5bb4-49dc-9ce7-6619957edd4e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.822 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] VM Resumed (Lifecycle Event)
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.824 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.828 2 INFO nova.virt.libvirt.driver [-] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Instance spawned successfully.
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.828 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.843 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.849 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.852 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.853 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.853 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.854 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.854 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.854 2 DEBUG nova.virt.libvirt.driver [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.899 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.955 2 INFO nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Took 4.79 seconds to spawn the instance on the hypervisor.
Sep 30 21:39:29 compute-1 nova_compute[192795]: 2025-09-30 21:39:29.956 2 DEBUG nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:30 compute-1 nova_compute[192795]: 2025-09-30 21:39:30.049 2 INFO nova.compute.manager [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Took 6.91 seconds to build instance.
Sep 30 21:39:30 compute-1 nova_compute[192795]: 2025-09-30 21:39:30.067 2 DEBUG oslo_concurrency.lockutils [None req-913b84fe-65a4-4910-ac08-964aa0e0d775 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:30 compute-1 nova_compute[192795]: 2025-09-30 21:39:30.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:30.774 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:30.776 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.705 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.706 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.707 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.707 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.707 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.719 2 INFO nova.compute.manager [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Terminating instance
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.731 2 DEBUG nova.compute.manager [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:39:31 compute-1 kernel: tapde8ff6be-85 (unregistering): left promiscuous mode
Sep 30 21:39:31 compute-1 NetworkManager[51724]: <info>  [1759268371.7547] device (tapde8ff6be-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:39:31 compute-1 ovn_controller[94902]: 2025-09-30T21:39:31Z|00485|binding|INFO|Releasing lport de8ff6be-8523-4d3e-b188-b4ca08b411a2 from this chassis (sb_readonly=0)
Sep 30 21:39:31 compute-1 ovn_controller[94902]: 2025-09-30T21:39:31Z|00486|binding|INFO|Setting lport de8ff6be-8523-4d3e-b188-b4ca08b411a2 down in Southbound
Sep 30 21:39:31 compute-1 ovn_controller[94902]: 2025-09-30T21:39:31Z|00487|binding|INFO|Removing iface tapde8ff6be-85 ovn-installed in OVS
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.780 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:25:84 10.100.0.4'], port_security=['fa:16:3e:4a:25:84 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c5e53a18-5bb4-49dc-9ce7-6619957edd4e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=de8ff6be-8523-4d3e-b188-b4ca08b411a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.782 103861 INFO neutron.agent.ovn.metadata.agent [-] Port de8ff6be-8523-4d3e-b188-b4ca08b411a2 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.784 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.808 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[abc345d2-89cb-4ee8-bc84-33a0ede4b416]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:31 compute-1 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000080.scope: Deactivated successfully.
Sep 30 21:39:31 compute-1 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000080.scope: Consumed 2.658s CPU time.
Sep 30 21:39:31 compute-1 systemd-machined[152783]: Machine qemu-59-instance-00000080 terminated.
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.853 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[93e3ea88-c4cc-48d4-b879-fffa600da048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.857 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb6ab88-afb1-4b73-b774-1339cf500538]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:31 compute-1 podman[240593]: 2025-09-30 21:39:31.878652995 +0000 UTC m=+0.087477293 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.887 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd66bf9-85a9-4a0f-a045-41985834402c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.907 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[44f3d3de-362e-432c-8aef-249e1faca841]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498413, 'reachable_time': 26194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240626, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.924 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[160b5d39-eec4-45e1-b3c9-e516669d6c54]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498426, 'tstamp': 498426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240627, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498430, 'tstamp': 498430}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240627, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.926 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.934 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.936 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.937 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:31.937 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.996 2 INFO nova.virt.libvirt.driver [-] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Instance destroyed successfully.
Sep 30 21:39:31 compute-1 nova_compute[192795]: 2025-09-30 21:39:31.997 2 DEBUG nova.objects.instance [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'resources' on Instance uuid c5e53a18-5bb4-49dc-9ce7-6619957edd4e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.020 2 DEBUG nova.virt.libvirt.vif [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-262784555',display_name='tempest-ServersTestJSON-server-262784555',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-262784555',id=128,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:39:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-gz1y6pve',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:39:30Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=c5e53a18-5bb4-49dc-9ce7-6619957edd4e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.021 2 DEBUG nova.network.os_vif_util [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "address": "fa:16:3e:4a:25:84", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde8ff6be-85", "ovs_interfaceid": "de8ff6be-8523-4d3e-b188-b4ca08b411a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.021 2 DEBUG nova.network.os_vif_util [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:25:84,bridge_name='br-int',has_traffic_filtering=True,id=de8ff6be-8523-4d3e-b188-b4ca08b411a2,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde8ff6be-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.022 2 DEBUG os_vif [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:25:84,bridge_name='br-int',has_traffic_filtering=True,id=de8ff6be-8523-4d3e-b188-b4ca08b411a2,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde8ff6be-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.024 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde8ff6be-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.030 2 INFO os_vif [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:25:84,bridge_name='br-int',has_traffic_filtering=True,id=de8ff6be-8523-4d3e-b188-b4ca08b411a2,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde8ff6be-85')
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.030 2 INFO nova.virt.libvirt.driver [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Deleting instance files /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e_del
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.031 2 INFO nova.virt.libvirt.driver [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Deletion of /var/lib/nova/instances/c5e53a18-5bb4-49dc-9ce7-6619957edd4e_del complete
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.116 2 INFO nova.compute.manager [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.116 2 DEBUG oslo.service.loopingcall [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.117 2 DEBUG nova.compute.manager [-] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.117 2 DEBUG nova.network.neutron [-] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:39:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:32.779 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.920 2 DEBUG nova.network.neutron [-] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:32 compute-1 nova_compute[192795]: 2025-09-30 21:39:32.940 2 INFO nova.compute.manager [-] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Took 0.82 seconds to deallocate network for instance.
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.030 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.031 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.144 2 DEBUG nova.compute.provider_tree [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.159 2 DEBUG nova.scheduler.client.report [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.179 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.213 2 INFO nova.scheduler.client.report [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Deleted allocations for instance c5e53a18-5bb4-49dc-9ce7-6619957edd4e
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:33 compute-1 nova_compute[192795]: 2025-09-30 21:39:33.337 2 DEBUG oslo_concurrency.lockutils [None req-47e45b43-cfca-4589-b3b7-39de17f426ea 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.308 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received event network-vif-unplugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.309 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.309 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.309 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.310 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] No waiting events found dispatching network-vif-unplugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.310 2 WARNING nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received unexpected event network-vif-unplugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 for instance with vm_state deleted and task_state None.
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.310 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received event network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.311 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.311 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.311 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5e53a18-5bb4-49dc-9ce7-6619957edd4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.312 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] No waiting events found dispatching network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.312 2 WARNING nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received unexpected event network-vif-plugged-de8ff6be-8523-4d3e-b188-b4ca08b411a2 for instance with vm_state deleted and task_state None.
Sep 30 21:39:34 compute-1 nova_compute[192795]: 2025-09-30 21:39:34.312 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Received event network-vif-deleted-de8ff6be-8523-4d3e-b188-b4ca08b411a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:37 compute-1 nova_compute[192795]: 2025-09-30 21:39:37.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:37 compute-1 podman[240647]: 2025-09-30 21:39:37.246158832 +0000 UTC m=+0.083573189 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:39:37 compute-1 podman[240646]: 2025-09-30 21:39:37.262278753 +0000 UTC m=+0.101199260 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Sep 30 21:39:37 compute-1 podman[240648]: 2025-09-30 21:39:37.262395716 +0000 UTC m=+0.085533521 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:38 compute-1 nova_compute[192795]: 2025-09-30 21:39:38.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:38.698 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:38.699 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:39:38.700 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:42 compute-1 nova_compute[192795]: 2025-09-30 21:39:42.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:43 compute-1 nova_compute[192795]: 2025-09-30 21:39:43.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:46 compute-1 nova_compute[192795]: 2025-09-30 21:39:46.996 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268371.9940186, c5e53a18-5bb4-49dc-9ce7-6619957edd4e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:46 compute-1 nova_compute[192795]: 2025-09-30 21:39:46.996 2 INFO nova.compute.manager [-] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] VM Stopped (Lifecycle Event)
Sep 30 21:39:47 compute-1 nova_compute[192795]: 2025-09-30 21:39:47.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:47 compute-1 nova_compute[192795]: 2025-09-30 21:39:47.040 2 DEBUG nova.compute.manager [None req-e514b631-4022-4ca9-9c8e-950673a232f7 - - - - - -] [instance: c5e53a18-5bb4-49dc-9ce7-6619957edd4e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:48 compute-1 podman[240713]: 2025-09-30 21:39:48.219336584 +0000 UTC m=+0.060092290 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:39:48 compute-1 nova_compute[192795]: 2025-09-30 21:39:48.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:49 compute-1 nova_compute[192795]: 2025-09-30 21:39:49.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:49 compute-1 nova_compute[192795]: 2025-09-30 21:39:49.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:39:51 compute-1 nova_compute[192795]: 2025-09-30 21:39:51.712 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:52 compute-1 nova_compute[192795]: 2025-09-30 21:39:52.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:53 compute-1 nova_compute[192795]: 2025-09-30 21:39:53.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:53 compute-1 nova_compute[192795]: 2025-09-30 21:39:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:53 compute-1 nova_compute[192795]: 2025-09-30 21:39:53.859 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:53 compute-1 nova_compute[192795]: 2025-09-30 21:39:53.859 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:53 compute-1 nova_compute[192795]: 2025-09-30 21:39:53.860 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:53 compute-1 nova_compute[192795]: 2025-09-30 21:39:53.860 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.101 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.170 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.171 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.230 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.236 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.293 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.295 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.353 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.466 2 INFO nova.compute.manager [None req-14180219-96ed-4164-8d74-000a1e20786c 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Pausing
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.468 2 DEBUG nova.objects.instance [None req-14180219-96ed-4164-8d74-000a1e20786c 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'flavor' on Instance uuid 2c2f6f5b-4955-4915-b620-f377ca649c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.538 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.539 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5333MB free_disk=73.25857543945312GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.540 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.540 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.937 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 628fd442-ed35-482c-91db-4a57f527b6a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.937 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 2c2f6f5b-4955-4915-b620-f377ca649c75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.937 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.937 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.955 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268394.955232, 2c2f6f5b-4955-4915-b620-f377ca649c75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.956 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] VM Paused (Lifecycle Event)
Sep 30 21:39:54 compute-1 nova_compute[192795]: 2025-09-30 21:39:54.957 2 DEBUG nova.compute.manager [None req-14180219-96ed-4164-8d74-000a1e20786c 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:55 compute-1 nova_compute[192795]: 2025-09-30 21:39:55.050 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:55 compute-1 nova_compute[192795]: 2025-09-30 21:39:55.054 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:55 compute-1 nova_compute[192795]: 2025-09-30 21:39:55.097 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:55 compute-1 nova_compute[192795]: 2025-09-30 21:39:55.216 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:55 compute-1 nova_compute[192795]: 2025-09-30 21:39:55.499 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:39:55 compute-1 nova_compute[192795]: 2025-09-30 21:39:55.500 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.959s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:56 compute-1 podman[240750]: 2025-09-30 21:39:56.235516665 +0000 UTC m=+0.071317891 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:39:56 compute-1 podman[240748]: 2025-09-30 21:39:56.246604912 +0000 UTC m=+0.080091206 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:39:56 compute-1 podman[240749]: 2025-09-30 21:39:56.272140935 +0000 UTC m=+0.108417404 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:39:57 compute-1 nova_compute[192795]: 2025-09-30 21:39:57.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:57 compute-1 nova_compute[192795]: 2025-09-30 21:39:57.500 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:57 compute-1 nova_compute[192795]: 2025-09-30 21:39:57.501 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:57 compute-1 nova_compute[192795]: 2025-09-30 21:39:57.501 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:57 compute-1 nova_compute[192795]: 2025-09-30 21:39:57.502 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:39:58 compute-1 nova_compute[192795]: 2025-09-30 21:39:58.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:59 compute-1 nova_compute[192795]: 2025-09-30 21:39:59.889 2 INFO nova.compute.manager [None req-34ffb8c1-d375-40f9-80eb-963dce14ae83 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Unpausing
Sep 30 21:39:59 compute-1 nova_compute[192795]: 2025-09-30 21:39:59.891 2 DEBUG nova.objects.instance [None req-34ffb8c1-d375-40f9-80eb-963dce14ae83 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'flavor' on Instance uuid 2c2f6f5b-4955-4915-b620-f377ca649c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.180 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268400.1799746, 2c2f6f5b-4955-4915-b620-f377ca649c75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.180 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] VM Resumed (Lifecycle Event)
Sep 30 21:40:00 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.184 2 DEBUG nova.virt.libvirt.guest [None req-34ffb8c1-d375-40f9-80eb-963dce14ae83 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.185 2 DEBUG nova.compute.manager [None req-34ffb8c1-d375-40f9-80eb-963dce14ae83 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.311 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.318 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.387 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] During sync_power_state the instance has a pending task (unpausing). Skip.
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.852 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.853 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.881 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.896 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.896 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:00 compute-1 nova_compute[192795]: 2025-09-30 21:40:00.952 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.057 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.058 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.066 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.067 2 INFO nova.compute.claims [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.171 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.493 2 DEBUG nova.compute.provider_tree [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.539 2 DEBUG nova.scheduler.client.report [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.707 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.709 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.714 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.724 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:40:01 compute-1 nova_compute[192795]: 2025-09-30 21:40:01.724 2 INFO nova.compute.claims [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.034 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.034 2 DEBUG nova.network.neutron [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:02 compute-1 podman[240816]: 2025-09-30 21:40:02.226380195 +0000 UTC m=+0.068850124 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.314 2 DEBUG nova.policy [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8e4a8454b4d4d049dde1e287a040dfb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.319 2 INFO nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.458 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.570 2 DEBUG nova.compute.provider_tree [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.667 2 DEBUG nova.scheduler.client.report [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.752 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:02 compute-1 nova_compute[192795]: 2025-09-30 21:40:02.753 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.032 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.033 2 DEBUG nova.network.neutron [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.036 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.037 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.037 2 INFO nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Creating image(s)
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.037 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.038 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.038 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.053 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.113 2 INFO nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.131 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.132 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.133 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.144 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.166 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.210 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.211 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.260 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.261 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.261 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.332 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.334 2 DEBUG nova.virt.disk.api [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Checking if we can resize image /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.334 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.393 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.395 2 DEBUG nova.virt.disk.api [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Cannot resize image /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.395 2 DEBUG nova.objects.instance [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'migration_context' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.493 2 DEBUG nova.policy [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.507 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.508 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Ensure instance console log exists: /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.508 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.509 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.509 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.757 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.759 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.759 2 INFO nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Creating image(s)
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.760 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "/var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.760 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.761 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.775 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.837 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.838 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.838 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.849 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.907 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.909 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.946 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.947 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:03 compute-1 nova_compute[192795]: 2025-09-30 21:40:03.948 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.039 2 DEBUG nova.network.neutron [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Successfully created port: 54f28900-8a59-4c2b-b1a3-a7b618a894ce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.044 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.044 2 DEBUG nova.virt.disk.api [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Checking if we can resize image /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.045 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.124 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.125 2 DEBUG nova.virt.disk.api [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Cannot resize image /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.125 2 DEBUG nova.objects.instance [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'migration_context' on Instance uuid 829530ac-b0fb-4e39-896e-f01e78306ff8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.166 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.167 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Ensure instance console log exists: /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.167 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.168 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.168 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:04 compute-1 nova_compute[192795]: 2025-09-30 21:40:04.574 2 DEBUG nova.network.neutron [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Successfully created port: 2ece52f5-eb6d-4f35-a922-5b5e099858af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.103 2 DEBUG nova.network.neutron [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Successfully updated port: 54f28900-8a59-4c2b-b1a3-a7b618a894ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.188 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.189 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquired lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.189 2 DEBUG nova.network.neutron [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.239 2 DEBUG nova.compute.manager [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-changed-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.239 2 DEBUG nova.compute.manager [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Refreshing instance network info cache due to event network-changed-54f28900-8a59-4c2b-b1a3-a7b618a894ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.239 2 DEBUG oslo_concurrency.lockutils [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.422 2 DEBUG nova.network.neutron [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:06 compute-1 nova_compute[192795]: 2025-09-30 21:40:06.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.151 2 DEBUG nova.network.neutron [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Successfully updated port: 2ece52f5-eb6d-4f35-a922-5b5e099858af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.163 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.163 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.164 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.249 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "refresh_cache-829530ac-b0fb-4e39-896e-f01e78306ff8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.249 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquired lock "refresh_cache-829530ac-b0fb-4e39-896e-f01e78306ff8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.250 2 DEBUG nova.network.neutron [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.472 2 DEBUG nova.network.neutron [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:40:07 compute-1 nova_compute[192795]: 2025-09-30 21:40:07.950 2 DEBUG nova.network.neutron [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Updating instance_info_cache with network_info: [{"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.049 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Releasing lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.049 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance network_info: |[{"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.050 2 DEBUG oslo_concurrency.lockutils [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.050 2 DEBUG nova.network.neutron [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Refreshing network info cache for port 54f28900-8a59-4c2b-b1a3-a7b618a894ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.054 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Start _get_guest_xml network_info=[{"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.060 2 WARNING nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.066 2 DEBUG nova.virt.libvirt.host [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.066 2 DEBUG nova.virt.libvirt.host [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.069 2 DEBUG nova.virt.libvirt.host [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.070 2 DEBUG nova.virt.libvirt.host [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.071 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.072 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.072 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.072 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.072 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.073 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.073 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.073 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.073 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.073 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.074 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.074 2 DEBUG nova.virt.hardware [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.078 2 DEBUG nova.virt.libvirt.vif [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-442085525',display_name='tempest-ServerRescueNegativeTestJSON-server-442085525',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-442085525',id=132,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c29435f306af4eebb7d6cb5bb416037d',ramdisk_id='',reservation_id='r-4zqgi81x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-493519679',owner_user_name='tempest-ServerRescueNegativeTestJSON-493519679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:40:02Z,user_data=None,user_id='a8e4a8454b4d4d049dde1e287a040dfb',uuid=333cdbd1-23c9-422d-b896-f3c6b76d4130,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.078 2 DEBUG nova.network.os_vif_util [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converting VIF {"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.079 2 DEBUG nova.network.os_vif_util [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.079 2 DEBUG nova.objects.instance [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'pci_devices' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.105 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <uuid>333cdbd1-23c9-422d-b896-f3c6b76d4130</uuid>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <name>instance-00000084</name>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerRescueNegativeTestJSON-server-442085525</nova:name>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:40:08</nova:creationTime>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:user uuid="a8e4a8454b4d4d049dde1e287a040dfb">tempest-ServerRescueNegativeTestJSON-493519679-project-member</nova:user>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:project uuid="c29435f306af4eebb7d6cb5bb416037d">tempest-ServerRescueNegativeTestJSON-493519679</nova:project>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         <nova:port uuid="54f28900-8a59-4c2b-b1a3-a7b618a894ce">
Sep 30 21:40:08 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <system>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <entry name="serial">333cdbd1-23c9-422d-b896-f3c6b76d4130</entry>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <entry name="uuid">333cdbd1-23c9-422d-b896-f3c6b76d4130</entry>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </system>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <os>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   </os>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <features>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   </features>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:27:df:08"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <target dev="tap54f28900-8a"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/console.log" append="off"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <video>
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </video>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:40:08 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:40:08 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:40:08 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:40:08 compute-1 nova_compute[192795]: </domain>
Sep 30 21:40:08 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.105 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Preparing to wait for external event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.106 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.106 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.106 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.106 2 DEBUG nova.virt.libvirt.vif [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-442085525',display_name='tempest-ServerRescueNegativeTestJSON-server-442085525',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-442085525',id=132,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c29435f306af4eebb7d6cb5bb416037d',ramdisk_id='',reservation_id='r-4zqgi81x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-493519679',owner_user_name='tempest-ServerRescueNegativeTestJSON-493519679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:40:02Z,user_data=None,user_id='a8e4a8454b4d4d049dde1e287a040dfb',uuid=333cdbd1-23c9-422d-b896-f3c6b76d4130,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.107 2 DEBUG nova.network.os_vif_util [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converting VIF {"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.107 2 DEBUG nova.network.os_vif_util [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.107 2 DEBUG os_vif [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.109 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.112 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54f28900-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.112 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54f28900-8a, col_values=(('external_ids', {'iface-id': '54f28900-8a59-4c2b-b1a3-a7b618a894ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:df:08', 'vm-uuid': '333cdbd1-23c9-422d-b896-f3c6b76d4130'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:08 compute-1 NetworkManager[51724]: <info>  [1759268408.1553] manager: (tap54f28900-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.161 2 INFO os_vif [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a')
Sep 30 21:40:08 compute-1 podman[240877]: 2025-09-30 21:40:08.24991955 +0000 UTC m=+0.061171540 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:40:08 compute-1 podman[240876]: 2025-09-30 21:40:08.251220945 +0000 UTC m=+0.062345371 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:40:08 compute-1 podman[240873]: 2025-09-30 21:40:08.279580114 +0000 UTC m=+0.098771846 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7)
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.295 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.295 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.296 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No VIF found with MAC fa:16:3e:27:df:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.296 2 INFO nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Using config drive
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.804 2 DEBUG nova.compute.manager [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received event network-changed-2ece52f5-eb6d-4f35-a922-5b5e099858af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.804 2 DEBUG nova.compute.manager [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Refreshing instance network info cache due to event network-changed-2ece52f5-eb6d-4f35-a922-5b5e099858af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:40:08 compute-1 nova_compute[192795]: 2025-09-30 21:40:08.805 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-829530ac-b0fb-4e39-896e-f01e78306ff8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.380 2 DEBUG nova.network.neutron [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Updating instance_info_cache with network_info: [{"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.417 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Releasing lock "refresh_cache-829530ac-b0fb-4e39-896e-f01e78306ff8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.417 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance network_info: |[{"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.418 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-829530ac-b0fb-4e39-896e-f01e78306ff8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.419 2 DEBUG nova.network.neutron [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Refreshing network info cache for port 2ece52f5-eb6d-4f35-a922-5b5e099858af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.424 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Start _get_guest_xml network_info=[{"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.432 2 WARNING nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.438 2 DEBUG nova.virt.libvirt.host [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.439 2 DEBUG nova.virt.libvirt.host [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.450 2 DEBUG nova.virt.libvirt.host [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.450 2 DEBUG nova.virt.libvirt.host [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.452 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.453 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.453 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.454 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.454 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.455 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.455 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.456 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.456 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.457 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.457 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.457 2 DEBUG nova.virt.hardware [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.464 2 DEBUG nova.virt.libvirt.vif [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-591878727',display_name='tempest-ServersTestJSON-server-591878727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-591878727',id=133,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-utemcsfg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:40:03Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=829530ac-b0fb-4e39-896e-f01e78306ff8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.465 2 DEBUG nova.network.os_vif_util [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.466 2 DEBUG nova.network.os_vif_util [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:aa:47,bridge_name='br-int',has_traffic_filtering=True,id=2ece52f5-eb6d-4f35-a922-5b5e099858af,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ece52f5-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.468 2 DEBUG nova.objects.instance [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 829530ac-b0fb-4e39-896e-f01e78306ff8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.474 2 INFO nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Creating config drive at /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.481 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7t8h2hjr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.529 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Updating instance_info_cache with network_info: [{"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.558 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <uuid>829530ac-b0fb-4e39-896e-f01e78306ff8</uuid>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <name>instance-00000085</name>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersTestJSON-server-591878727</nova:name>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:40:09</nova:creationTime>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:user uuid="30d0a975d78c4d9a8e2201afdc040092">tempest-ServersTestJSON-782690373-project-member</nova:user>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:project uuid="8ad754242d964bb487a2174b2c21bcc5">tempest-ServersTestJSON-782690373</nova:project>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         <nova:port uuid="2ece52f5-eb6d-4f35-a922-5b5e099858af">
Sep 30 21:40:09 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <system>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <entry name="serial">829530ac-b0fb-4e39-896e-f01e78306ff8</entry>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <entry name="uuid">829530ac-b0fb-4e39-896e-f01e78306ff8</entry>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </system>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <os>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   </os>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <features>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   </features>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk.config"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:c6:aa:47"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <target dev="tap2ece52f5-eb"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/console.log" append="off"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <video>
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </video>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:40:09 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:40:09 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:40:09 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:40:09 compute-1 nova_compute[192795]: </domain>
Sep 30 21:40:09 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.559 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Preparing to wait for external event network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.559 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.559 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.560 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.561 2 DEBUG nova.virt.libvirt.vif [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-591878727',display_name='tempest-ServersTestJSON-server-591878727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-591878727',id=133,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-utemcsfg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},t
ags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:40:03Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=829530ac-b0fb-4e39-896e-f01e78306ff8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.561 2 DEBUG nova.network.os_vif_util [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.562 2 DEBUG nova.network.os_vif_util [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:aa:47,bridge_name='br-int',has_traffic_filtering=True,id=2ece52f5-eb6d-4f35-a922-5b5e099858af,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ece52f5-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.562 2 DEBUG os_vif [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:aa:47,bridge_name='br-int',has_traffic_filtering=True,id=2ece52f5-eb6d-4f35-a922-5b5e099858af,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ece52f5-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.568 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ece52f5-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2ece52f5-eb, col_values=(('external_ids', {'iface-id': '2ece52f5-eb6d-4f35-a922-5b5e099858af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:aa:47', 'vm-uuid': '829530ac-b0fb-4e39-896e-f01e78306ff8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:09 compute-1 NetworkManager[51724]: <info>  [1759268409.5729] manager: (tap2ece52f5-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.580 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.580 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.587 2 INFO os_vif [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:aa:47,bridge_name='br-int',has_traffic_filtering=True,id=2ece52f5-eb6d-4f35-a922-5b5e099858af,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ece52f5-eb')
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.630 2 DEBUG oslo_concurrency.processutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7t8h2hjr" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.649 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.650 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.650 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No VIF found with MAC fa:16:3e:c6:aa:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.651 2 INFO nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Using config drive
Sep 30 21:40:09 compute-1 kernel: tap54f28900-8a: entered promiscuous mode
Sep 30 21:40:09 compute-1 NetworkManager[51724]: <info>  [1759268409.7369] manager: (tap54f28900-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Sep 30 21:40:09 compute-1 ovn_controller[94902]: 2025-09-30T21:40:09Z|00488|binding|INFO|Claiming lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce for this chassis.
Sep 30 21:40:09 compute-1 ovn_controller[94902]: 2025-09-30T21:40:09Z|00489|binding|INFO|54f28900-8a59-4c2b-b1a3-a7b618a894ce: Claiming fa:16:3e:27:df:08 10.100.0.12
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.770 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:df:08 10.100.0.12'], port_security=['fa:16:3e:27:df:08 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '333cdbd1-23c9-422d-b896-f3c6b76d4130', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a32e3a5-ee38-4fae-9fbe-f0444b488d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3ff2921-853e-4756-b8d5-05a55aa79dbf, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=54f28900-8a59-4c2b-b1a3-a7b618a894ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.772 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 54f28900-8a59-4c2b-b1a3-a7b618a894ce in datapath 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a bound to our chassis
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.774 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.792 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[593fca22-3f30-4337-a7b4-650aca605579]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.793 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f7a3c1e-01 in ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:40:09 compute-1 systemd-udevd[240957]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.796 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f7a3c1e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.796 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe4507b-c863-4182-acdc-71d43437dd0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 systemd-machined[152783]: New machine qemu-60-instance-00000084.
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.798 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[403913f2-beb6-4f86-b23e-72cdf92b7748]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 NetworkManager[51724]: <info>  [1759268409.8159] device (tap54f28900-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:40:09 compute-1 NetworkManager[51724]: <info>  [1759268409.8171] device (tap54f28900-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.816 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[fac9a9e5-7a1a-4fb3-82b0-ed5b3978edfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 systemd[1]: Started Virtual Machine qemu-60-instance-00000084.
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.841 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0ca7db-b022-4769-acfd-fbd701c3df20]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 ovn_controller[94902]: 2025-09-30T21:40:09Z|00490|binding|INFO|Setting lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce ovn-installed in OVS
Sep 30 21:40:09 compute-1 ovn_controller[94902]: 2025-09-30T21:40:09Z|00491|binding|INFO|Setting lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce up in Southbound
Sep 30 21:40:09 compute-1 nova_compute[192795]: 2025-09-30 21:40:09.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.882 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b9a015-2f17-4acd-a65e-9571a64f37cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 NetworkManager[51724]: <info>  [1759268409.8893] manager: (tap9f7a3c1e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/247)
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.889 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2c48e893-47f5-4d1b-afda-09c3edca24d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 systemd-udevd[240960]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.930 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b49752bc-648b-4653-b5a6-296769b76d22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.933 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[663b00a6-c0af-4f8f-8023-e217a09a0717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 NetworkManager[51724]: <info>  [1759268409.9601] device (tap9f7a3c1e-00): carrier: link connected
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.967 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[14f81700-41ee-40fa-b8c3-afb38d77caa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:09.986 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd74f0d-52a7-42d0-93be-d3b0b0112e36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f7a3c1e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:db:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510563, 'reachable_time': 17209, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240993, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.007 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f52083b6-ff38-4d0b-9167-190792736226]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:dba1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510563, 'tstamp': 510563}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240994, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.026 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0c853e53-7ab2-46f3-80b4-d46c5fd7f567]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f7a3c1e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:db:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510563, 'reachable_time': 17209, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240995, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.071 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ade92710-8f8e-468e-b651-a8a688b7c7f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.112 2 INFO nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Creating config drive at /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk.config
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.117 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphrupwd76 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.153 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e7948d65-dc5c-4fb4-9767-71a7f8bb85ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.155 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f7a3c1e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.156 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.156 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f7a3c1e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 NetworkManager[51724]: <info>  [1759268410.1592] manager: (tap9f7a3c1e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Sep 30 21:40:10 compute-1 kernel: tap9f7a3c1e-00: entered promiscuous mode
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.168 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f7a3c1e-00, col_values=(('external_ids', {'iface-id': 'b7e0b2bc-8210-4854-9253-4d8208499194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 ovn_controller[94902]: 2025-09-30T21:40:10Z|00492|binding|INFO|Releasing lport b7e0b2bc-8210-4854-9253-4d8208499194 from this chassis (sb_readonly=0)
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.204 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.205 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b5bb799c-323c-49c9-902e-473ef778b1b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.206 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.208 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'env', 'PROCESS_TAG=haproxy-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.215 2 DEBUG nova.compute.manager [req-2cd913df-a7a6-4eb8-b725-ae194b7dc658 req-94236971-da3b-4c8d-ba04-14673add4727 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.216 2 DEBUG oslo_concurrency.lockutils [req-2cd913df-a7a6-4eb8-b725-ae194b7dc658 req-94236971-da3b-4c8d-ba04-14673add4727 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.216 2 DEBUG oslo_concurrency.lockutils [req-2cd913df-a7a6-4eb8-b725-ae194b7dc658 req-94236971-da3b-4c8d-ba04-14673add4727 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.216 2 DEBUG oslo_concurrency.lockutils [req-2cd913df-a7a6-4eb8-b725-ae194b7dc658 req-94236971-da3b-4c8d-ba04-14673add4727 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.217 2 DEBUG nova.compute.manager [req-2cd913df-a7a6-4eb8-b725-ae194b7dc658 req-94236971-da3b-4c8d-ba04-14673add4727 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Processing event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.254 2 DEBUG oslo_concurrency.processutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphrupwd76" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:10 compute-1 NetworkManager[51724]: <info>  [1759268410.3344] manager: (tap2ece52f5-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Sep 30 21:40:10 compute-1 systemd-udevd[240975]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:40:10 compute-1 kernel: tap2ece52f5-eb: entered promiscuous mode
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 ovn_controller[94902]: 2025-09-30T21:40:10Z|00493|binding|INFO|Claiming lport 2ece52f5-eb6d-4f35-a922-5b5e099858af for this chassis.
Sep 30 21:40:10 compute-1 ovn_controller[94902]: 2025-09-30T21:40:10Z|00494|binding|INFO|2ece52f5-eb6d-4f35-a922-5b5e099858af: Claiming fa:16:3e:c6:aa:47 10.100.0.8
Sep 30 21:40:10 compute-1 NetworkManager[51724]: <info>  [1759268410.3501] device (tap2ece52f5-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:40:10 compute-1 NetworkManager[51724]: <info>  [1759268410.3513] device (tap2ece52f5-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:40:10 compute-1 ovn_controller[94902]: 2025-09-30T21:40:10Z|00495|binding|INFO|Setting lport 2ece52f5-eb6d-4f35-a922-5b5e099858af ovn-installed in OVS
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 ovn_controller[94902]: 2025-09-30T21:40:10Z|00496|binding|INFO|Setting lport 2ece52f5-eb6d-4f35-a922-5b5e099858af up in Southbound
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.385 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:aa:47 10.100.0.8'], port_security=['fa:16:3e:c6:aa:47 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '829530ac-b0fb-4e39-896e-f01e78306ff8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2ece52f5-eb6d-4f35-a922-5b5e099858af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 systemd-machined[152783]: New machine qemu-61-instance-00000085.
Sep 30 21:40:10 compute-1 systemd[1]: Started Virtual Machine qemu-61-instance-00000085.
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.542 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 podman[241054]: 2025-09-30 21:40:10.609545785 +0000 UTC m=+0.054062208 container create 538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:40:10 compute-1 systemd[1]: Started libpod-conmon-538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c.scope.
Sep 30 21:40:10 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:40:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1572bfea610af20b5dcb270ea3120d3604316cfacf33a727f5b1b0732097348/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.672 2 DEBUG nova.network.neutron [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Updated VIF entry in instance network info cache for port 54f28900-8a59-4c2b-b1a3-a7b618a894ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.672 2 DEBUG nova.network.neutron [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Updating instance_info_cache with network_info: [{"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:10 compute-1 podman[241054]: 2025-09-30 21:40:10.58057873 +0000 UTC m=+0.025095173 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:40:10 compute-1 podman[241054]: 2025-09-30 21:40:10.689460544 +0000 UTC m=+0.133976987 container init 538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:40:10 compute-1 podman[241054]: 2025-09-30 21:40:10.695022083 +0000 UTC m=+0.139538506 container start 538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.708 2 DEBUG oslo_concurrency.lockutils [req-4e424bf8-3c60-41d5-8996-f84b189c9fa5 req-d18ce1ab-e106-4b43-9c6a-1077bb5765e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:10 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241070]: [NOTICE]   (241074) : New worker (241076) forked
Sep 30 21:40:10 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241070]: [NOTICE]   (241074) : Loading success.
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.767 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2ece52f5-eb6d-4f35-a922-5b5e099858af in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.771 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.792 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8735f7f7-8882-4b07-9a70-23ed06832a00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.834 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ef89bc85-3b8b-44b7-9a97-e3bd949f9da1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.840 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[558ad726-f3c9-4a92-a8f3-95a20bf081c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.881 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[eec1b1ac-9f44-47c7-9d66-e8b57c05126d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.902 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[32dd17b8-973a-4fb4-bbad-55e56858746d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 916, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 916, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498413, 'reachable_time': 26194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241097, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.925 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad38948-8f09-4556-82c8-082faee7663e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498426, 'tstamp': 498426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241102, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498430, 'tstamp': 498430}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241102, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.928 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 nova_compute[192795]: 2025-09-30 21:40:10.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.934 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.934 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.935 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.935 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:10 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:10.937 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.375 2 DEBUG nova.network.neutron [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Updated VIF entry in instance network info cache for port 2ece52f5-eb6d-4f35-a922-5b5e099858af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.376 2 DEBUG nova.network.neutron [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Updating instance_info_cache with network_info: [{"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.389 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.390 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268411.3893986, 333cdbd1-23c9-422d-b896-f3c6b76d4130 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.390 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] VM Started (Lifecycle Event)
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.395 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.399 2 INFO nova.virt.libvirt.driver [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance spawned successfully.
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.399 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.477 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-829530ac-b0fb-4e39-896e-f01e78306ff8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.501 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.506 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.507 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.507 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.507 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.508 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.508 2 DEBUG nova.virt.libvirt.driver [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.513 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.567 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.568 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268411.391283, 333cdbd1-23c9-422d-b896-f3c6b76d4130 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.568 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] VM Paused (Lifecycle Event)
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.635 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.639 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.641 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268411.394664, 333cdbd1-23c9-422d-b896-f3c6b76d4130 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.641 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] VM Resumed (Lifecycle Event)
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.778 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.779 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.783 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.819 2 INFO nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Took 8.78 seconds to spawn the instance on the hypervisor.
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.820 2 DEBUG nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.832 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.833 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268411.4591246, 829530ac-b0fb-4e39-896e-f01e78306ff8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:11 compute-1 nova_compute[192795]: 2025-09-30 21:40:11.833 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] VM Started (Lifecycle Event)
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.138 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.139 2 INFO nova.compute.manager [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Took 11.15 seconds to build instance.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.144 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268411.4592671, 829530ac-b0fb-4e39-896e-f01e78306ff8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.144 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] VM Paused (Lifecycle Event)
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.168 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.172 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.177 2 DEBUG oslo_concurrency.lockutils [None req-45516bd2-1126-4022-bb9c-83b6a83627f6 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.206 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.308 2 DEBUG nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.308 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.308 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.308 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.308 2 DEBUG nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] No waiting events found dispatching network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.309 2 WARNING nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received unexpected event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce for instance with vm_state active and task_state None.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.309 2 DEBUG nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received event network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.309 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.309 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.309 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.309 2 DEBUG nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Processing event network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.310 2 DEBUG nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received event network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.310 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.310 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.310 2 DEBUG oslo_concurrency.lockutils [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.310 2 DEBUG nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] No waiting events found dispatching network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.310 2 WARNING nova.compute.manager [req-434ea2ad-c20b-4a80-bfd7-0b001ee67768 req-c014b521-4484-4337-b5c2-38cd3734e99e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received unexpected event network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af for instance with vm_state building and task_state spawning.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.311 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.315 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268412.3147147, 829530ac-b0fb-4e39-896e-f01e78306ff8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.315 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] VM Resumed (Lifecycle Event)
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.317 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.322 2 INFO nova.virt.libvirt.driver [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance spawned successfully.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.322 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.339 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.344 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.358 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.359 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.359 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.360 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.360 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.361 2 DEBUG nova.virt.libvirt.driver [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.371 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.441 2 INFO nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Took 8.68 seconds to spawn the instance on the hypervisor.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.441 2 DEBUG nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.562 2 INFO nova.compute.manager [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Took 11.52 seconds to build instance.
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.582 2 DEBUG oslo_concurrency.lockutils [None req-adf217fb-ab36-4353-9204-ecfa6f716d39 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:40:12 compute-1 nova_compute[192795]: 2025-09-30 21:40:12.721 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:40:13 compute-1 nova_compute[192795]: 2025-09-30 21:40:13.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:13 compute-1 nova_compute[192795]: 2025-09-30 21:40:13.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:13 compute-1 nova_compute[192795]: 2025-09-30 21:40:13.855 2 INFO nova.compute.manager [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Rescuing
Sep 30 21:40:13 compute-1 nova_compute[192795]: 2025-09-30 21:40:13.855 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:13 compute-1 nova_compute[192795]: 2025-09-30 21:40:13.856 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquired lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:13 compute-1 nova_compute[192795]: 2025-09-30 21:40:13.856 2 DEBUG nova.network.neutron [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:40:14 compute-1 nova_compute[192795]: 2025-09-30 21:40:14.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.044 2 DEBUG oslo_concurrency.lockutils [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.045 2 DEBUG oslo_concurrency.lockutils [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.045 2 DEBUG nova.compute.manager [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.050 2 DEBUG nova.compute.manager [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.051 2 DEBUG nova.objects.instance [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'flavor' on Instance uuid 829530ac-b0fb-4e39-896e-f01e78306ff8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.096 2 DEBUG nova.objects.instance [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'info_cache' on Instance uuid 829530ac-b0fb-4e39-896e-f01e78306ff8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.131 2 DEBUG nova.network.neutron [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Updating instance_info_cache with network_info: [{"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.171 2 DEBUG nova.virt.libvirt.driver [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.177 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Releasing lock "refresh_cache-333cdbd1-23c9-422d-b896-f3c6b76d4130" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:15.938 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:15 compute-1 nova_compute[192795]: 2025-09-30 21:40:15.963 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:40:18 compute-1 nova_compute[192795]: 2025-09-30 21:40:18.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:19 compute-1 podman[241106]: 2025-09-30 21:40:19.271326514 +0000 UTC m=+0.096122895 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:40:19 compute-1 nova_compute[192795]: 2025-09-30 21:40:19.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:23 compute-1 nova_compute[192795]: 2025-09-30 21:40:23.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:24 compute-1 nova_compute[192795]: 2025-09-30 21:40:24.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:25 compute-1 nova_compute[192795]: 2025-09-30 21:40:25.236 2 DEBUG nova.virt.libvirt.driver [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:40:25 compute-1 ovn_controller[94902]: 2025-09-30T21:40:25Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:df:08 10.100.0.12
Sep 30 21:40:25 compute-1 ovn_controller[94902]: 2025-09-30T21:40:25Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:df:08 10.100.0.12
Sep 30 21:40:26 compute-1 nova_compute[192795]: 2025-09-30 21:40:26.019 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:40:27 compute-1 podman[241160]: 2025-09-30 21:40:27.257094401 +0000 UTC m=+0.086731613 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:40:27 compute-1 podman[241162]: 2025-09-30 21:40:27.25890777 +0000 UTC m=+0.080730033 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:40:27 compute-1 podman[241161]: 2025-09-30 21:40:27.31009403 +0000 UTC m=+0.135453148 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:40:28 compute-1 kernel: tap54f28900-8a (unregistering): left promiscuous mode
Sep 30 21:40:28 compute-1 NetworkManager[51724]: <info>  [1759268428.1824] device (tap54f28900-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00497|binding|INFO|Releasing lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce from this chassis (sb_readonly=0)
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00498|binding|INFO|Setting lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce down in Southbound
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00499|binding|INFO|Removing iface tap54f28900-8a ovn-installed in OVS
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 kernel: tap2ece52f5-eb (unregistering): left promiscuous mode
Sep 30 21:40:28 compute-1 NetworkManager[51724]: <info>  [1759268428.2179] device (tap2ece52f5-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00500|binding|INFO|Releasing lport 2ece52f5-eb6d-4f35-a922-5b5e099858af from this chassis (sb_readonly=1)
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00501|binding|INFO|Removing iface tap2ece52f5-eb ovn-installed in OVS
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00502|if_status|INFO|Dropped 2 log messages in last 224 seconds (most recently, 224 seconds ago) due to excessive rate
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00503|if_status|INFO|Not setting lport 2ece52f5-eb6d-4f35-a922-5b5e099858af down as sb is readonly
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.231 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:df:08 10.100.0.12'], port_security=['fa:16:3e:27:df:08 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '333cdbd1-23c9-422d-b896-f3c6b76d4130', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a32e3a5-ee38-4fae-9fbe-f0444b488d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3ff2921-853e-4756-b8d5-05a55aa79dbf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=54f28900-8a59-4c2b-b1a3-a7b618a894ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.232 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 54f28900-8a59-4c2b-b1a3-a7b618a894ce in datapath 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a unbound from our chassis
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.234 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:40:28 compute-1 ovn_controller[94902]: 2025-09-30T21:40:28Z|00504|binding|INFO|Setting lport 2ece52f5-eb6d-4f35-a922-5b5e099858af down in Southbound
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.237 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0689c07e-3021-4b07-8efe-c5f685b0bf53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.239 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a namespace which is not needed anymore
Sep 30 21:40:28 compute-1 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000084.scope: Deactivated successfully.
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000084.scope: Consumed 15.308s CPU time.
Sep 30 21:40:28 compute-1 systemd-machined[152783]: Machine qemu-60-instance-00000084 terminated.
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.255 2 INFO nova.virt.libvirt.driver [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance shutdown successfully after 13 seconds.
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.271 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:aa:47 10.100.0.8'], port_security=['fa:16:3e:c6:aa:47 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '829530ac-b0fb-4e39-896e-f01e78306ff8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2ece52f5-eb6d-4f35-a922-5b5e099858af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:28 compute-1 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000085.scope: Deactivated successfully.
Sep 30 21:40:28 compute-1 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000085.scope: Consumed 14.412s CPU time.
Sep 30 21:40:28 compute-1 systemd-machined[152783]: Machine qemu-61-instance-00000085 terminated.
Sep 30 21:40:28 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241070]: [NOTICE]   (241074) : haproxy version is 2.8.14-c23fe91
Sep 30 21:40:28 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241070]: [NOTICE]   (241074) : path to executable is /usr/sbin/haproxy
Sep 30 21:40:28 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241070]: [WARNING]  (241074) : Exiting Master process...
Sep 30 21:40:28 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241070]: [ALERT]    (241074) : Current worker (241076) exited with code 143 (Terminated)
Sep 30 21:40:28 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241070]: [WARNING]  (241074) : All workers exited. Exiting... (0)
Sep 30 21:40:28 compute-1 systemd[1]: libpod-538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c.scope: Deactivated successfully.
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 podman[241259]: 2025-09-30 21:40:28.430744488 +0000 UTC m=+0.070622052 container died 538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 21:40:28 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:40:28 compute-1 systemd[1]: var-lib-containers-storage-overlay-b1572bfea610af20b5dcb270ea3120d3604316cfacf33a727f5b1b0732097348-merged.mount: Deactivated successfully.
Sep 30 21:40:28 compute-1 podman[241259]: 2025-09-30 21:40:28.48943905 +0000 UTC m=+0.129316594 container cleanup 538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:40:28 compute-1 systemd[1]: libpod-conmon-538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c.scope: Deactivated successfully.
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.539 2 INFO nova.virt.libvirt.driver [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance destroyed successfully.
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.539 2 DEBUG nova.objects.instance [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 829530ac-b0fb-4e39-896e-f01e78306ff8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.562 2 DEBUG nova.compute.manager [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:28 compute-1 podman[241317]: 2025-09-30 21:40:28.572219477 +0000 UTC m=+0.055134918 container remove 538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.578 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e8623a14-282c-4b50-97c2-6a1ac15be222]: (4, ('Tue Sep 30 09:40:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a (538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c)\n538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c\nTue Sep 30 09:40:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a (538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c)\n538e08a41dd7b0e5143f8449a3dc2b3b88513aba85b76dcf25d2ef254070694c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.581 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5918d9-a569-4cad-85a5-6424aa094347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.582 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f7a3c1e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.584 2 DEBUG nova.compute.manager [req-6a79a3ad-e49c-4e55-b094-3755508c0ee0 req-79d48f55-2cf0-46eb-8dea-9fec28804466 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-unplugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.584 2 DEBUG oslo_concurrency.lockutils [req-6a79a3ad-e49c-4e55-b094-3755508c0ee0 req-79d48f55-2cf0-46eb-8dea-9fec28804466 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.585 2 DEBUG oslo_concurrency.lockutils [req-6a79a3ad-e49c-4e55-b094-3755508c0ee0 req-79d48f55-2cf0-46eb-8dea-9fec28804466 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.585 2 DEBUG oslo_concurrency.lockutils [req-6a79a3ad-e49c-4e55-b094-3755508c0ee0 req-79d48f55-2cf0-46eb-8dea-9fec28804466 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.585 2 DEBUG nova.compute.manager [req-6a79a3ad-e49c-4e55-b094-3755508c0ee0 req-79d48f55-2cf0-46eb-8dea-9fec28804466 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] No waiting events found dispatching network-vif-unplugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.585 2 WARNING nova.compute.manager [req-6a79a3ad-e49c-4e55-b094-3755508c0ee0 req-79d48f55-2cf0-46eb-8dea-9fec28804466 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received unexpected event network-vif-unplugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce for instance with vm_state active and task_state rescuing.
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 kernel: tap9f7a3c1e-00: left promiscuous mode
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.609 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d6825459-1bed-4d8e-a8cc-1ac85f7c7ea0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.636 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe3c556-0a4a-479a-b17d-46dbbad598ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.638 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[49a7de54-180c-432b-8c81-941d61e23ba7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.656 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0e2ccf76-b84e-4eee-bdf5-1d4acde18cee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510554, 'reachable_time': 35256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241345, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.659 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.659 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[5721d173-d9f7-4ef5-8961-4007675dac89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.660 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2ece52f5-eb6d-4f35-a922-5b5e099858af in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:40:28 compute-1 systemd[1]: run-netns-ovnmeta\x2d9f7a3c1e\x2d01ae\x2d4ec3\x2da5e2\x2d23ef8435e53a.mount: Deactivated successfully.
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.661 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.685 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb1bb8c-295e-47ec-87e5-45dd014c98a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.730 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5eb475-95bd-40fa-9e5f-8e1655c94b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.733 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f72e88-5fec-4f21-9889-85e2a8411ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.782 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[297c6105-d651-4906-bfd5-2245b61a144d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.804 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[51d94a26-1d57-4e3a-9382-0b2aa8527325]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 916, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 916, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498413, 'reachable_time': 26194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241351, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.833 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[31e9cb4b-f410-4fbc-a994-86c23032e654]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498426, 'tstamp': 498426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241352, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap27086519-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498430, 'tstamp': 498430}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241352, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.835 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.849 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.849 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.849 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:28.850 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:28 compute-1 nova_compute[192795]: 2025-09-30 21:40:28.925 2 DEBUG oslo_concurrency.lockutils [None req-4f0881dc-5074-498c-9c6b-8fdc5389132a 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.036 2 INFO nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance shutdown successfully after 13 seconds.
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.047 2 INFO nova.virt.libvirt.driver [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance destroyed successfully.
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.048 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'numa_topology' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.156 2 INFO nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Attempting rescue
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.157 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.163 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.164 2 INFO nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Creating image(s)
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.165 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.166 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.167 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.168 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.245 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.247 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.272 2 DEBUG oslo_concurrency.processutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.363 2 DEBUG oslo_concurrency.processutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.365 2 DEBUG oslo_concurrency.processutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.422 2 DEBUG oslo_concurrency.processutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.rescue" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.424 2 DEBUG oslo_concurrency.lockutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.425 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'migration_context' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.498 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.499 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Start _get_guest_xml network_info=[{"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "vif_mac": "fa:16:3e:27:df:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '86b6907c-d747-4e98-8897-42105915831d', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.500 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'resources' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.578 2 WARNING nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.590 2 DEBUG nova.virt.libvirt.host [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.591 2 DEBUG nova.virt.libvirt.host [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.596 2 DEBUG nova.virt.libvirt.host [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.597 2 DEBUG nova.virt.libvirt.host [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.600 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.601 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.602 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.602 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.603 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.603 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.604 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.604 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.605 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.605 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.606 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.606 2 DEBUG nova.virt.hardware [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.607 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.714 2 DEBUG nova.virt.libvirt.vif [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-442085525',display_name='tempest-ServerRescueNegativeTestJSON-server-442085525',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-442085525',id=132,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:40:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c29435f306af4eebb7d6cb5bb416037d',ramdisk_id='',reservation_id='r-4zqgi81x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-493519679',owner_user_name='tempest-ServerRescueNegativeTestJSON-493519679-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:40:11Z,user_data=None,user_id='a8e4a8454b4d4d049dde1e287a040dfb',uuid=333cdbd1-23c9-422d-b896-f3c6b76d4130,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "vif_mac": "fa:16:3e:27:df:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.715 2 DEBUG nova.network.os_vif_util [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converting VIF {"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "vif_mac": "fa:16:3e:27:df:08"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.716 2 DEBUG nova.network.os_vif_util [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.717 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'pci_devices' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.762 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <uuid>333cdbd1-23c9-422d-b896-f3c6b76d4130</uuid>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <name>instance-00000084</name>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <nova:name>tempest-ServerRescueNegativeTestJSON-server-442085525</nova:name>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:40:29</nova:creationTime>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:user uuid="a8e4a8454b4d4d049dde1e287a040dfb">tempest-ServerRescueNegativeTestJSON-493519679-project-member</nova:user>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:project uuid="c29435f306af4eebb7d6cb5bb416037d">tempest-ServerRescueNegativeTestJSON-493519679</nova:project>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         <nova:port uuid="54f28900-8a59-4c2b-b1a3-a7b618a894ce">
Sep 30 21:40:29 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <system>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <entry name="serial">333cdbd1-23c9-422d-b896-f3c6b76d4130</entry>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <entry name="uuid">333cdbd1-23c9-422d-b896-f3c6b76d4130</entry>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </system>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <os>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   </os>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <features>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   </features>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.rescue"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <target dev="vdb" bus="virtio"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config.rescue"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:27:df:08"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <target dev="tap54f28900-8a"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/console.log" append="off"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <video>
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </video>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:40:29 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:40:29 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:40:29 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:40:29 compute-1 nova_compute[192795]: </domain>
Sep 30 21:40:29 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.773 2 INFO nova.virt.libvirt.driver [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance destroyed successfully.
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.878 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.879 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:29 compute-1 nova_compute[192795]: 2025-09-30 21:40:29.879 2 INFO nova.compute.manager [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Shelving
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.005 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.006 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.007 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.007 2 DEBUG nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No VIF found with MAC fa:16:3e:27:df:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.008 2 INFO nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Using config drive
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.116 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'ec2_ids' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.176 2 DEBUG nova.objects.instance [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'keypairs' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.223 2 DEBUG nova.virt.libvirt.driver [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.680 2 INFO nova.virt.libvirt.driver [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Creating config drive at /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config.rescue
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.691 2 DEBUG oslo_concurrency.processutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvajzqcs9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.847 2 DEBUG oslo_concurrency.processutils [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvajzqcs9" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.908 2 DEBUG nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.909 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.910 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.910 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.911 2 DEBUG nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] No waiting events found dispatching network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.911 2 WARNING nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received unexpected event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce for instance with vm_state active and task_state rescuing.
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.912 2 DEBUG nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received event network-vif-unplugged-2ece52f5-eb6d-4f35-a922-5b5e099858af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.912 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.912 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.913 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.913 2 DEBUG nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] No waiting events found dispatching network-vif-unplugged-2ece52f5-eb6d-4f35-a922-5b5e099858af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.913 2 WARNING nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received unexpected event network-vif-unplugged-2ece52f5-eb6d-4f35-a922-5b5e099858af for instance with vm_state stopped and task_state None.
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.914 2 DEBUG nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received event network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.914 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.915 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.915 2 DEBUG oslo_concurrency.lockutils [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.915 2 DEBUG nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] No waiting events found dispatching network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.916 2 WARNING nova.compute.manager [req-e9188bb5-6bff-4c2a-9dc8-4ac454b5f55f req-82bfd09f-1762-46a9-9532-dd6f93f3003c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received unexpected event network-vif-plugged-2ece52f5-eb6d-4f35-a922-5b5e099858af for instance with vm_state stopped and task_state None.
Sep 30 21:40:30 compute-1 kernel: tap54f28900-8a: entered promiscuous mode
Sep 30 21:40:30 compute-1 systemd-udevd[241232]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:40:30 compute-1 NetworkManager[51724]: <info>  [1759268430.9479] manager: (tap54f28900-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Sep 30 21:40:30 compute-1 ovn_controller[94902]: 2025-09-30T21:40:30Z|00505|binding|INFO|Claiming lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce for this chassis.
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:30 compute-1 ovn_controller[94902]: 2025-09-30T21:40:30Z|00506|binding|INFO|54f28900-8a59-4c2b-b1a3-a7b618a894ce: Claiming fa:16:3e:27:df:08 10.100.0.12
Sep 30 21:40:30 compute-1 NetworkManager[51724]: <info>  [1759268430.9622] device (tap54f28900-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:40:30 compute-1 NetworkManager[51724]: <info>  [1759268430.9656] device (tap54f28900-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:40:30 compute-1 ovn_controller[94902]: 2025-09-30T21:40:30Z|00507|binding|INFO|Setting lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce ovn-installed in OVS
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:30 compute-1 nova_compute[192795]: 2025-09-30 21:40:30.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:31 compute-1 systemd-machined[152783]: New machine qemu-62-instance-00000084.
Sep 30 21:40:31 compute-1 systemd[1]: Started Virtual Machine qemu-62-instance-00000084.
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.079 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:df:08 10.100.0.12'], port_security=['fa:16:3e:27:df:08 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '333cdbd1-23c9-422d-b896-f3c6b76d4130', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7a32e3a5-ee38-4fae-9fbe-f0444b488d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3ff2921-853e-4756-b8d5-05a55aa79dbf, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=54f28900-8a59-4c2b-b1a3-a7b618a894ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.080 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 54f28900-8a59-4c2b-b1a3-a7b618a894ce in datapath 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a bound to our chassis
Sep 30 21:40:31 compute-1 ovn_controller[94902]: 2025-09-30T21:40:31Z|00508|binding|INFO|Setting lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce up in Southbound
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.081 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.104 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[566b33ca-63ea-46ad-938a-2144e767a1f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.105 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f7a3c1e-01 in ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.108 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f7a3c1e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.109 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cafddbd8-48ca-4890-98af-66980eb1896b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.110 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e03e2429-4177-42d6-aba8-b12638901509]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.125 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ef75da-f98f-4e03-a48e-d897e2aea20c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.146 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[525d0a91-198a-4325-87bb-f4cda7886483]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.180 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[22f709e4-cdcd-4758-a6de-839e04512424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 NetworkManager[51724]: <info>  [1759268431.1927] manager: (tap9f7a3c1e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/251)
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.194 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[eaade59a-340f-4a74-99fb-83ca8d8c5204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 systemd-udevd[241400]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.243 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7654db0a-a8b7-438c-8668-f84685088bc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.248 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2b0a1dd9-bd1a-4df0-841e-6c06a72e219e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 NetworkManager[51724]: <info>  [1759268431.2803] device (tap9f7a3c1e-00): carrier: link connected
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.286 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8322e9-aff0-4c48-9fc9-ead1580a2e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.314 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ebf44f-0e51-4c97-a7e6-8e9936266759]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f7a3c1e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:db:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512695, 'reachable_time': 39929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241426, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.337 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5d601a35-38f3-4401-ac45-0cf9838dde49]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:dba1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512695, 'tstamp': 512695}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241429, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.360 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6606c162-384f-4fc8-886c-6cd78722b873]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f7a3c1e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:db:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512695, 'reachable_time': 39929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241430, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.408 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e6cd3e0-05ca-4921-bcc7-7642c95ea006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.497 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5139a5b2-9633-4034-b162-6a2cc60a4f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.499 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f7a3c1e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.500 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.501 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f7a3c1e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:31 compute-1 NetworkManager[51724]: <info>  [1759268431.5061] manager: (tap9f7a3c1e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Sep 30 21:40:31 compute-1 kernel: tap9f7a3c1e-00: entered promiscuous mode
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.513 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f7a3c1e-00, col_values=(('external_ids', {'iface-id': 'b7e0b2bc-8210-4854-9253-4d8208499194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:31 compute-1 ovn_controller[94902]: 2025-09-30T21:40:31Z|00509|binding|INFO|Releasing lport b7e0b2bc-8210-4854-9253-4d8208499194 from this chassis (sb_readonly=0)
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.540 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.541 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9f859204-aebe-44cd-be61-0928bf50a89b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.542 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:40:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:31.542 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'env', 'PROCESS_TAG=haproxy-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.862 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 333cdbd1-23c9-422d-b896-f3c6b76d4130 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.864 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268431.8619268, 333cdbd1-23c9-422d-b896-f3c6b76d4130 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.865 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] VM Resumed (Lifecycle Event)
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.888 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.893 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.931 2 DEBUG nova.compute.manager [None req-9cca21e1-84df-4114-8bc2-f0e1657d6bd9 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.941 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] During sync_power_state the instance has a pending task (rescuing). Skip.
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.942 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268431.863128, 333cdbd1-23c9-422d-b896-f3c6b76d4130 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:31 compute-1 nova_compute[192795]: 2025-09-30 21:40:31.942 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] VM Started (Lifecycle Event)
Sep 30 21:40:31 compute-1 podman[241464]: 2025-09-30 21:40:31.979132395 +0000 UTC m=+0.076486490 container create 4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.015 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.025 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:32 compute-1 podman[241464]: 2025-09-30 21:40:31.933094952 +0000 UTC m=+0.030449037 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:40:32 compute-1 systemd[1]: Started libpod-conmon-4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d.scope.
Sep 30 21:40:32 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:40:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec50e5d9b3060211bc75a073657a991d13e69130363145df51d3a001de6d1825/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:40:32 compute-1 podman[241464]: 2025-09-30 21:40:32.116452882 +0000 UTC m=+0.213806977 container init 4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.120 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] During sync_power_state the instance has a pending task (rescuing). Skip.
Sep 30 21:40:32 compute-1 podman[241464]: 2025-09-30 21:40:32.128275698 +0000 UTC m=+0.225629763 container start 4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:40:32 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [NOTICE]   (241482) : New worker (241484) forked
Sep 30 21:40:32 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [NOTICE]   (241482) : Loading success.
Sep 30 21:40:32 compute-1 kernel: tapa210664d-c1 (unregistering): left promiscuous mode
Sep 30 21:40:32 compute-1 NetworkManager[51724]: <info>  [1759268432.4796] device (tapa210664d-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:40:32 compute-1 ovn_controller[94902]: 2025-09-30T21:40:32Z|00510|binding|INFO|Releasing lport a210664d-c1de-43b3-8844-2a8aedba5ac1 from this chassis (sb_readonly=0)
Sep 30 21:40:32 compute-1 ovn_controller[94902]: 2025-09-30T21:40:32Z|00511|binding|INFO|Setting lport a210664d-c1de-43b3-8844-2a8aedba5ac1 down in Southbound
Sep 30 21:40:32 compute-1 ovn_controller[94902]: 2025-09-30T21:40:32Z|00512|binding|INFO|Removing iface tapa210664d-c1 ovn-installed in OVS
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.498 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.500 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.502 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.502 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.503 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Sep 30 21:40:32 compute-1 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007b.scope: Consumed 17.424s CPU time.
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.559 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:0e:d5 10.100.0.4'], port_security=['fa:16:3e:2b:0e:d5 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2c2f6f5b-4955-4915-b620-f377ca649c75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51c02ace4fff44cca028986381d7c407', 'neutron:revision_number': '4', 'neutron:security_group_ids': '62973e02-a61c-4061-8a33-46cf1b8a21f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6f1e456-6208-4d0c-8c96-5e3a3d932af6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=a210664d-c1de-43b3-8844-2a8aedba5ac1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.561 103861 INFO neutron.agent.ovn.metadata.agent [-] Port a210664d-c1de-43b3-8844-2a8aedba5ac1 in datapath ea80450e-b8f2-4af5-a00d-9221e5dd4d97 unbound from our chassis
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.564 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea80450e-b8f2-4af5-a00d-9221e5dd4d97, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.565 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[25629c2f-d12b-46ce-b97d-a6c177c91b9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.566 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 namespace which is not needed anymore
Sep 30 21:40:32 compute-1 systemd-machined[152783]: Machine qemu-58-instance-0000007b terminated.
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.592 2 INFO nova.compute.manager [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Terminating instance
Sep 30 21:40:32 compute-1 podman[241494]: 2025-09-30 21:40:32.611009735 +0000 UTC m=+0.099016613 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, org.label-schema.build-date=20250923, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:40:32 compute-1 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[240327]: [NOTICE]   (240331) : haproxy version is 2.8.14-c23fe91
Sep 30 21:40:32 compute-1 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[240327]: [NOTICE]   (240331) : path to executable is /usr/sbin/haproxy
Sep 30 21:40:32 compute-1 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[240327]: [WARNING]  (240331) : Exiting Master process...
Sep 30 21:40:32 compute-1 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[240327]: [ALERT]    (240331) : Current worker (240333) exited with code 143 (Terminated)
Sep 30 21:40:32 compute-1 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[240327]: [WARNING]  (240331) : All workers exited. Exiting... (0)
Sep 30 21:40:32 compute-1 systemd[1]: libpod-cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11.scope: Deactivated successfully.
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.740 2 DEBUG nova.compute.manager [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:40:32 compute-1 podman[241535]: 2025-09-30 21:40:32.74228508 +0000 UTC m=+0.054947382 container died cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.761 2 INFO nova.virt.libvirt.driver [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Instance destroyed successfully.
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.762 2 DEBUG nova.objects.instance [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'resources' on Instance uuid 829530ac-b0fb-4e39-896e-f01e78306ff8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11-userdata-shm.mount: Deactivated successfully.
Sep 30 21:40:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-383daa8f2dee1d7869e27472a7694a20027adff6d6d0aac2fa030b90d3e10e2f-merged.mount: Deactivated successfully.
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.786 2 DEBUG nova.virt.libvirt.vif [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-591878727',display_name='tempest-Íñstáñcé-1706046391',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-591878727',id=133,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:40:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-utemcsfg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:40:30Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=829530ac-b0fb-4e39-896e-f01e78306ff8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.787 2 DEBUG nova.network.os_vif_util [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "address": "fa:16:3e:c6:aa:47", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ece52f5-eb", "ovs_interfaceid": "2ece52f5-eb6d-4f35-a922-5b5e099858af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.788 2 DEBUG nova.network.os_vif_util [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:aa:47,bridge_name='br-int',has_traffic_filtering=True,id=2ece52f5-eb6d-4f35-a922-5b5e099858af,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ece52f5-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.788 2 DEBUG os_vif [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:aa:47,bridge_name='br-int',has_traffic_filtering=True,id=2ece52f5-eb6d-4f35-a922-5b5e099858af,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ece52f5-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.791 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ece52f5-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 podman[241535]: 2025-09-30 21:40:32.80092792 +0000 UTC m=+0.113590242 container cleanup cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.805 2 INFO os_vif [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:aa:47,bridge_name='br-int',has_traffic_filtering=True,id=2ece52f5-eb6d-4f35-a922-5b5e099858af,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ece52f5-eb')
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.806 2 INFO nova.virt.libvirt.driver [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Deleting instance files /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8_del
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.807 2 INFO nova.virt.libvirt.driver [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Deletion of /var/lib/nova/instances/829530ac-b0fb-4e39-896e-f01e78306ff8_del complete
Sep 30 21:40:32 compute-1 systemd[1]: libpod-conmon-cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11.scope: Deactivated successfully.
Sep 30 21:40:32 compute-1 podman[241583]: 2025-09-30 21:40:32.871425038 +0000 UTC m=+0.049081025 container remove cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.877 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c822425b-32ec-49cd-ab9e-0e74ad34006f]: (4, ('Tue Sep 30 09:40:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 (cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11)\ncc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11\nTue Sep 30 09:40:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 (cc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11)\ncc12276affcecba00cf973b35ff74776b6b747d331c55708645d4cdd38840c11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.879 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5eab59f2-4e3c-4806-966e-fce12bc710ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.880 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea80450e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:32 compute-1 kernel: tapea80450e-b0: left promiscuous mode
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.906 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a37de880-00dd-4de3-b728-f17e453d2be7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.935 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[82d9588a-d8ce-47d1-bff2-ec458cf0c322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.937 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb2c347-eea7-4967-81b8-f329c085d76a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.961 2 INFO nova.compute.manager [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Took 0.22 seconds to destroy the instance on the hypervisor.
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.962 2 DEBUG oslo.service.loopingcall [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.962 2 DEBUG nova.compute.manager [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:40:32 compute-1 nova_compute[192795]: 2025-09-30 21:40:32.962 2 DEBUG nova.network.neutron [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.964 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f24027c1-b856-4c2a-b8e2-0646b4031722]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503223, 'reachable_time': 19106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241601, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:32 compute-1 systemd[1]: run-netns-ovnmeta\x2dea80450e\x2db8f2\x2d4af5\x2da00d\x2d9221e5dd4d97.mount: Deactivated successfully.
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.967 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:40:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:32.967 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[da83b60e-b3ad-4265-9bb6-4bb6a78332af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.256 2 INFO nova.virt.libvirt.driver [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance shutdown successfully after 3 seconds.
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.263 2 INFO nova.virt.libvirt.driver [-] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance destroyed successfully.
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.263 2 DEBUG nova.objects.instance [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2c2f6f5b-4955-4915-b620-f377ca649c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.438 2 DEBUG nova.compute.manager [req-2800d61a-3fb4-4fd8-a79c-3c57de783c93 req-d1fc4b56-4c6b-4aea-95cd-12d6bbb4bb7f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received event network-vif-unplugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.439 2 DEBUG oslo_concurrency.lockutils [req-2800d61a-3fb4-4fd8-a79c-3c57de783c93 req-d1fc4b56-4c6b-4aea-95cd-12d6bbb4bb7f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.439 2 DEBUG oslo_concurrency.lockutils [req-2800d61a-3fb4-4fd8-a79c-3c57de783c93 req-d1fc4b56-4c6b-4aea-95cd-12d6bbb4bb7f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.439 2 DEBUG oslo_concurrency.lockutils [req-2800d61a-3fb4-4fd8-a79c-3c57de783c93 req-d1fc4b56-4c6b-4aea-95cd-12d6bbb4bb7f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.439 2 DEBUG nova.compute.manager [req-2800d61a-3fb4-4fd8-a79c-3c57de783c93 req-d1fc4b56-4c6b-4aea-95cd-12d6bbb4bb7f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] No waiting events found dispatching network-vif-unplugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.440 2 WARNING nova.compute.manager [req-2800d61a-3fb4-4fd8-a79c-3c57de783c93 req-d1fc4b56-4c6b-4aea-95cd-12d6bbb4bb7f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received unexpected event network-vif-unplugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 for instance with vm_state active and task_state shelving.
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.491 2 DEBUG nova.compute.manager [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.491 2 DEBUG oslo_concurrency.lockutils [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.491 2 DEBUG oslo_concurrency.lockutils [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.492 2 DEBUG oslo_concurrency.lockutils [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.492 2 DEBUG nova.compute.manager [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] No waiting events found dispatching network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.492 2 WARNING nova.compute.manager [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received unexpected event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce for instance with vm_state rescued and task_state None.
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.492 2 DEBUG nova.compute.manager [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.492 2 DEBUG oslo_concurrency.lockutils [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.492 2 DEBUG oslo_concurrency.lockutils [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.493 2 DEBUG oslo_concurrency.lockutils [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.493 2 DEBUG nova.compute.manager [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] No waiting events found dispatching network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.493 2 WARNING nova.compute.manager [req-8057d48d-3533-435a-9faf-8f5f9f4861af req-6ad0b24c-651c-4c50-97fb-5ca713c49367 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received unexpected event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce for instance with vm_state rescued and task_state None.
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.621 2 INFO nova.virt.libvirt.driver [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Beginning cold snapshot process
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.965 2 DEBUG nova.network.neutron [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.973 2 DEBUG nova.privsep.utils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:40:33 compute-1 nova_compute[192795]: 2025-09-30 21:40:33.974 2 DEBUG oslo_concurrency.processutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk /var/lib/nova/instances/snapshots/tmpsfkn9kvm/089f56504cca493e810050075a6ae338 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.010 2 INFO nova.compute.manager [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Took 1.05 seconds to deallocate network for instance.
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.177 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.178 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.530 2 DEBUG nova.compute.provider_tree [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.565 2 DEBUG nova.scheduler.client.report [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.614 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.725 2 DEBUG oslo_concurrency.processutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75/disk /var/lib/nova/instances/snapshots/tmpsfkn9kvm/089f56504cca493e810050075a6ae338" returned: 0 in 0.750s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.725 2 INFO nova.virt.libvirt.driver [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Snapshot extracted, beginning image upload
Sep 30 21:40:34 compute-1 nova_compute[192795]: 2025-09-30 21:40:34.852 2 INFO nova.scheduler.client.report [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Deleted allocations for instance 829530ac-b0fb-4e39-896e-f01e78306ff8
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.221 2 DEBUG oslo_concurrency.lockutils [None req-6bd7ae34-734f-4810-be0c-a068ff8f64e6 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "829530ac-b0fb-4e39-896e-f01e78306ff8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.588 2 DEBUG nova.compute.manager [req-813513e3-7613-4c54-a502-e76b46a0b085 req-5d8d586f-7744-437e-bca9-e3905b7c7886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received event network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.588 2 DEBUG oslo_concurrency.lockutils [req-813513e3-7613-4c54-a502-e76b46a0b085 req-5d8d586f-7744-437e-bca9-e3905b7c7886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.589 2 DEBUG oslo_concurrency.lockutils [req-813513e3-7613-4c54-a502-e76b46a0b085 req-5d8d586f-7744-437e-bca9-e3905b7c7886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.589 2 DEBUG oslo_concurrency.lockutils [req-813513e3-7613-4c54-a502-e76b46a0b085 req-5d8d586f-7744-437e-bca9-e3905b7c7886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.589 2 DEBUG nova.compute.manager [req-813513e3-7613-4c54-a502-e76b46a0b085 req-5d8d586f-7744-437e-bca9-e3905b7c7886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] No waiting events found dispatching network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.589 2 WARNING nova.compute.manager [req-813513e3-7613-4c54-a502-e76b46a0b085 req-5d8d586f-7744-437e-bca9-e3905b7c7886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Received unexpected event network-vif-plugged-a210664d-c1de-43b3-8844-2a8aedba5ac1 for instance with vm_state active and task_state shelving_image_uploading.
Sep 30 21:40:35 compute-1 nova_compute[192795]: 2025-09-30 21:40:35.590 2 DEBUG nova.compute.manager [req-813513e3-7613-4c54-a502-e76b46a0b085 req-5d8d586f-7744-437e-bca9-e3905b7c7886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Received event network-vif-deleted-2ece52f5-eb6d-4f35-a922-5b5e099858af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.687 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.727 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid 628fd442-ed35-482c-91db-4a57f527b6a8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.727 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid 2c2f6f5b-4955-4915-b620-f377ca649c75 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.728 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "628fd442-ed35-482c-91db-4a57f527b6a8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "2c2f6f5b-4955-4915-b620-f377ca649c75" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.730 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.769 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "628fd442-ed35-482c-91db-4a57f527b6a8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.770 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.936 2 INFO nova.virt.libvirt.driver [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Snapshot image upload complete
Sep 30 21:40:37 compute-1 nova_compute[192795]: 2025-09-30 21:40:37.937 2 DEBUG nova.compute.manager [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.370 2 INFO nova.compute.manager [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Shelve offloading
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.408 2 INFO nova.virt.libvirt.driver [-] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance destroyed successfully.
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.409 2 DEBUG nova.compute.manager [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.411 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.412 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquired lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.412 2 DEBUG nova.network.neutron [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:38.699 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:38.700 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:38.701 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.854 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.856 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.856 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.857 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.857 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.921 2 INFO nova.compute.manager [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Terminating instance
Sep 30 21:40:38 compute-1 nova_compute[192795]: 2025-09-30 21:40:38.989 2 DEBUG nova.compute.manager [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:40:39 compute-1 kernel: tap6be1fbd6-57 (unregistering): left promiscuous mode
Sep 30 21:40:39 compute-1 NetworkManager[51724]: <info>  [1759268439.0388] device (tap6be1fbd6-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:40:39 compute-1 ovn_controller[94902]: 2025-09-30T21:40:39Z|00513|binding|INFO|Releasing lport 6be1fbd6-5748-4b3c-af41-2287770931ef from this chassis (sb_readonly=0)
Sep 30 21:40:39 compute-1 ovn_controller[94902]: 2025-09-30T21:40:39Z|00514|binding|INFO|Setting lport 6be1fbd6-5748-4b3c-af41-2287770931ef down in Southbound
Sep 30 21:40:39 compute-1 ovn_controller[94902]: 2025-09-30T21:40:39Z|00515|binding|INFO|Removing iface tap6be1fbd6-57 ovn-installed in OVS
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.061 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:23:75 10.100.0.5'], port_security=['fa:16:3e:c6:23:75 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '628fd442-ed35-482c-91db-4a57f527b6a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=6be1fbd6-5748-4b3c-af41-2287770931ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.062 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 6be1fbd6-5748-4b3c-af41-2287770931ef in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.064 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27086519-6f4c-45f9-8e5b-5b321cd6871c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.072 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[019ae23f-00d8-4958-ac99-f809e5e70380]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.073 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace which is not needed anymore
Sep 30 21:40:39 compute-1 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000076.scope: Deactivated successfully.
Sep 30 21:40:39 compute-1 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000076.scope: Consumed 19.324s CPU time.
Sep 30 21:40:39 compute-1 systemd-machined[152783]: Machine qemu-57-instance-00000076 terminated.
Sep 30 21:40:39 compute-1 podman[241619]: 2025-09-30 21:40:39.146530229 +0000 UTC m=+0.082943222 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:40:39 compute-1 podman[241618]: 2025-09-30 21:40:39.14734055 +0000 UTC m=+0.084353759 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:40:39 compute-1 podman[241615]: 2025-09-30 21:40:39.213829501 +0000 UTC m=+0.131570384 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Sep 30 21:40:39 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [NOTICE]   (239914) : haproxy version is 2.8.14-c23fe91
Sep 30 21:40:39 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [NOTICE]   (239914) : path to executable is /usr/sbin/haproxy
Sep 30 21:40:39 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [WARNING]  (239914) : Exiting Master process...
Sep 30 21:40:39 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [WARNING]  (239914) : Exiting Master process...
Sep 30 21:40:39 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [ALERT]    (239914) : Current worker (239916) exited with code 143 (Terminated)
Sep 30 21:40:39 compute-1 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239910]: [WARNING]  (239914) : All workers exited. Exiting... (0)
Sep 30 21:40:39 compute-1 systemd[1]: libpod-6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b.scope: Deactivated successfully.
Sep 30 21:40:39 compute-1 podman[241693]: 2025-09-30 21:40:39.271973558 +0000 UTC m=+0.069369278 container died 6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.273 2 INFO nova.virt.libvirt.driver [-] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Instance destroyed successfully.
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.274 2 DEBUG nova.objects.instance [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'resources' on Instance uuid 628fd442-ed35-482c-91db-4a57f527b6a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:39 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:40:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-69ad8829d099c2530ffca2473d4e69370e3090c06badb8d726547a2d2d2c19c9-merged.mount: Deactivated successfully.
Sep 30 21:40:39 compute-1 podman[241693]: 2025-09-30 21:40:39.307716805 +0000 UTC m=+0.105112545 container cleanup 6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:40:39 compute-1 systemd[1]: libpod-conmon-6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b.scope: Deactivated successfully.
Sep 30 21:40:39 compute-1 podman[241744]: 2025-09-30 21:40:39.378412348 +0000 UTC m=+0.043838865 container remove 6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.380 2 DEBUG nova.virt.libvirt.vif [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:38:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-2040743504',display_name='tempest-₡-2040743504',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--2040743504',id=118,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:38:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-deam0ef4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-782690373',own
er_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:38:13Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=628fd442-ed35-482c-91db-4a57f527b6a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.381 2 DEBUG nova.network.os_vif_util [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "6be1fbd6-5748-4b3c-af41-2287770931ef", "address": "fa:16:3e:c6:23:75", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6be1fbd6-57", "ovs_interfaceid": "6be1fbd6-5748-4b3c-af41-2287770931ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.382 2 DEBUG nova.network.os_vif_util [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c6:23:75,bridge_name='br-int',has_traffic_filtering=True,id=6be1fbd6-5748-4b3c-af41-2287770931ef,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6be1fbd6-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.382 2 DEBUG os_vif [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:23:75,bridge_name='br-int',has_traffic_filtering=True,id=6be1fbd6-5748-4b3c-af41-2287770931ef,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6be1fbd6-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.384 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6be1fbd6-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.394 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b184baa4-3d69-4356-82f3-4087f1034576]: (4, ('Tue Sep 30 09:40:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b)\n6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b\nTue Sep 30 09:40:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b)\n6e149eea435743fe6fc801b81d60ac85acbc8f42247fab8f72c62e7d75e4e62b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.429 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8f19b57f-55ca-4189-8112-f1149f2d8749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.430 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:39 compute-1 kernel: tap27086519-60: left promiscuous mode
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.440 2 INFO os_vif [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:23:75,bridge_name='br-int',has_traffic_filtering=True,id=6be1fbd6-5748-4b3c-af41-2287770931ef,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6be1fbd6-57')
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.441 2 INFO nova.virt.libvirt.driver [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Deleting instance files /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8_del
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.443 2 INFO nova.virt.libvirt.driver [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Deletion of /var/lib/nova/instances/628fd442-ed35-482c-91db-4a57f527b6a8_del complete
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.459 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c231b349-2326-4945-bea4-bb6951f1b5e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.491 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f1283e-d93a-4870-88df-0990bbbb1f8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.493 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f891f7a2-952c-4f1e-ae37-5a372b2a2268]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.502 2 DEBUG nova.compute.manager [req-e85c8782-deb2-45df-b8dd-a46783628945 req-2fab0e0e-065c-4e79-a24f-94d2d20e2fd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received event network-vif-unplugged-6be1fbd6-5748-4b3c-af41-2287770931ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.502 2 DEBUG oslo_concurrency.lockutils [req-e85c8782-deb2-45df-b8dd-a46783628945 req-2fab0e0e-065c-4e79-a24f-94d2d20e2fd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.503 2 DEBUG oslo_concurrency.lockutils [req-e85c8782-deb2-45df-b8dd-a46783628945 req-2fab0e0e-065c-4e79-a24f-94d2d20e2fd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.503 2 DEBUG oslo_concurrency.lockutils [req-e85c8782-deb2-45df-b8dd-a46783628945 req-2fab0e0e-065c-4e79-a24f-94d2d20e2fd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.503 2 DEBUG nova.compute.manager [req-e85c8782-deb2-45df-b8dd-a46783628945 req-2fab0e0e-065c-4e79-a24f-94d2d20e2fd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] No waiting events found dispatching network-vif-unplugged-6be1fbd6-5748-4b3c-af41-2287770931ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.504 2 DEBUG nova.compute.manager [req-e85c8782-deb2-45df-b8dd-a46783628945 req-2fab0e0e-065c-4e79-a24f-94d2d20e2fd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received event network-vif-unplugged-6be1fbd6-5748-4b3c-af41-2287770931ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.523 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[625eb239-bc64-4b36-96fe-48b01b8118d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498404, 'reachable_time': 30195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241764, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 systemd[1]: run-netns-ovnmeta\x2d27086519\x2d6f4c\x2d45f9\x2d8e5b\x2d5b321cd6871c.mount: Deactivated successfully.
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.529 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:40:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:39.529 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[188fe3ff-c90c-43b5-b120-0596a30113ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.586 2 INFO nova.compute.manager [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Took 0.60 seconds to destroy the instance on the hypervisor.
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.586 2 DEBUG oslo.service.loopingcall [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.587 2 DEBUG nova.compute.manager [-] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:40:39 compute-1 nova_compute[192795]: 2025-09-30 21:40:39.587 2 DEBUG nova.network.neutron [-] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:40:40 compute-1 nova_compute[192795]: 2025-09-30 21:40:40.420 2 DEBUG nova.network.neutron [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Updating instance_info_cache with network_info: [{"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:40 compute-1 nova_compute[192795]: 2025-09-30 21:40:40.480 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Releasing lock "refresh_cache-2c2f6f5b-4955-4915-b620-f377ca649c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:40 compute-1 nova_compute[192795]: 2025-09-30 21:40:40.904 2 DEBUG nova.network.neutron [-] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:40 compute-1 nova_compute[192795]: 2025-09-30 21:40:40.924 2 INFO nova.compute.manager [-] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Took 1.34 seconds to deallocate network for instance.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.007 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.008 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.008 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.009 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.009 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.014 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.014 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.025 2 INFO nova.compute.manager [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Terminating instance
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.039 2 DEBUG nova.compute.manager [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:40:41 compute-1 kernel: tap54f28900-8a (unregistering): left promiscuous mode
Sep 30 21:40:41 compute-1 NetworkManager[51724]: <info>  [1759268441.0774] device (tap54f28900-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 ovn_controller[94902]: 2025-09-30T21:40:41Z|00516|binding|INFO|Releasing lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce from this chassis (sb_readonly=0)
Sep 30 21:40:41 compute-1 ovn_controller[94902]: 2025-09-30T21:40:41Z|00517|binding|INFO|Setting lport 54f28900-8a59-4c2b-b1a3-a7b618a894ce down in Southbound
Sep 30 21:40:41 compute-1 ovn_controller[94902]: 2025-09-30T21:40:41Z|00518|binding|INFO|Removing iface tap54f28900-8a ovn-installed in OVS
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.105 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:df:08 10.100.0.12'], port_security=['fa:16:3e:27:df:08 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '333cdbd1-23c9-422d-b896-f3c6b76d4130', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7a32e3a5-ee38-4fae-9fbe-f0444b488d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3ff2921-853e-4756-b8d5-05a55aa79dbf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=54f28900-8a59-4c2b-b1a3-a7b618a894ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.107 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 54f28900-8a59-4c2b-b1a3-a7b618a894ce in datapath 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a unbound from our chassis
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.111 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.112 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3ddbc2cd-c067-45cc-b73e-016bad49fdeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.113 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a namespace which is not needed anymore
Sep 30 21:40:41 compute-1 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000084.scope: Deactivated successfully.
Sep 30 21:40:41 compute-1 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000084.scope: Consumed 9.902s CPU time.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.144 2 DEBUG nova.compute.provider_tree [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:41 compute-1 systemd-machined[152783]: Machine qemu-62-instance-00000084 terminated.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.176 2 DEBUG nova.scheduler.client.report [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.200 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.244 2 INFO nova.scheduler.client.report [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Deleted allocations for instance 628fd442-ed35-482c-91db-4a57f527b6a8
Sep 30 21:40:41 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [NOTICE]   (241482) : haproxy version is 2.8.14-c23fe91
Sep 30 21:40:41 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [NOTICE]   (241482) : path to executable is /usr/sbin/haproxy
Sep 30 21:40:41 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [WARNING]  (241482) : Exiting Master process...
Sep 30 21:40:41 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [WARNING]  (241482) : Exiting Master process...
Sep 30 21:40:41 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [ALERT]    (241482) : Current worker (241484) exited with code 143 (Terminated)
Sep 30 21:40:41 compute-1 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[241478]: [WARNING]  (241482) : All workers exited. Exiting... (0)
Sep 30 21:40:41 compute-1 NetworkManager[51724]: <info>  [1759268441.2679] manager: (tap54f28900-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Sep 30 21:40:41 compute-1 systemd[1]: libpod-4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d.scope: Deactivated successfully.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 podman[241790]: 2025-09-30 21:40:41.273567746 +0000 UTC m=+0.052520328 container died 4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d-userdata-shm.mount: Deactivated successfully.
Sep 30 21:40:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-ec50e5d9b3060211bc75a073657a991d13e69130363145df51d3a001de6d1825-merged.mount: Deactivated successfully.
Sep 30 21:40:41 compute-1 podman[241790]: 2025-09-30 21:40:41.313270938 +0000 UTC m=+0.092223500 container cleanup 4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:40:41 compute-1 systemd[1]: libpod-conmon-4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d.scope: Deactivated successfully.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.333 2 DEBUG oslo_concurrency.lockutils [None req-8d1fe027-536c-41a7-8659-e06699750231 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.346 2 INFO nova.virt.libvirt.driver [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Instance destroyed successfully.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.348 2 DEBUG nova.objects.instance [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'resources' on Instance uuid 333cdbd1-23c9-422d-b896-f3c6b76d4130 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.364 2 DEBUG nova.virt.libvirt.vif [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-442085525',display_name='tempest-ServerRescueNegativeTestJSON-server-442085525',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-442085525',id=132,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:40:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c29435f306af4eebb7d6cb5bb416037d',ramdisk_id='',reservation_id='r-4zqgi81x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-493519679',owner_user_name='tempest-ServerRescueNegativeTestJSON-493519679-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:40:32Z,user_data=None,user_id='a8e4a8454b4d4d049dde1e287a040dfb',uuid=333cdbd1-23c9-422d-b896-f3c6b76d4130,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.364 2 DEBUG nova.network.os_vif_util [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converting VIF {"id": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "address": "fa:16:3e:27:df:08", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54f28900-8a", "ovs_interfaceid": "54f28900-8a59-4c2b-b1a3-a7b618a894ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.366 2 DEBUG nova.network.os_vif_util [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.367 2 DEBUG os_vif [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.369 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54f28900-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.379 2 INFO os_vif [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:df:08,bridge_name='br-int',has_traffic_filtering=True,id=54f28900-8a59-4c2b-b1a3-a7b618a894ce,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54f28900-8a')
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.380 2 INFO nova.virt.libvirt.driver [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Deleting instance files /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130_del
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.381 2 INFO nova.virt.libvirt.driver [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Deletion of /var/lib/nova/instances/333cdbd1-23c9-422d-b896-f3c6b76d4130_del complete
Sep 30 21:40:41 compute-1 podman[241838]: 2025-09-30 21:40:41.387541707 +0000 UTC m=+0.047331167 container remove 4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.390 2 DEBUG nova.compute.manager [req-996c9cd4-5f0d-401f-8adc-e6359955289a req-ee37ab7a-6354-4d96-adfe-3211d1184547 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-unplugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.390 2 DEBUG oslo_concurrency.lockutils [req-996c9cd4-5f0d-401f-8adc-e6359955289a req-ee37ab7a-6354-4d96-adfe-3211d1184547 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.391 2 DEBUG oslo_concurrency.lockutils [req-996c9cd4-5f0d-401f-8adc-e6359955289a req-ee37ab7a-6354-4d96-adfe-3211d1184547 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.391 2 DEBUG oslo_concurrency.lockutils [req-996c9cd4-5f0d-401f-8adc-e6359955289a req-ee37ab7a-6354-4d96-adfe-3211d1184547 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.391 2 DEBUG nova.compute.manager [req-996c9cd4-5f0d-401f-8adc-e6359955289a req-ee37ab7a-6354-4d96-adfe-3211d1184547 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] No waiting events found dispatching network-vif-unplugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.391 2 DEBUG nova.compute.manager [req-996c9cd4-5f0d-401f-8adc-e6359955289a req-ee37ab7a-6354-4d96-adfe-3211d1184547 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-unplugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.396 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[15337717-1047-49de-98ed-c89233425874]: (4, ('Tue Sep 30 09:40:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a (4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d)\n4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d\nTue Sep 30 09:40:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a (4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d)\n4e7a827ecfc5334d9518c6769cf1742d63a2d266b5d113155e7be863ba3c015d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.398 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[954c291a-a312-408b-8d6d-d714a374ea88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.399 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f7a3c1e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 kernel: tap9f7a3c1e-00: left promiscuous mode
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.417 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f19309-1e1b-47de-b5b8-c94b27673eb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.453 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1e07f335-aae2-4dc6-83b5-5d7ab672994f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.454 2 INFO nova.compute.manager [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.455 2 DEBUG oslo.service.loopingcall [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.455 2 DEBUG nova.compute.manager [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.455 2 DEBUG nova.network.neutron [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.455 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[29d62715-58de-451a-868b-96c102512760]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.475 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[558f3514-d0e4-4546-8647-6a37f9e45b5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512684, 'reachable_time': 30730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241861, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 systemd[1]: run-netns-ovnmeta\x2d9f7a3c1e\x2d01ae\x2d4ec3\x2da5e2\x2d23ef8435e53a.mount: Deactivated successfully.
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.480 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:40:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:40:41.480 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a291e916-39c7-4731-8d4e-aa28e056efee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.614 2 DEBUG nova.compute.manager [req-27f9655c-8b49-4320-abfe-a90648738265 req-356a6c52-18b4-4294-9e45-299a61bf4630 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received event network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.615 2 DEBUG oslo_concurrency.lockutils [req-27f9655c-8b49-4320-abfe-a90648738265 req-356a6c52-18b4-4294-9e45-299a61bf4630 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.615 2 DEBUG oslo_concurrency.lockutils [req-27f9655c-8b49-4320-abfe-a90648738265 req-356a6c52-18b4-4294-9e45-299a61bf4630 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.615 2 DEBUG oslo_concurrency.lockutils [req-27f9655c-8b49-4320-abfe-a90648738265 req-356a6c52-18b4-4294-9e45-299a61bf4630 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "628fd442-ed35-482c-91db-4a57f527b6a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.615 2 DEBUG nova.compute.manager [req-27f9655c-8b49-4320-abfe-a90648738265 req-356a6c52-18b4-4294-9e45-299a61bf4630 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] No waiting events found dispatching network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.616 2 WARNING nova.compute.manager [req-27f9655c-8b49-4320-abfe-a90648738265 req-356a6c52-18b4-4294-9e45-299a61bf4630 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received unexpected event network-vif-plugged-6be1fbd6-5748-4b3c-af41-2287770931ef for instance with vm_state deleted and task_state None.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.616 2 DEBUG nova.compute.manager [req-27f9655c-8b49-4320-abfe-a90648738265 req-356a6c52-18b4-4294-9e45-299a61bf4630 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Received event network-vif-deleted-6be1fbd6-5748-4b3c-af41-2287770931ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.786 2 INFO nova.virt.libvirt.driver [-] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Instance destroyed successfully.
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.787 2 DEBUG nova.objects.instance [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'resources' on Instance uuid 2c2f6f5b-4955-4915-b620-f377ca649c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.805 2 DEBUG nova.virt.libvirt.vif [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:38:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1599075230',display_name='tempest-ServersNegativeTestJSON-server-1599075230',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1599075230',id=123,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:38:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='51c02ace4fff44cca028986381d7c407',ramdisk_id='',reservation_id='r-qtczxsxc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-2038911265',owner_user_name='tempest-ServersNegativeTestJSON-2038911265-project-member',shelved_at='2025-09-30T21:40:37.936971',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='b2cb7b2e-b6c9-4a27-93e3-da1a9998981b'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:40:34Z,user_data=None,user_id='972edc166c9442b1a83983d15a64e8b6',uuid=2c2f6f5b-4955-4915-b620-f377ca649c75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.806 2 DEBUG nova.network.os_vif_util [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converting VIF {"id": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "address": "fa:16:3e:2b:0e:d5", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa210664d-c1", "ovs_interfaceid": "a210664d-c1de-43b3-8844-2a8aedba5ac1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.808 2 DEBUG nova.network.os_vif_util [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:0e:d5,bridge_name='br-int',has_traffic_filtering=True,id=a210664d-c1de-43b3-8844-2a8aedba5ac1,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa210664d-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.808 2 DEBUG os_vif [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:0e:d5,bridge_name='br-int',has_traffic_filtering=True,id=a210664d-c1de-43b3-8844-2a8aedba5ac1,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa210664d-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.812 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa210664d-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.821 2 INFO os_vif [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:0e:d5,bridge_name='br-int',has_traffic_filtering=True,id=a210664d-c1de-43b3-8844-2a8aedba5ac1,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa210664d-c1')
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.822 2 INFO nova.virt.libvirt.driver [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Deleting instance files /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75_del
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.827 2 INFO nova.virt.libvirt.driver [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Deletion of /var/lib/nova/instances/2c2f6f5b-4955-4915-b620-f377ca649c75_del complete
Sep 30 21:40:41 compute-1 nova_compute[192795]: 2025-09-30 21:40:41.969 2 INFO nova.scheduler.client.report [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Deleted allocations for instance 2c2f6f5b-4955-4915-b620-f377ca649c75
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.078 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.079 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.143 2 DEBUG nova.compute.provider_tree [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.170 2 DEBUG nova.scheduler.client.report [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.205 2 DEBUG nova.network.neutron [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.208 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.229 2 INFO nova.compute.manager [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Took 0.77 seconds to deallocate network for instance.
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.291 2 DEBUG oslo_concurrency.lockutils [None req-c972670e-e03f-48f6-a2a3-7c3405e6e158 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 12.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.292 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 4.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.292 2 INFO nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] During sync_power_state the instance has a pending task (shelving_image_uploading). Skip.
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.292 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "2c2f6f5b-4955-4915-b620-f377ca649c75" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.331 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.332 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.368 2 DEBUG nova.compute.provider_tree [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.382 2 DEBUG nova.scheduler.client.report [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.418 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.475 2 INFO nova.scheduler.client.report [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Deleted allocations for instance 333cdbd1-23c9-422d-b896-f3c6b76d4130
Sep 30 21:40:42 compute-1 nova_compute[192795]: 2025-09-30 21:40:42.603 2 DEBUG oslo_concurrency.lockutils [None req-f1de1d20-dec1-43c0-8a35-032a3fc5c0f0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.522 2 DEBUG nova.compute.manager [req-8974d932-6d7e-4ad1-8ee1-45faa3401b02 req-67d91899-5e07-4cf4-bd09-be15f1a58354 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.523 2 DEBUG oslo_concurrency.lockutils [req-8974d932-6d7e-4ad1-8ee1-45faa3401b02 req-67d91899-5e07-4cf4-bd09-be15f1a58354 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.523 2 DEBUG oslo_concurrency.lockutils [req-8974d932-6d7e-4ad1-8ee1-45faa3401b02 req-67d91899-5e07-4cf4-bd09-be15f1a58354 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.523 2 DEBUG oslo_concurrency.lockutils [req-8974d932-6d7e-4ad1-8ee1-45faa3401b02 req-67d91899-5e07-4cf4-bd09-be15f1a58354 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "333cdbd1-23c9-422d-b896-f3c6b76d4130-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.523 2 DEBUG nova.compute.manager [req-8974d932-6d7e-4ad1-8ee1-45faa3401b02 req-67d91899-5e07-4cf4-bd09-be15f1a58354 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] No waiting events found dispatching network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.523 2 WARNING nova.compute.manager [req-8974d932-6d7e-4ad1-8ee1-45faa3401b02 req-67d91899-5e07-4cf4-bd09-be15f1a58354 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received unexpected event network-vif-plugged-54f28900-8a59-4c2b-b1a3-a7b618a894ce for instance with vm_state deleted and task_state None.
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.536 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268428.5360203, 829530ac-b0fb-4e39-896e-f01e78306ff8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.536 2 INFO nova.compute.manager [-] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] VM Stopped (Lifecycle Event)
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.587 2 DEBUG nova.compute.manager [None req-8839c826-eeb9-4bd3-8b5a-fbc8ce6e7752 - - - - - -] [instance: 829530ac-b0fb-4e39-896e-f01e78306ff8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:43 compute-1 nova_compute[192795]: 2025-09-30 21:40:43.798 2 DEBUG nova.compute.manager [req-b8d9134a-e551-4ed1-8b57-4af3b3ff30c5 req-5472dec7-ca37-424c-b21d-1f1b88409227 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Received event network-vif-deleted-54f28900-8a59-4c2b-b1a3-a7b618a894ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.029 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:40:44.029 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:45 compute-1 nova_compute[192795]: 2025-09-30 21:40:45.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:46 compute-1 nova_compute[192795]: 2025-09-30 21:40:46.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:47 compute-1 nova_compute[192795]: 2025-09-30 21:40:47.761 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268432.7599213, 2c2f6f5b-4955-4915-b620-f377ca649c75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:47 compute-1 nova_compute[192795]: 2025-09-30 21:40:47.762 2 INFO nova.compute.manager [-] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] VM Stopped (Lifecycle Event)
Sep 30 21:40:47 compute-1 nova_compute[192795]: 2025-09-30 21:40:47.801 2 DEBUG nova.compute.manager [None req-8abd79d5-fa3e-4a69-9675-9d64dbf67e18 - - - - - -] [instance: 2c2f6f5b-4955-4915-b620-f377ca649c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:48 compute-1 nova_compute[192795]: 2025-09-30 21:40:48.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:50 compute-1 podman[241862]: 2025-09-30 21:40:50.227201849 +0000 UTC m=+0.059129274 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:40:51 compute-1 nova_compute[192795]: 2025-09-30 21:40:51.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.736 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.736 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.737 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.931 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.932 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5626MB free_disk=73.31681060791016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.933 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:53 compute-1 nova_compute[192795]: 2025-09-30 21:40:53.933 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.050 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.051 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.072 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.150 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.204 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.205 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.272 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268439.2711492, 628fd442-ed35-482c-91db-4a57f527b6a8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.273 2 INFO nova.compute.manager [-] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] VM Stopped (Lifecycle Event)
Sep 30 21:40:54 compute-1 nova_compute[192795]: 2025-09-30 21:40:54.299 2 DEBUG nova.compute.manager [None req-a38c4192-4296-4aab-a93e-023ad2ecb573 - - - - - -] [instance: 628fd442-ed35-482c-91db-4a57f527b6a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:56 compute-1 nova_compute[192795]: 2025-09-30 21:40:56.206 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:56 compute-1 nova_compute[192795]: 2025-09-30 21:40:56.345 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268441.3435302, 333cdbd1-23c9-422d-b896-f3c6b76d4130 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:56 compute-1 nova_compute[192795]: 2025-09-30 21:40:56.345 2 INFO nova.compute.manager [-] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] VM Stopped (Lifecycle Event)
Sep 30 21:40:56 compute-1 nova_compute[192795]: 2025-09-30 21:40:56.459 2 DEBUG nova.compute.manager [None req-3b55157e-21c9-403d-ab64-fc5ad88703aa - - - - - -] [instance: 333cdbd1-23c9-422d-b896-f3c6b76d4130] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:56 compute-1 nova_compute[192795]: 2025-09-30 21:40:56.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:56 compute-1 nova_compute[192795]: 2025-09-30 21:40:56.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:57 compute-1 nova_compute[192795]: 2025-09-30 21:40:57.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:57 compute-1 nova_compute[192795]: 2025-09-30 21:40:57.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:40:58 compute-1 podman[241885]: 2025-09-30 21:40:58.224611639 +0000 UTC m=+0.058636971 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:40:58 compute-1 podman[241883]: 2025-09-30 21:40:58.252895587 +0000 UTC m=+0.093461654 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:40:58 compute-1 podman[241884]: 2025-09-30 21:40:58.258181788 +0000 UTC m=+0.093614777 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:40:58 compute-1 nova_compute[192795]: 2025-09-30 21:40:58.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:00 compute-1 nova_compute[192795]: 2025-09-30 21:41:00.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:01 compute-1 nova_compute[192795]: 2025-09-30 21:41:01.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:01 compute-1 nova_compute[192795]: 2025-09-30 21:41:01.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:03 compute-1 podman[241951]: 2025-09-30 21:41:03.221214274 +0000 UTC m=+0.062583526 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Sep 30 21:41:03 compute-1 nova_compute[192795]: 2025-09-30 21:41:03.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:05.142 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:41:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:05.143 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:41:05 compute-1 nova_compute[192795]: 2025-09-30 21:41:05.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:05 compute-1 nova_compute[192795]: 2025-09-30 21:41:05.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:06 compute-1 nova_compute[192795]: 2025-09-30 21:41:06.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:06 compute-1 nova_compute[192795]: 2025-09-30 21:41:06.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:41:06 compute-1 nova_compute[192795]: 2025-09-30 21:41:06.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:41:06 compute-1 nova_compute[192795]: 2025-09-30 21:41:06.721 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:41:06 compute-1 nova_compute[192795]: 2025-09-30 21:41:06.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:08.145 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:08 compute-1 nova_compute[192795]: 2025-09-30 21:41:08.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:10 compute-1 podman[241975]: 2025-09-30 21:41:10.225754009 +0000 UTC m=+0.056863704 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:41:10 compute-1 podman[241974]: 2025-09-30 21:41:10.234550444 +0000 UTC m=+0.067822586 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:41:10 compute-1 podman[241973]: 2025-09-30 21:41:10.261312121 +0000 UTC m=+0.086825626 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal)
Sep 30 21:41:11 compute-1 nova_compute[192795]: 2025-09-30 21:41:11.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.651 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.651 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.681 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.681 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.682 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.715 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.802 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.803 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.821 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.822 2 INFO nova.compute.claims [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.829 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.942 2 DEBUG nova.compute.provider_tree [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.962 2 DEBUG nova.scheduler.client.report [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.998 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:13 compute-1 nova_compute[192795]: 2025-09-30 21:41:13.999 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.002 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.008 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.009 2 INFO nova.compute.claims [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.092 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.093 2 DEBUG nova.network.neutron [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.129 2 INFO nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.159 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.187 2 DEBUG nova.compute.provider_tree [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.234 2 DEBUG nova.scheduler.client.report [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.266 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.266 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.339 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.339 2 DEBUG nova.network.neutron [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.344 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.344 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.345 2 INFO nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Creating image(s)
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.345 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "/var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.346 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "/var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.346 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "/var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.358 2 INFO nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.359 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.379 2 DEBUG nova.policy [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab5d15c0f90247089dbbed4ea5c454da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f624bcd0e96f4a22879c80070e2afe4e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.416 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.420 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.420 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.421 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.431 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.486 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.487 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.527 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.528 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.529 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.584 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.585 2 DEBUG nova.virt.disk.api [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Checking if we can resize image /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.585 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.605 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.607 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.607 2 INFO nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Creating image(s)
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.608 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "/var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.608 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.609 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.621 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.643 2 DEBUG nova.policy [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27859618cb1d493cb1531af26b200b92', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '043721d1d0a2480fa785367fa56c1fa4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.646 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.647 2 DEBUG nova.virt.disk.api [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Cannot resize image /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.647 2 DEBUG nova.objects.instance [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lazy-loading 'migration_context' on Instance uuid c2cfd403-7d78-4290-9c7c-682f8bac568e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.665 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.665 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Ensure instance console log exists: /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.665 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.666 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.666 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.678 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.678 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.679 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.689 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.745 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.747 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.787 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.788 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.789 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.856 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.858 2 DEBUG nova.virt.disk.api [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Checking if we can resize image /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.858 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.922 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.924 2 DEBUG nova.virt.disk.api [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Cannot resize image /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.925 2 DEBUG nova.objects.instance [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b552e93-79d1-422a-bfb2-240c6d2e3378 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.948 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.949 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Ensure instance console log exists: /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.950 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.950 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:14 compute-1 nova_compute[192795]: 2025-09-30 21:41:14.951 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.039 2 DEBUG nova.network.neutron [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Successfully created port: fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.870 2 DEBUG nova.network.neutron [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Successfully updated port: fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.915 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "refresh_cache-c2cfd403-7d78-4290-9c7c-682f8bac568e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.915 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquired lock "refresh_cache-c2cfd403-7d78-4290-9c7c-682f8bac568e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.916 2 DEBUG nova.network.neutron [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.966 2 DEBUG nova.compute.manager [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received event network-changed-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.966 2 DEBUG nova.compute.manager [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Refreshing instance network info cache due to event network-changed-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:41:15 compute-1 nova_compute[192795]: 2025-09-30 21:41:15.967 2 DEBUG oslo_concurrency.lockutils [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c2cfd403-7d78-4290-9c7c-682f8bac568e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:41:16 compute-1 nova_compute[192795]: 2025-09-30 21:41:16.011 2 DEBUG nova.network.neutron [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Successfully created port: b4e459d6-f285-4155-91ff-3beefa9c4d81 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:41:16 compute-1 nova_compute[192795]: 2025-09-30 21:41:16.246 2 DEBUG nova.network.neutron [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:41:16 compute-1 nova_compute[192795]: 2025-09-30 21:41:16.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.203 2 DEBUG nova.network.neutron [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Updating instance_info_cache with network_info: [{"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.260 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Releasing lock "refresh_cache-c2cfd403-7d78-4290-9c7c-682f8bac568e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.261 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Instance network_info: |[{"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.261 2 DEBUG oslo_concurrency.lockutils [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c2cfd403-7d78-4290-9c7c-682f8bac568e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.261 2 DEBUG nova.network.neutron [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Refreshing network info cache for port fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.264 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Start _get_guest_xml network_info=[{"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.270 2 WARNING nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.277 2 DEBUG nova.virt.libvirt.host [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.277 2 DEBUG nova.virt.libvirt.host [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.283 2 DEBUG nova.virt.libvirt.host [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.283 2 DEBUG nova.virt.libvirt.host [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.285 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.285 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.285 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.285 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.286 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.286 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.286 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.286 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.287 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.287 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.287 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.287 2 DEBUG nova.virt.hardware [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.291 2 DEBUG nova.virt.libvirt.vif [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1070040306',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1070040306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1070040306',id=135,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f624bcd0e96f4a22879c80070e2afe4e',ramdisk_id='',reservation_id='r-24mw1sup',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-490713827',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-490713827-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:41:14Z,user_data=None,user_id='ab5d15c0f90247089dbbed4ea5c454da',uuid=c2cfd403-7d78-4290-9c7c-682f8bac568e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.292 2 DEBUG nova.network.os_vif_util [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Converting VIF {"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.292 2 DEBUG nova.network.os_vif_util [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:df:6d,bridge_name='br-int',has_traffic_filtering=True,id=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1,network=Network(773a5870-06c0-44a8-97b7-d3402370fd46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf3d8c5-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.293 2 DEBUG nova.objects.instance [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lazy-loading 'pci_devices' on Instance uuid c2cfd403-7d78-4290-9c7c-682f8bac568e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.309 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <uuid>c2cfd403-7d78-4290-9c7c-682f8bac568e</uuid>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <name>instance-00000087</name>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-1070040306</nova:name>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:41:17</nova:creationTime>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:user uuid="ab5d15c0f90247089dbbed4ea5c454da">tempest-ServersNegativeTestMultiTenantJSON-490713827-project-member</nova:user>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:project uuid="f624bcd0e96f4a22879c80070e2afe4e">tempest-ServersNegativeTestMultiTenantJSON-490713827</nova:project>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         <nova:port uuid="fdf3d8c5-ab5c-437f-9233-8eec99d30bc1">
Sep 30 21:41:17 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <system>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <entry name="serial">c2cfd403-7d78-4290-9c7c-682f8bac568e</entry>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <entry name="uuid">c2cfd403-7d78-4290-9c7c-682f8bac568e</entry>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </system>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <os>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   </os>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <features>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   </features>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk.config"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:b9:df:6d"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <target dev="tapfdf3d8c5-ab"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/console.log" append="off"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <video>
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </video>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:41:17 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:41:17 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:41:17 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:41:17 compute-1 nova_compute[192795]: </domain>
Sep 30 21:41:17 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.311 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Preparing to wait for external event network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.311 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.311 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.311 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.312 2 DEBUG nova.virt.libvirt.vif [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1070040306',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1070040306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1070040306',id=135,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f624bcd0e96f4a22879c80070e2afe4e',ramdisk_id='',reservation_id='r-24mw1sup',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-490713827',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-490713827-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:41:14Z,user_data=None,user_id='ab5d15c0f90247089dbbed4ea5c454da',uuid=c2cfd403-7d78-4290-9c7c-682f8bac568e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.312 2 DEBUG nova.network.os_vif_util [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Converting VIF {"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.313 2 DEBUG nova.network.os_vif_util [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:df:6d,bridge_name='br-int',has_traffic_filtering=True,id=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1,network=Network(773a5870-06c0-44a8-97b7-d3402370fd46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf3d8c5-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.313 2 DEBUG os_vif [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:df:6d,bridge_name='br-int',has_traffic_filtering=True,id=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1,network=Network(773a5870-06c0-44a8-97b7-d3402370fd46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf3d8c5-ab') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.314 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.315 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.318 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfdf3d8c5-ab, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.319 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfdf3d8c5-ab, col_values=(('external_ids', {'iface-id': 'fdf3d8c5-ab5c-437f-9233-8eec99d30bc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:df:6d', 'vm-uuid': 'c2cfd403-7d78-4290-9c7c-682f8bac568e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:41:17 compute-1 NetworkManager[51724]: <info>  [1759268477.3232] manager: (tapfdf3d8c5-ab): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.329 2 INFO os_vif [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:df:6d,bridge_name='br-int',has_traffic_filtering=True,id=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1,network=Network(773a5870-06c0-44a8-97b7-d3402370fd46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf3d8c5-ab')
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.375 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.376 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.376 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] No VIF found with MAC fa:16:3e:b9:df:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.377 2 INFO nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Using config drive
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.478 2 DEBUG nova.network.neutron [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Successfully updated port: b4e459d6-f285-4155-91ff-3beefa9c4d81 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.493 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.493 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquired lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.493 2 DEBUG nova.network.neutron [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.603 2 DEBUG nova.compute.manager [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-changed-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.603 2 DEBUG nova.compute.manager [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Refreshing instance network info cache due to event network-changed-b4e459d6-f285-4155-91ff-3beefa9c4d81. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.603 2 DEBUG oslo_concurrency.lockutils [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.719 2 DEBUG nova.network.neutron [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.942 2 INFO nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Creating config drive at /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk.config
Sep 30 21:41:17 compute-1 nova_compute[192795]: 2025-09-30 21:41:17.949 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2f50n0vu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.080 2 DEBUG oslo_concurrency.processutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2f50n0vu" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:18 compute-1 kernel: tapfdf3d8c5-ab: entered promiscuous mode
Sep 30 21:41:18 compute-1 NetworkManager[51724]: <info>  [1759268478.1643] manager: (tapfdf3d8c5-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Sep 30 21:41:18 compute-1 ovn_controller[94902]: 2025-09-30T21:41:18Z|00519|binding|INFO|Claiming lport fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 for this chassis.
Sep 30 21:41:18 compute-1 ovn_controller[94902]: 2025-09-30T21:41:18Z|00520|binding|INFO|fdf3d8c5-ab5c-437f-9233-8eec99d30bc1: Claiming fa:16:3e:b9:df:6d 10.100.0.13
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.178 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:df:6d 10.100.0.13'], port_security=['fa:16:3e:b9:df:6d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c2cfd403-7d78-4290-9c7c-682f8bac568e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-773a5870-06c0-44a8-97b7-d3402370fd46', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f624bcd0e96f4a22879c80070e2afe4e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30e8623e-093c-4248-9876-ba507ac6ac9f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9303632d-e068-4c79-add1-d0768573add5, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.179 103861 INFO neutron.agent.ovn.metadata.agent [-] Port fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 in datapath 773a5870-06c0-44a8-97b7-d3402370fd46 bound to our chassis
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.181 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 773a5870-06c0-44a8-97b7-d3402370fd46
Sep 30 21:41:18 compute-1 systemd-udevd[242083]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.197 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0e022ac1-7006-4c1f-a417-e00a4c4867e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.198 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap773a5870-01 in ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.201 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap773a5870-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.201 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f49112-0368-411d-94c8-9dd9a2f82dc5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.202 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b101f13d-e6b8-42fd-9e45-c4ccf14fd286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 NetworkManager[51724]: <info>  [1759268478.2155] device (tapfdf3d8c5-ab): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:41:18 compute-1 systemd-machined[152783]: New machine qemu-63-instance-00000087.
Sep 30 21:41:18 compute-1 NetworkManager[51724]: <info>  [1759268478.2173] device (tapfdf3d8c5-ab): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.220 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[902fcf1c-d643-42c0-80e3-5577cecc587d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 ovn_controller[94902]: 2025-09-30T21:41:18Z|00521|binding|INFO|Setting lport fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 ovn-installed in OVS
Sep 30 21:41:18 compute-1 ovn_controller[94902]: 2025-09-30T21:41:18Z|00522|binding|INFO|Setting lport fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 up in Southbound
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 systemd[1]: Started Virtual Machine qemu-63-instance-00000087.
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.248 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7d172d73-17e7-446a-875c-a2b61610d7e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.283 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4a347ca8-0d54-471c-820b-578a4999e893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.290 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1d2c4289-c097-4126-bd70-fe424d7a251b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 NetworkManager[51724]: <info>  [1759268478.2931] manager: (tap773a5870-00): new Veth device (/org/freedesktop/NetworkManager/Devices/256)
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.334 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4ce49f7d-6763-40c5-885c-ce8e61045eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.340 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e7985e8f-1e5c-4270-99b3-eed27bd5ff83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 NetworkManager[51724]: <info>  [1759268478.3632] device (tap773a5870-00): carrier: link connected
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.371 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5d999496-360a-451f-8bac-320dc192fe37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.393 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[62a5a615-ff71-4482-9fe4-d6b7bd6c7cf7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap773a5870-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:8f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517403, 'reachable_time': 44938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242117, 'error': None, 'target': 'ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.410 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d1380496-0e8e-4756-b70c-8cace4ed88cb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:8fda'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517403, 'tstamp': 517403}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242118, 'error': None, 'target': 'ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.429 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[24d8e634-2474-4244-9981-1213aa4566d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap773a5870-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:8f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517403, 'reachable_time': 44938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242119, 'error': None, 'target': 'ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.456 2 DEBUG nova.compute.manager [req-bd2017d8-c556-478c-b524-790987ff8560 req-11e964e9-7fa7-444f-bf06-33867d113875 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received event network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.458 2 DEBUG oslo_concurrency.lockutils [req-bd2017d8-c556-478c-b524-790987ff8560 req-11e964e9-7fa7-444f-bf06-33867d113875 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.459 2 DEBUG oslo_concurrency.lockutils [req-bd2017d8-c556-478c-b524-790987ff8560 req-11e964e9-7fa7-444f-bf06-33867d113875 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.459 2 DEBUG oslo_concurrency.lockutils [req-bd2017d8-c556-478c-b524-790987ff8560 req-11e964e9-7fa7-444f-bf06-33867d113875 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.459 2 DEBUG nova.compute.manager [req-bd2017d8-c556-478c-b524-790987ff8560 req-11e964e9-7fa7-444f-bf06-33867d113875 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Processing event network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.465 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[927b194e-767d-4634-9552-a739a18ebedb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.529 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[eee09854-5845-4fed-8852-8a3665294bc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.532 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap773a5870-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.532 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.533 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap773a5870-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 kernel: tap773a5870-00: entered promiscuous mode
Sep 30 21:41:18 compute-1 NetworkManager[51724]: <info>  [1759268478.5373] manager: (tap773a5870-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.544 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap773a5870-00, col_values=(('external_ids', {'iface-id': 'aa6f94e4-e8ea-492a-95c7-d79515929ef7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 ovn_controller[94902]: 2025-09-30T21:41:18Z|00523|binding|INFO|Releasing lport aa6f94e4-e8ea-492a-95c7-d79515929ef7 from this chassis (sb_readonly=0)
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.551 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/773a5870-06c0-44a8-97b7-d3402370fd46.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/773a5870-06c0-44a8-97b7-d3402370fd46.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.552 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d99580ba-0572-4301-a5a0-135c22b25dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.554 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-773a5870-06c0-44a8-97b7-d3402370fd46
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/773a5870-06c0-44a8-97b7-d3402370fd46.pid.haproxy
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 773a5870-06c0-44a8-97b7-d3402370fd46
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:41:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:18.555 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46', 'env', 'PROCESS_TAG=haproxy-773a5870-06c0-44a8-97b7-d3402370fd46', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/773a5870-06c0-44a8-97b7-d3402370fd46.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.664 2 DEBUG nova.network.neutron [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Updated VIF entry in instance network info cache for port fdf3d8c5-ab5c-437f-9233-8eec99d30bc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.665 2 DEBUG nova.network.neutron [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Updating instance_info_cache with network_info: [{"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.786 2 DEBUG oslo_concurrency.lockutils [req-847b82f4-c6c4-4420-8c99-0334413e115d req-f22806f7-13d2-4984-9c6d-c3a1c6c1a561 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c2cfd403-7d78-4290-9c7c-682f8bac568e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.921 2 DEBUG nova.network.neutron [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updating instance_info_cache with network_info: [{"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:18 compute-1 podman[242151]: 2025-09-30 21:41:18.93461575 +0000 UTC m=+0.051787078 container create aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:41:18 compute-1 systemd[1]: Started libpod-conmon-aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60.scope.
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.993 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Releasing lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.993 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Instance network_info: |[{"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.993 2 DEBUG oslo_concurrency.lockutils [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.994 2 DEBUG nova.network.neutron [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Refreshing network info cache for port b4e459d6-f285-4155-91ff-3beefa9c4d81 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:41:18 compute-1 nova_compute[192795]: 2025-09-30 21:41:18.996 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Start _get_guest_xml network_info=[{"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:41:18 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:41:19 compute-1 podman[242151]: 2025-09-30 21:41:18.906278071 +0000 UTC m=+0.023449409 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:41:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a110c24050cfca27f9edf17ae086ac18b998872ecece546f7545fdddc3d417a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.007 2 WARNING nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.011 2 DEBUG nova.virt.libvirt.host [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.012 2 DEBUG nova.virt.libvirt.host [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:41:19 compute-1 podman[242151]: 2025-09-30 21:41:19.014876919 +0000 UTC m=+0.132048267 container init aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:41:19 compute-1 podman[242151]: 2025-09-30 21:41:19.020102629 +0000 UTC m=+0.137273957 container start aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.024 2 DEBUG nova.virt.libvirt.host [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.025 2 DEBUG nova.virt.libvirt.host [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.026 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.026 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.026 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.027 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.027 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.027 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.027 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.027 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.027 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.028 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.028 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.028 2 DEBUG nova.virt.hardware [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.032 2 DEBUG nova.virt.libvirt.vif [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-686147264',display_name='tempest-TestNetworkBasicOps-server-686147264',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-686147264',id=136,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEuk2BbMitNL/zl9ejInjo2O7BfBcgbeg+gE3A9WVPCfG1vcgqZD0WrcCCmZv6Y26Pwh2RY3dJIxtGKQOrNQzrhkhAVqiqHE8YzJ79HbcB5BIIpxWHIdSv5PoHu9Pzxqtg==',key_name='tempest-TestNetworkBasicOps-336137018',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-ljqt4ziw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:41:14Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=3b552e93-79d1-422a-bfb2-240c6d2e3378,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.032 2 DEBUG nova.network.os_vif_util [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.033 2 DEBUG nova.network.os_vif_util [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:96:08,bridge_name='br-int',has_traffic_filtering=True,id=b4e459d6-f285-4155-91ff-3beefa9c4d81,network=Network(5860c6c2-5aaa-4201-b9cb-964d0ec94e1c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4e459d6-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.034 2 DEBUG nova.objects.instance [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b552e93-79d1-422a-bfb2-240c6d2e3378 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:19 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [NOTICE]   (242178) : New worker (242180) forked
Sep 30 21:41:19 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [NOTICE]   (242178) : Loading success.
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.063 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <uuid>3b552e93-79d1-422a-bfb2-240c6d2e3378</uuid>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <name>instance-00000088</name>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkBasicOps-server-686147264</nova:name>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:41:19</nova:creationTime>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:user uuid="27859618cb1d493cb1531af26b200b92">tempest-TestNetworkBasicOps-2126023928-project-member</nova:user>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:project uuid="043721d1d0a2480fa785367fa56c1fa4">tempest-TestNetworkBasicOps-2126023928</nova:project>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         <nova:port uuid="b4e459d6-f285-4155-91ff-3beefa9c4d81">
Sep 30 21:41:19 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <system>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <entry name="serial">3b552e93-79d1-422a-bfb2-240c6d2e3378</entry>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <entry name="uuid">3b552e93-79d1-422a-bfb2-240c6d2e3378</entry>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </system>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <os>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   </os>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <features>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   </features>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk.config"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:30:96:08"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <target dev="tapb4e459d6-f2"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/console.log" append="off"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <video>
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </video>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:41:19 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:41:19 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:41:19 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:41:19 compute-1 nova_compute[192795]: </domain>
Sep 30 21:41:19 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.063 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Preparing to wait for external event network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.063 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.064 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.064 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.064 2 DEBUG nova.virt.libvirt.vif [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-686147264',display_name='tempest-TestNetworkBasicOps-server-686147264',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-686147264',id=136,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEuk2BbMitNL/zl9ejInjo2O7BfBcgbeg+gE3A9WVPCfG1vcgqZD0WrcCCmZv6Y26Pwh2RY3dJIxtGKQOrNQzrhkhAVqiqHE8YzJ79HbcB5BIIpxWHIdSv5PoHu9Pzxqtg==',key_name='tempest-TestNetworkBasicOps-336137018',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-ljqt4ziw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:41:14Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=3b552e93-79d1-422a-bfb2-240c6d2e3378,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.065 2 DEBUG nova.network.os_vif_util [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.065 2 DEBUG nova.network.os_vif_util [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:96:08,bridge_name='br-int',has_traffic_filtering=True,id=b4e459d6-f285-4155-91ff-3beefa9c4d81,network=Network(5860c6c2-5aaa-4201-b9cb-964d0ec94e1c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4e459d6-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.066 2 DEBUG os_vif [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:96:08,bridge_name='br-int',has_traffic_filtering=True,id=b4e459d6-f285-4155-91ff-3beefa9c4d81,network=Network(5860c6c2-5aaa-4201-b9cb-964d0ec94e1c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4e459d6-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb4e459d6-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb4e459d6-f2, col_values=(('external_ids', {'iface-id': 'b4e459d6-f285-4155-91ff-3beefa9c4d81', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:96:08', 'vm-uuid': '3b552e93-79d1-422a-bfb2-240c6d2e3378'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:19 compute-1 NetworkManager[51724]: <info>  [1759268479.0727] manager: (tapb4e459d6-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.079 2 INFO os_vif [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:96:08,bridge_name='br-int',has_traffic_filtering=True,id=b4e459d6-f285-4155-91ff-3beefa9c4d81,network=Network(5860c6c2-5aaa-4201-b9cb-964d0ec94e1c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4e459d6-f2')
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.138 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.138 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.138 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No VIF found with MAC fa:16:3e:30:96:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.139 2 INFO nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Using config drive
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.428 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268479.4274135, c2cfd403-7d78-4290-9c7c-682f8bac568e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.429 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] VM Started (Lifecycle Event)
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.430 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.434 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.438 2 INFO nova.virt.libvirt.driver [-] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Instance spawned successfully.
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.438 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.601 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.605 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.606 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.606 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.606 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.607 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.607 2 DEBUG nova.virt.libvirt.driver [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.611 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.636 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.636 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268479.428241, c2cfd403-7d78-4290-9c7c-682f8bac568e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.636 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] VM Paused (Lifecycle Event)
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.661 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.664 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268479.4332442, c2cfd403-7d78-4290-9c7c-682f8bac568e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.665 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] VM Resumed (Lifecycle Event)
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.687 2 INFO nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Took 5.34 seconds to spawn the instance on the hypervisor.
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.687 2 DEBUG nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.689 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.697 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.748 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.803 2 INFO nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Creating config drive at /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk.config
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.807 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44ln2a4c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.828 2 INFO nova.compute.manager [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Took 6.06 seconds to build instance.
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.917 2 DEBUG oslo_concurrency.lockutils [None req-41788be0-5eac-43f1-aa6e-44716f2ed776 ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.935 2 DEBUG oslo_concurrency.processutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44ln2a4c" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:19 compute-1 kernel: tapb4e459d6-f2: entered promiscuous mode
Sep 30 21:41:19 compute-1 ovn_controller[94902]: 2025-09-30T21:41:19Z|00524|binding|INFO|Claiming lport b4e459d6-f285-4155-91ff-3beefa9c4d81 for this chassis.
Sep 30 21:41:19 compute-1 NetworkManager[51724]: <info>  [1759268479.9943] manager: (tapb4e459d6-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:19 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:19.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 ovn_controller[94902]: 2025-09-30T21:41:19Z|00525|binding|INFO|b4e459d6-f285-4155-91ff-3beefa9c4d81: Claiming fa:16:3e:30:96:08 10.100.0.5
Sep 30 21:41:20 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:41:20 compute-1 NetworkManager[51724]: <info>  [1759268480.0183] device (tapb4e459d6-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:41:20 compute-1 NetworkManager[51724]: <info>  [1759268480.0189] device (tapb4e459d6-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:41:20 compute-1 systemd-machined[152783]: New machine qemu-64-instance-00000088.
Sep 30 21:41:20 compute-1 ovn_controller[94902]: 2025-09-30T21:41:20Z|00526|binding|INFO|Setting lport b4e459d6-f285-4155-91ff-3beefa9c4d81 ovn-installed in OVS
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 systemd[1]: Started Virtual Machine qemu-64-instance-00000088.
Sep 30 21:41:20 compute-1 ovn_controller[94902]: 2025-09-30T21:41:20Z|00527|binding|INFO|Setting lport b4e459d6-f285-4155-91ff-3beefa9c4d81 up in Southbound
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.089 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:96:08 10.100.0.5'], port_security=['fa:16:3e:30:96:08 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3b552e93-79d1-422a-bfb2-240c6d2e3378', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2dfc4e41-bce6-4c9f-b7c0-ad646d5ba504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eb01a7b-4833-4750-8216-3eeaede6ad3d, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=b4e459d6-f285-4155-91ff-3beefa9c4d81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.090 103861 INFO neutron.agent.ovn.metadata.agent [-] Port b4e459d6-f285-4155-91ff-3beefa9c4d81 in datapath 5860c6c2-5aaa-4201-b9cb-964d0ec94e1c bound to our chassis
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.092 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5860c6c2-5aaa-4201-b9cb-964d0ec94e1c
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.106 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d23ac3d4-92fb-4c39-a02b-7bb8f2f1c488]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.107 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5860c6c2-51 in ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.110 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5860c6c2-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.110 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[29dfa6e0-f2cd-4eb4-8358-30ed692b6d32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.116 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[85401d94-a84d-47d2-b66a-f9e0c8221ec1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.126 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[788fc071-054f-48eb-baab-1c3bdfd18c36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.152 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a03034d8-ca5c-4377-96d2-804127a3abad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.178 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[55bc0e98-6210-4f7d-9a08-30909ae4a9fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 NetworkManager[51724]: <info>  [1759268480.1859] manager: (tap5860c6c2-50): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.184 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec08f45-645e-4018-8685-04b214a88539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.217 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[044f2f8c-2d72-43c4-a1e6-85df0efdbec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.220 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad8cbc6-f60e-4cdb-af9e-354087e51631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 NetworkManager[51724]: <info>  [1759268480.2465] device (tap5860c6c2-50): carrier: link connected
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.255 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[28485119-b0da-4b81-95a8-bcbfc34b27c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.273 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc77057-4eb1-4606-b701-3dc409492b24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5860c6c2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:50:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517591, 'reachable_time': 20240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242232, 'error': None, 'target': 'ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.294 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a2a90e-5481-4db5-80e5-c396096ed4ce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:50b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517591, 'tstamp': 517591}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242234, 'error': None, 'target': 'ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.312 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[968c3e22-968d-4c0a-a705-74e37dbf5a70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5860c6c2-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:50:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517591, 'reachable_time': 20240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242235, 'error': None, 'target': 'ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.344 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3ab963-99ff-4dbf-aeae-e1ac5593d9dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.411 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[58e0c42f-b51c-4f74-a792-2b9db376b8dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.412 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5860c6c2-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.413 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.413 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5860c6c2-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 NetworkManager[51724]: <info>  [1759268480.4155] manager: (tap5860c6c2-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Sep 30 21:41:20 compute-1 kernel: tap5860c6c2-50: entered promiscuous mode
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.424 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5860c6c2-50, col_values=(('external_ids', {'iface-id': '95fd8c81-d09c-4f3d-ba72-ae23b5a95cc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 ovn_controller[94902]: 2025-09-30T21:41:20Z|00528|binding|INFO|Releasing lport 95fd8c81-d09c-4f3d-ba72-ae23b5a95cc0 from this chassis (sb_readonly=0)
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.435 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5860c6c2-5aaa-4201-b9cb-964d0ec94e1c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5860c6c2-5aaa-4201-b9cb-964d0ec94e1c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.438 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[66fdb11b-f575-4d06-96e4-85327118933b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.439 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/5860c6c2-5aaa-4201-b9cb-964d0ec94e1c.pid.haproxy
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 5860c6c2-5aaa-4201-b9cb-964d0ec94e1c
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:41:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:20.439 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'env', 'PROCESS_TAG=haproxy-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5860c6c2-5aaa-4201-b9cb-964d0ec94e1c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.457 2 DEBUG nova.compute.manager [req-12c3cc68-86e3-444b-ab65-2f2b3afec27d req-bf325a8b-f325-45b0-b2d3-f816514571cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.457 2 DEBUG oslo_concurrency.lockutils [req-12c3cc68-86e3-444b-ab65-2f2b3afec27d req-bf325a8b-f325-45b0-b2d3-f816514571cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.457 2 DEBUG oslo_concurrency.lockutils [req-12c3cc68-86e3-444b-ab65-2f2b3afec27d req-bf325a8b-f325-45b0-b2d3-f816514571cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.458 2 DEBUG oslo_concurrency.lockutils [req-12c3cc68-86e3-444b-ab65-2f2b3afec27d req-bf325a8b-f325-45b0-b2d3-f816514571cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.458 2 DEBUG nova.compute.manager [req-12c3cc68-86e3-444b-ab65-2f2b3afec27d req-bf325a8b-f325-45b0-b2d3-f816514571cc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Processing event network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.545 2 DEBUG nova.compute.manager [req-39c81cd5-034b-4eae-ac18-b6baa494408a req-e763e7e5-e7b2-4e8c-b416-33e52a788170 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received event network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.545 2 DEBUG oslo_concurrency.lockutils [req-39c81cd5-034b-4eae-ac18-b6baa494408a req-e763e7e5-e7b2-4e8c-b416-33e52a788170 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.546 2 DEBUG oslo_concurrency.lockutils [req-39c81cd5-034b-4eae-ac18-b6baa494408a req-e763e7e5-e7b2-4e8c-b416-33e52a788170 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.546 2 DEBUG oslo_concurrency.lockutils [req-39c81cd5-034b-4eae-ac18-b6baa494408a req-e763e7e5-e7b2-4e8c-b416-33e52a788170 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.546 2 DEBUG nova.compute.manager [req-39c81cd5-034b-4eae-ac18-b6baa494408a req-e763e7e5-e7b2-4e8c-b416-33e52a788170 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] No waiting events found dispatching network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.546 2 WARNING nova.compute.manager [req-39c81cd5-034b-4eae-ac18-b6baa494408a req-e763e7e5-e7b2-4e8c-b416-33e52a788170 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received unexpected event network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 for instance with vm_state active and task_state None.
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.609 2 DEBUG nova.network.neutron [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updated VIF entry in instance network info cache for port b4e459d6-f285-4155-91ff-3beefa9c4d81. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.609 2 DEBUG nova.network.neutron [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updating instance_info_cache with network_info: [{"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.625 2 DEBUG oslo_concurrency.lockutils [req-72aa7d74-a343-4d54-bde9-9c3933799196 req-6f23e111-f7f9-4411-a477-bafb32d9d4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.744 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.746 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268480.745063, 3b552e93-79d1-422a-bfb2-240c6d2e3378 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.746 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] VM Started (Lifecycle Event)
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.749 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.752 2 INFO nova.virt.libvirt.driver [-] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Instance spawned successfully.
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.752 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.773 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.777 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.778 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.778 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.779 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.779 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.779 2 DEBUG nova.virt.libvirt.driver [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.786 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.815 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.815 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268480.7452044, 3b552e93-79d1-422a-bfb2-240c6d2e3378 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.815 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] VM Paused (Lifecycle Event)
Sep 30 21:41:20 compute-1 podman[242267]: 2025-09-30 21:41:20.829005307 +0000 UTC m=+0.061551749 container create 20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.839 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.843 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268480.7485795, 3b552e93-79d1-422a-bfb2-240c6d2e3378 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.843 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] VM Resumed (Lifecycle Event)
Sep 30 21:41:20 compute-1 systemd[1]: Started libpod-conmon-20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720.scope.
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.866 2 INFO nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Took 6.26 seconds to spawn the instance on the hypervisor.
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.867 2 DEBUG nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.871 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.878 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:20 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:41:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e08a9b407df9fe3ca7f5fc226c64d7ff9dfc325a93f880827df5637709773cb6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:41:20 compute-1 podman[242267]: 2025-09-30 21:41:20.797446801 +0000 UTC m=+0.029993263 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:41:20 compute-1 podman[242267]: 2025-09-30 21:41:20.907714194 +0000 UTC m=+0.140260666 container init 20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.908 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:41:20 compute-1 podman[242267]: 2025-09-30 21:41:20.91463525 +0000 UTC m=+0.147181682 container start 20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:41:20 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [NOTICE]   (242303) : New worker (242306) forked
Sep 30 21:41:20 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [NOTICE]   (242303) : Loading success.
Sep 30 21:41:20 compute-1 podman[242280]: 2025-09-30 21:41:20.945919477 +0000 UTC m=+0.084004600 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.960 2 INFO nova.compute.manager [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Took 7.16 seconds to build instance.
Sep 30 21:41:20 compute-1 nova_compute[192795]: 2025-09-30 21:41:20.984 2 DEBUG oslo_concurrency.lockutils [None req-6eb18f60-9e7e-45fb-bf44-e82bcd435b6e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.579 2 DEBUG nova.compute.manager [req-17e6b04d-883b-4d11-bc46-fcba05ff5785 req-72e2e35d-795a-41f9-a036-df418d7a2ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.579 2 DEBUG oslo_concurrency.lockutils [req-17e6b04d-883b-4d11-bc46-fcba05ff5785 req-72e2e35d-795a-41f9-a036-df418d7a2ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.579 2 DEBUG oslo_concurrency.lockutils [req-17e6b04d-883b-4d11-bc46-fcba05ff5785 req-72e2e35d-795a-41f9-a036-df418d7a2ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.580 2 DEBUG oslo_concurrency.lockutils [req-17e6b04d-883b-4d11-bc46-fcba05ff5785 req-72e2e35d-795a-41f9-a036-df418d7a2ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.580 2 DEBUG nova.compute.manager [req-17e6b04d-883b-4d11-bc46-fcba05ff5785 req-72e2e35d-795a-41f9-a036-df418d7a2ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] No waiting events found dispatching network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.580 2 WARNING nova.compute.manager [req-17e6b04d-883b-4d11-bc46-fcba05ff5785 req-72e2e35d-795a-41f9-a036-df418d7a2ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received unexpected event network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 for instance with vm_state active and task_state None.
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.932 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.933 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.933 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.933 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.933 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.948 2 INFO nova.compute.manager [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Terminating instance
Sep 30 21:41:22 compute-1 nova_compute[192795]: 2025-09-30 21:41:22.964 2 DEBUG nova.compute.manager [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:41:22 compute-1 kernel: tapfdf3d8c5-ab (unregistering): left promiscuous mode
Sep 30 21:41:22 compute-1 NetworkManager[51724]: <info>  [1759268482.9892] device (tapfdf3d8c5-ab): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:41:23 compute-1 ovn_controller[94902]: 2025-09-30T21:41:23Z|00529|binding|INFO|Releasing lport fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 from this chassis (sb_readonly=0)
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 ovn_controller[94902]: 2025-09-30T21:41:23Z|00530|binding|INFO|Setting lport fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 down in Southbound
Sep 30 21:41:23 compute-1 ovn_controller[94902]: 2025-09-30T21:41:23Z|00531|binding|INFO|Removing iface tapfdf3d8c5-ab ovn-installed in OVS
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.028 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:df:6d 10.100.0.13'], port_security=['fa:16:3e:b9:df:6d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c2cfd403-7d78-4290-9c7c-682f8bac568e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-773a5870-06c0-44a8-97b7-d3402370fd46', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f624bcd0e96f4a22879c80070e2afe4e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30e8623e-093c-4248-9876-ba507ac6ac9f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9303632d-e068-4c79-add1-d0768573add5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.029 103861 INFO neutron.agent.ovn.metadata.agent [-] Port fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 in datapath 773a5870-06c0-44a8-97b7-d3402370fd46 unbound from our chassis
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.031 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 773a5870-06c0-44a8-97b7-d3402370fd46, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.032 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6ce885-3075-4cf2-a6b6-32220b8120b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.033 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46 namespace which is not needed anymore
Sep 30 21:41:23 compute-1 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000087.scope: Deactivated successfully.
Sep 30 21:41:23 compute-1 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000087.scope: Consumed 4.655s CPU time.
Sep 30 21:41:23 compute-1 systemd-machined[152783]: Machine qemu-63-instance-00000087 terminated.
Sep 30 21:41:23 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [NOTICE]   (242178) : haproxy version is 2.8.14-c23fe91
Sep 30 21:41:23 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [NOTICE]   (242178) : path to executable is /usr/sbin/haproxy
Sep 30 21:41:23 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [WARNING]  (242178) : Exiting Master process...
Sep 30 21:41:23 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [WARNING]  (242178) : Exiting Master process...
Sep 30 21:41:23 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [ALERT]    (242178) : Current worker (242180) exited with code 143 (Terminated)
Sep 30 21:41:23 compute-1 neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46[242173]: [WARNING]  (242178) : All workers exited. Exiting... (0)
Sep 30 21:41:23 compute-1 systemd[1]: libpod-aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60.scope: Deactivated successfully.
Sep 30 21:41:23 compute-1 podman[242340]: 2025-09-30 21:41:23.174656697 +0000 UTC m=+0.044803341 container died aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:41:23 compute-1 systemd[1]: var-lib-containers-storage-overlay-6a110c24050cfca27f9edf17ae086ac18b998872ecece546f7545fdddc3d417a-merged.mount: Deactivated successfully.
Sep 30 21:41:23 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60-userdata-shm.mount: Deactivated successfully.
Sep 30 21:41:23 compute-1 podman[242340]: 2025-09-30 21:41:23.232735503 +0000 UTC m=+0.102882117 container cleanup aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.240 2 INFO nova.virt.libvirt.driver [-] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Instance destroyed successfully.
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.241 2 DEBUG nova.objects.instance [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lazy-loading 'resources' on Instance uuid c2cfd403-7d78-4290-9c7c-682f8bac568e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:23 compute-1 systemd[1]: libpod-conmon-aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60.scope: Deactivated successfully.
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.260 2 DEBUG nova.virt.libvirt.vif [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1070040306',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1070040306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1070040306',id=135,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:41:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f624bcd0e96f4a22879c80070e2afe4e',ramdisk_id='',reservation_id='r-24mw1sup',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-490713827',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-490713827-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:41:19Z,user_data=None,user_id='ab5d15c0f90247089dbbed4ea5c454da',uuid=c2cfd403-7d78-4290-9c7c-682f8bac568e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.261 2 DEBUG nova.network.os_vif_util [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Converting VIF {"id": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "address": "fa:16:3e:b9:df:6d", "network": {"id": "773a5870-06c0-44a8-97b7-d3402370fd46", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-542956611-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f624bcd0e96f4a22879c80070e2afe4e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf3d8c5-ab", "ovs_interfaceid": "fdf3d8c5-ab5c-437f-9233-8eec99d30bc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.261 2 DEBUG nova.network.os_vif_util [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:df:6d,bridge_name='br-int',has_traffic_filtering=True,id=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1,network=Network(773a5870-06c0-44a8-97b7-d3402370fd46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf3d8c5-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.262 2 DEBUG os_vif [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:df:6d,bridge_name='br-int',has_traffic_filtering=True,id=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1,network=Network(773a5870-06c0-44a8-97b7-d3402370fd46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf3d8c5-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfdf3d8c5-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.270 2 INFO os_vif [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:df:6d,bridge_name='br-int',has_traffic_filtering=True,id=fdf3d8c5-ab5c-437f-9233-8eec99d30bc1,network=Network(773a5870-06c0-44a8-97b7-d3402370fd46),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf3d8c5-ab')
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.271 2 INFO nova.virt.libvirt.driver [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Deleting instance files /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e_del
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.272 2 INFO nova.virt.libvirt.driver [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Deletion of /var/lib/nova/instances/c2cfd403-7d78-4290-9c7c-682f8bac568e_del complete
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.288 2 DEBUG nova.compute.manager [req-73feed3c-d102-4037-b40b-b2f03d0b87d6 req-2caa8f36-f0fb-4c3c-b451-1f3c7bd78da7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received event network-vif-unplugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.289 2 DEBUG oslo_concurrency.lockutils [req-73feed3c-d102-4037-b40b-b2f03d0b87d6 req-2caa8f36-f0fb-4c3c-b451-1f3c7bd78da7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.289 2 DEBUG oslo_concurrency.lockutils [req-73feed3c-d102-4037-b40b-b2f03d0b87d6 req-2caa8f36-f0fb-4c3c-b451-1f3c7bd78da7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.289 2 DEBUG oslo_concurrency.lockutils [req-73feed3c-d102-4037-b40b-b2f03d0b87d6 req-2caa8f36-f0fb-4c3c-b451-1f3c7bd78da7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.290 2 DEBUG nova.compute.manager [req-73feed3c-d102-4037-b40b-b2f03d0b87d6 req-2caa8f36-f0fb-4c3c-b451-1f3c7bd78da7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] No waiting events found dispatching network-vif-unplugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.290 2 DEBUG nova.compute.manager [req-73feed3c-d102-4037-b40b-b2f03d0b87d6 req-2caa8f36-f0fb-4c3c-b451-1f3c7bd78da7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received event network-vif-unplugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:41:23 compute-1 podman[242388]: 2025-09-30 21:41:23.310977168 +0000 UTC m=+0.049962119 container remove aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.318 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[55d5af3f-0d77-4d0a-800d-dba99f8dbfa8]: (4, ('Tue Sep 30 09:41:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46 (aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60)\naa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60\nTue Sep 30 09:41:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46 (aa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60)\naa4769c01b5cb57f8d7b401fe82694aa9f9ab5457ba25212453c070435e43d60\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.322 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3e2fff45-d869-48c4-9a68-3aae87ce949b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.323 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap773a5870-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 kernel: tap773a5870-00: left promiscuous mode
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.364 2 INFO nova.compute.manager [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.365 2 DEBUG oslo.service.loopingcall [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.365 2 DEBUG nova.compute.manager [-] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.365 2 DEBUG nova.network.neutron [-] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.384 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[987b006e-24a5-476a-acd7-f5ca5fac4266]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.417 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[890fd5cb-33e8-439a-9687-6f678dc3d8d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.419 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a83b9cdd-67d1-4235-ab96-001afab0a74f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.435 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[252561ed-a229-4fe0-8bad-cc8b5d880801]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517395, 'reachable_time': 43187, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242404, 'error': None, 'target': 'ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 systemd[1]: run-netns-ovnmeta\x2d773a5870\x2d06c0\x2d44a8\x2d97b7\x2dd3402370fd46.mount: Deactivated successfully.
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.437 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-773a5870-06c0-44a8-97b7-d3402370fd46 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:41:23 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:23.437 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[95d12419-0019-4a5f-89b5-b922d9fb1605]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:23 compute-1 nova_compute[192795]: 2025-09-30 21:41:23.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.170 2 DEBUG nova.network.neutron [-] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.194 2 INFO nova.compute.manager [-] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Took 0.83 seconds to deallocate network for instance.
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.298 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.299 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.345 2 DEBUG nova.compute.manager [req-8f865562-7d68-489c-81ac-e1f245a915f3 req-8a509216-0a97-43b5-90d9-c2d00836f13c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received event network-vif-deleted-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.380 2 DEBUG nova.compute.provider_tree [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.401 2 DEBUG nova.scheduler.client.report [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.434 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.461 2 INFO nova.scheduler.client.report [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Deleted allocations for instance c2cfd403-7d78-4290-9c7c-682f8bac568e
Sep 30 21:41:24 compute-1 nova_compute[192795]: 2025-09-30 21:41:24.529 2 DEBUG oslo_concurrency.lockutils [None req-e756d806-7e6a-4345-922d-9811230632ec ab5d15c0f90247089dbbed4ea5c454da f624bcd0e96f4a22879c80070e2afe4e - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:25 compute-1 NetworkManager[51724]: <info>  [1759268485.1004] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Sep 30 21:41:25 compute-1 NetworkManager[51724]: <info>  [1759268485.1019] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:25 compute-1 ovn_controller[94902]: 2025-09-30T21:41:25Z|00532|binding|INFO|Releasing lport 95fd8c81-d09c-4f3d-ba72-ae23b5a95cc0 from this chassis (sb_readonly=0)
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.395 2 DEBUG nova.compute.manager [req-13046244-cbdf-4990-9c3b-9c867b6e0976 req-bbea7a1d-2bf9-417d-ab3e-a56bb945293c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received event network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.396 2 DEBUG oslo_concurrency.lockutils [req-13046244-cbdf-4990-9c3b-9c867b6e0976 req-bbea7a1d-2bf9-417d-ab3e-a56bb945293c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.397 2 DEBUG oslo_concurrency.lockutils [req-13046244-cbdf-4990-9c3b-9c867b6e0976 req-bbea7a1d-2bf9-417d-ab3e-a56bb945293c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.397 2 DEBUG oslo_concurrency.lockutils [req-13046244-cbdf-4990-9c3b-9c867b6e0976 req-bbea7a1d-2bf9-417d-ab3e-a56bb945293c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c2cfd403-7d78-4290-9c7c-682f8bac568e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.397 2 DEBUG nova.compute.manager [req-13046244-cbdf-4990-9c3b-9c867b6e0976 req-bbea7a1d-2bf9-417d-ab3e-a56bb945293c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] No waiting events found dispatching network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:41:25 compute-1 nova_compute[192795]: 2025-09-30 21:41:25.397 2 WARNING nova.compute.manager [req-13046244-cbdf-4990-9c3b-9c867b6e0976 req-bbea7a1d-2bf9-417d-ab3e-a56bb945293c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Received unexpected event network-vif-plugged-fdf3d8c5-ab5c-437f-9233-8eec99d30bc1 for instance with vm_state deleted and task_state None.
Sep 30 21:41:26 compute-1 nova_compute[192795]: 2025-09-30 21:41:26.443 2 DEBUG nova.compute.manager [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-changed-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:26 compute-1 nova_compute[192795]: 2025-09-30 21:41:26.445 2 DEBUG nova.compute.manager [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Refreshing instance network info cache due to event network-changed-b4e459d6-f285-4155-91ff-3beefa9c4d81. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:41:26 compute-1 nova_compute[192795]: 2025-09-30 21:41:26.446 2 DEBUG oslo_concurrency.lockutils [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:41:26 compute-1 nova_compute[192795]: 2025-09-30 21:41:26.447 2 DEBUG oslo_concurrency.lockutils [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:41:26 compute-1 nova_compute[192795]: 2025-09-30 21:41:26.447 2 DEBUG nova.network.neutron [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Refreshing network info cache for port b4e459d6-f285-4155-91ff-3beefa9c4d81 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:41:28 compute-1 ovn_controller[94902]: 2025-09-30T21:41:28Z|00533|binding|INFO|Releasing lport 95fd8c81-d09c-4f3d-ba72-ae23b5a95cc0 from this chassis (sb_readonly=0)
Sep 30 21:41:28 compute-1 nova_compute[192795]: 2025-09-30 21:41:28.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:28 compute-1 nova_compute[192795]: 2025-09-30 21:41:28.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:28 compute-1 nova_compute[192795]: 2025-09-30 21:41:28.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:28 compute-1 nova_compute[192795]: 2025-09-30 21:41:28.483 2 DEBUG nova.network.neutron [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updated VIF entry in instance network info cache for port b4e459d6-f285-4155-91ff-3beefa9c4d81. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:41:28 compute-1 nova_compute[192795]: 2025-09-30 21:41:28.484 2 DEBUG nova.network.neutron [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updating instance_info_cache with network_info: [{"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:28 compute-1 nova_compute[192795]: 2025-09-30 21:41:28.524 2 DEBUG oslo_concurrency.lockutils [req-c47071c2-097e-41dc-9106-d896ec87f129 req-2c440662-bb50-4f0e-9107-f8428a2c11f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:41:29 compute-1 podman[242408]: 2025-09-30 21:41:29.242333254 +0000 UTC m=+0.064570160 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:41:29 compute-1 podman[242406]: 2025-09-30 21:41:29.263191483 +0000 UTC m=+0.106151134 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=multipathd)
Sep 30 21:41:29 compute-1 podman[242407]: 2025-09-30 21:41:29.274814814 +0000 UTC m=+0.115301599 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:41:30 compute-1 nova_compute[192795]: 2025-09-30 21:41:30.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:31 compute-1 ovn_controller[94902]: 2025-09-30T21:41:31Z|00534|binding|INFO|Releasing lport 95fd8c81-d09c-4f3d-ba72-ae23b5a95cc0 from this chassis (sb_readonly=0)
Sep 30 21:41:31 compute-1 nova_compute[192795]: 2025-09-30 21:41:31.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:32 compute-1 ovn_controller[94902]: 2025-09-30T21:41:32Z|00535|binding|INFO|Releasing lport 95fd8c81-d09c-4f3d-ba72-ae23b5a95cc0 from this chassis (sb_readonly=0)
Sep 30 21:41:32 compute-1 nova_compute[192795]: 2025-09-30 21:41:32.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:33 compute-1 nova_compute[192795]: 2025-09-30 21:41:33.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:33 compute-1 nova_compute[192795]: 2025-09-30 21:41:33.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:33 compute-1 ovn_controller[94902]: 2025-09-30T21:41:33Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:96:08 10.100.0.5
Sep 30 21:41:33 compute-1 ovn_controller[94902]: 2025-09-30T21:41:33Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:96:08 10.100.0.5
Sep 30 21:41:34 compute-1 podman[242503]: 2025-09-30 21:41:34.208915046 +0000 UTC m=+0.056584775 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:41:35 compute-1 nova_compute[192795]: 2025-09-30 21:41:35.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:38 compute-1 nova_compute[192795]: 2025-09-30 21:41:38.238 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268483.237799, c2cfd403-7d78-4290-9c7c-682f8bac568e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:38 compute-1 nova_compute[192795]: 2025-09-30 21:41:38.239 2 INFO nova.compute.manager [-] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] VM Stopped (Lifecycle Event)
Sep 30 21:41:38 compute-1 nova_compute[192795]: 2025-09-30 21:41:38.259 2 DEBUG nova.compute.manager [None req-ea8ce71d-260c-4995-a9c2-00d73b03466f - - - - - -] [instance: c2cfd403-7d78-4290-9c7c-682f8bac568e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:38 compute-1 nova_compute[192795]: 2025-09-30 21:41:38.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:38 compute-1 nova_compute[192795]: 2025-09-30 21:41:38.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:38.700 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:38.700 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:38.701 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:40 compute-1 nova_compute[192795]: 2025-09-30 21:41:40.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:40 compute-1 nova_compute[192795]: 2025-09-30 21:41:40.387 2 INFO nova.compute.manager [None req-388f7362-eef5-4a42-bb35-016f08663be7 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Get console output
Sep 30 21:41:40 compute-1 nova_compute[192795]: 2025-09-30 21:41:40.391 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:41:41 compute-1 nova_compute[192795]: 2025-09-30 21:41:41.138 2 INFO nova.compute.manager [None req-0d98480a-a47d-430b-93b0-e6ad2ada8805 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Get console output
Sep 30 21:41:41 compute-1 nova_compute[192795]: 2025-09-30 21:41:41.142 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:41:41 compute-1 podman[242525]: 2025-09-30 21:41:41.213950815 +0000 UTC m=+0.054634464 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:41:41 compute-1 podman[242524]: 2025-09-30 21:41:41.216149594 +0000 UTC m=+0.060425499 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=)
Sep 30 21:41:41 compute-1 podman[242526]: 2025-09-30 21:41:41.249203999 +0000 UTC m=+0.086285511 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 21:41:41 compute-1 unix_chkpwd[242587]: password check failed for user (root)
Sep 30 21:41:41 compute-1 sshd-session[242585]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239  user=root
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.119 2 DEBUG nova.compute.manager [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-changed-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.120 2 DEBUG nova.compute.manager [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Refreshing instance network info cache due to event network-changed-b4e459d6-f285-4155-91ff-3beefa9c4d81. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.120 2 DEBUG oslo_concurrency.lockutils [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.121 2 DEBUG oslo_concurrency.lockutils [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.121 2 DEBUG nova.network.neutron [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Refreshing network info cache for port b4e459d6-f285-4155-91ff-3beefa9c4d81 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.242 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.243 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.243 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.243 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.243 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.254 2 INFO nova.compute.manager [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Terminating instance
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.268 2 DEBUG nova.compute.manager [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:41:42 compute-1 kernel: tapb4e459d6-f2 (unregistering): left promiscuous mode
Sep 30 21:41:42 compute-1 NetworkManager[51724]: <info>  [1759268502.2999] device (tapb4e459d6-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:42 compute-1 ovn_controller[94902]: 2025-09-30T21:41:42Z|00536|binding|INFO|Releasing lport b4e459d6-f285-4155-91ff-3beefa9c4d81 from this chassis (sb_readonly=0)
Sep 30 21:41:42 compute-1 ovn_controller[94902]: 2025-09-30T21:41:42Z|00537|binding|INFO|Setting lport b4e459d6-f285-4155-91ff-3beefa9c4d81 down in Southbound
Sep 30 21:41:42 compute-1 ovn_controller[94902]: 2025-09-30T21:41:42Z|00538|binding|INFO|Removing iface tapb4e459d6-f2 ovn-installed in OVS
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.317 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:96:08 10.100.0.5'], port_security=['fa:16:3e:30:96:08 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3b552e93-79d1-422a-bfb2-240c6d2e3378', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2dfc4e41-bce6-4c9f-b7c0-ad646d5ba504', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4eb01a7b-4833-4750-8216-3eeaede6ad3d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=b4e459d6-f285-4155-91ff-3beefa9c4d81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.318 103861 INFO neutron.agent.ovn.metadata.agent [-] Port b4e459d6-f285-4155-91ff-3beefa9c4d81 in datapath 5860c6c2-5aaa-4201-b9cb-964d0ec94e1c unbound from our chassis
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.319 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5860c6c2-5aaa-4201-b9cb-964d0ec94e1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.320 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[50e8fcb3-d087-4491-b18c-96fb5095d707]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.321 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c namespace which is not needed anymore
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:42 compute-1 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000088.scope: Deactivated successfully.
Sep 30 21:41:42 compute-1 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000088.scope: Consumed 13.149s CPU time.
Sep 30 21:41:42 compute-1 systemd-machined[152783]: Machine qemu-64-instance-00000088 terminated.
Sep 30 21:41:42 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [NOTICE]   (242303) : haproxy version is 2.8.14-c23fe91
Sep 30 21:41:42 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [NOTICE]   (242303) : path to executable is /usr/sbin/haproxy
Sep 30 21:41:42 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [WARNING]  (242303) : Exiting Master process...
Sep 30 21:41:42 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [WARNING]  (242303) : Exiting Master process...
Sep 30 21:41:42 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [ALERT]    (242303) : Current worker (242306) exited with code 143 (Terminated)
Sep 30 21:41:42 compute-1 neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c[242288]: [WARNING]  (242303) : All workers exited. Exiting... (0)
Sep 30 21:41:42 compute-1 systemd[1]: libpod-20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720.scope: Deactivated successfully.
Sep 30 21:41:42 compute-1 conmon[242288]: conmon 20e89ba2aa7ca93f4cd1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720.scope/container/memory.events
Sep 30 21:41:42 compute-1 podman[242610]: 2025-09-30 21:41:42.468930099 +0000 UTC m=+0.048323724 container died 20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:41:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720-userdata-shm.mount: Deactivated successfully.
Sep 30 21:41:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-e08a9b407df9fe3ca7f5fc226c64d7ff9dfc325a93f880827df5637709773cb6-merged.mount: Deactivated successfully.
Sep 30 21:41:42 compute-1 podman[242610]: 2025-09-30 21:41:42.528544436 +0000 UTC m=+0.107938051 container cleanup 20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:41:42 compute-1 systemd[1]: libpod-conmon-20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720.scope: Deactivated successfully.
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.556 2 INFO nova.virt.libvirt.driver [-] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Instance destroyed successfully.
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.558 2 DEBUG nova.objects.instance [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'resources' on Instance uuid 3b552e93-79d1-422a-bfb2-240c6d2e3378 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.576 2 DEBUG nova.virt.libvirt.vif [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:41:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-686147264',display_name='tempest-TestNetworkBasicOps-server-686147264',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-686147264',id=136,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEuk2BbMitNL/zl9ejInjo2O7BfBcgbeg+gE3A9WVPCfG1vcgqZD0WrcCCmZv6Y26Pwh2RY3dJIxtGKQOrNQzrhkhAVqiqHE8YzJ79HbcB5BIIpxWHIdSv5PoHu9Pzxqtg==',key_name='tempest-TestNetworkBasicOps-336137018',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:41:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-ljqt4ziw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:41:20Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=3b552e93-79d1-422a-bfb2-240c6d2e3378,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.577 2 DEBUG nova.network.os_vif_util [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.579 2 DEBUG nova.network.os_vif_util [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:96:08,bridge_name='br-int',has_traffic_filtering=True,id=b4e459d6-f285-4155-91ff-3beefa9c4d81,network=Network(5860c6c2-5aaa-4201-b9cb-964d0ec94e1c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4e459d6-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.580 2 DEBUG os_vif [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:96:08,bridge_name='br-int',has_traffic_filtering=True,id=b4e459d6-f285-4155-91ff-3beefa9c4d81,network=Network(5860c6c2-5aaa-4201-b9cb-964d0ec94e1c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4e459d6-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.584 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb4e459d6-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.590 2 INFO os_vif [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:96:08,bridge_name='br-int',has_traffic_filtering=True,id=b4e459d6-f285-4155-91ff-3beefa9c4d81,network=Network(5860c6c2-5aaa-4201-b9cb-964d0ec94e1c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4e459d6-f2')
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.591 2 INFO nova.virt.libvirt.driver [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Deleting instance files /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378_del
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.592 2 INFO nova.virt.libvirt.driver [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Deletion of /var/lib/nova/instances/3b552e93-79d1-422a-bfb2-240c6d2e3378_del complete
Sep 30 21:41:42 compute-1 podman[242656]: 2025-09-30 21:41:42.603148223 +0000 UTC m=+0.043428764 container remove 20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.611 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7d18d808-f38c-46bf-b6d3-168d2bcb9e43]: (4, ('Tue Sep 30 09:41:42 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c (20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720)\n20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720\nTue Sep 30 09:41:42 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c (20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720)\n20e89ba2aa7ca93f4cd1b2af0dbfb5f091aaf78ba03a4f30bdb6d90f7ad77720\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.613 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[668a81d1-7b46-45da-b058-d55dfbc77e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.614 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5860c6c2-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:42 compute-1 kernel: tap5860c6c2-50: left promiscuous mode
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.644 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[52899af8-006b-4a4c-ad9c-2a89b677050a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.673 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1644aa54-7a72-4da9-b8b9-4f10061a279a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.674 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2d6273-91d6-44b1-9f18-6e47c014a162]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.683 2 INFO nova.compute.manager [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.684 2 DEBUG oslo.service.loopingcall [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.684 2 DEBUG nova.compute.manager [-] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:41:42 compute-1 nova_compute[192795]: 2025-09-30 21:41:42.684 2 DEBUG nova.network.neutron [-] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.702 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[67d53e2a-aeb0-4849-8f64-bfe16076f8c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517584, 'reachable_time': 21651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242671, 'error': None, 'target': 'ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.706 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5860c6c2-5aaa-4201-b9cb-964d0ec94e1c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:41:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d5860c6c2\x2d5aaa\x2d4201\x2db9cb\x2d964d0ec94e1c.mount: Deactivated successfully.
Sep 30 21:41:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:41:42.707 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[f4479ad4-699c-4766-b82a-59912d9df9f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:41:43 compute-1 nova_compute[192795]: 2025-09-30 21:41:43.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:44 compute-1 sshd-session[242585]: Failed password for root from 167.71.248.239 port 43250 ssh2
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.527 2 DEBUG nova.compute.manager [req-fbd965a9-8fd6-436d-821f-9a269c09e5fb req-d1194d50-6977-4844-ac7e-87f897eb94a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-vif-unplugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.528 2 DEBUG oslo_concurrency.lockutils [req-fbd965a9-8fd6-436d-821f-9a269c09e5fb req-d1194d50-6977-4844-ac7e-87f897eb94a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.528 2 DEBUG oslo_concurrency.lockutils [req-fbd965a9-8fd6-436d-821f-9a269c09e5fb req-d1194d50-6977-4844-ac7e-87f897eb94a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.528 2 DEBUG oslo_concurrency.lockutils [req-fbd965a9-8fd6-436d-821f-9a269c09e5fb req-d1194d50-6977-4844-ac7e-87f897eb94a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.528 2 DEBUG nova.compute.manager [req-fbd965a9-8fd6-436d-821f-9a269c09e5fb req-d1194d50-6977-4844-ac7e-87f897eb94a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] No waiting events found dispatching network-vif-unplugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.529 2 DEBUG nova.compute.manager [req-fbd965a9-8fd6-436d-821f-9a269c09e5fb req-d1194d50-6977-4844-ac7e-87f897eb94a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-vif-unplugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.750 2 DEBUG nova.network.neutron [-] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.767 2 INFO nova.compute.manager [-] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Took 2.08 seconds to deallocate network for instance.
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.821 2 DEBUG nova.network.neutron [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updated VIF entry in instance network info cache for port b4e459d6-f285-4155-91ff-3beefa9c4d81. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.821 2 DEBUG nova.network.neutron [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Updating instance_info_cache with network_info: [{"id": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "address": "fa:16:3e:30:96:08", "network": {"id": "5860c6c2-5aaa-4201-b9cb-964d0ec94e1c", "bridge": "br-int", "label": "tempest-network-smoke--1494618843", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4e459d6-f2", "ovs_interfaceid": "b4e459d6-f285-4155-91ff-3beefa9c4d81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.864 2 DEBUG nova.compute.manager [req-2d5a05e7-cb4a-4ccd-8d74-172401c0f5c8 req-6033bdcb-be81-44cc-a63a-7fd8381d2ec2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-vif-deleted-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.866 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.866 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.869 2 DEBUG oslo_concurrency.lockutils [req-595eb70b-4df1-4e78-94a2-900123aa2a4e req-99d27bb1-2497-41ee-a462-fbbe3396bd08 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3b552e93-79d1-422a-bfb2-240c6d2e3378" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.924 2 DEBUG nova.compute.provider_tree [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.937 2 DEBUG nova.scheduler.client.report [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:41:44 compute-1 nova_compute[192795]: 2025-09-30 21:41:44.966 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:45 compute-1 nova_compute[192795]: 2025-09-30 21:41:45.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:45 compute-1 nova_compute[192795]: 2025-09-30 21:41:45.016 2 INFO nova.scheduler.client.report [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Deleted allocations for instance 3b552e93-79d1-422a-bfb2-240c6d2e3378
Sep 30 21:41:45 compute-1 nova_compute[192795]: 2025-09-30 21:41:45.082 2 DEBUG oslo_concurrency.lockutils [None req-a2201a98-88dc-44f0-a18c-bdfde2abb392 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:45 compute-1 sshd-session[242585]: Connection closed by authenticating user root 167.71.248.239 port 43250 [preauth]
Sep 30 21:41:46 compute-1 nova_compute[192795]: 2025-09-30 21:41:46.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:46 compute-1 nova_compute[192795]: 2025-09-30 21:41:46.693 2 DEBUG nova.compute.manager [req-7293168d-0acd-4c2b-b5f9-232ff6053434 req-74f88b07-199d-4248-b60c-a3e5a6e10010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received event network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:41:46 compute-1 nova_compute[192795]: 2025-09-30 21:41:46.694 2 DEBUG oslo_concurrency.lockutils [req-7293168d-0acd-4c2b-b5f9-232ff6053434 req-74f88b07-199d-4248-b60c-a3e5a6e10010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:46 compute-1 nova_compute[192795]: 2025-09-30 21:41:46.694 2 DEBUG oslo_concurrency.lockutils [req-7293168d-0acd-4c2b-b5f9-232ff6053434 req-74f88b07-199d-4248-b60c-a3e5a6e10010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:46 compute-1 nova_compute[192795]: 2025-09-30 21:41:46.694 2 DEBUG oslo_concurrency.lockutils [req-7293168d-0acd-4c2b-b5f9-232ff6053434 req-74f88b07-199d-4248-b60c-a3e5a6e10010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3b552e93-79d1-422a-bfb2-240c6d2e3378-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:46 compute-1 nova_compute[192795]: 2025-09-30 21:41:46.694 2 DEBUG nova.compute.manager [req-7293168d-0acd-4c2b-b5f9-232ff6053434 req-74f88b07-199d-4248-b60c-a3e5a6e10010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] No waiting events found dispatching network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:41:46 compute-1 nova_compute[192795]: 2025-09-30 21:41:46.695 2 WARNING nova.compute.manager [req-7293168d-0acd-4c2b-b5f9-232ff6053434 req-74f88b07-199d-4248-b60c-a3e5a6e10010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Received unexpected event network-vif-plugged-b4e459d6-f285-4155-91ff-3beefa9c4d81 for instance with vm_state deleted and task_state None.
Sep 30 21:41:47 compute-1 nova_compute[192795]: 2025-09-30 21:41:47.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:48 compute-1 nova_compute[192795]: 2025-09-30 21:41:48.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:49 compute-1 nova_compute[192795]: 2025-09-30 21:41:49.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:49 compute-1 nova_compute[192795]: 2025-09-30 21:41:49.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:51 compute-1 podman[242673]: 2025-09-30 21:41:51.245869053 +0000 UTC m=+0.073749706 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid)
Sep 30 21:41:52 compute-1 nova_compute[192795]: 2025-09-30 21:41:52.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.720 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.721 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.901 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.902 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5668MB free_disk=73.31673049926758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.902 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.903 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.991 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:41:53 compute-1 nova_compute[192795]: 2025-09-30 21:41:53.992 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.014 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.033 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.034 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.054 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.076 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.100 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.132 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.188 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:41:54 compute-1 nova_compute[192795]: 2025-09-30 21:41:54.189 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:56 compute-1 nova_compute[192795]: 2025-09-30 21:41:56.189 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:57 compute-1 nova_compute[192795]: 2025-09-30 21:41:57.556 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268502.5531464, 3b552e93-79d1-422a-bfb2-240c6d2e3378 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:57 compute-1 nova_compute[192795]: 2025-09-30 21:41:57.557 2 INFO nova.compute.manager [-] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] VM Stopped (Lifecycle Event)
Sep 30 21:41:57 compute-1 nova_compute[192795]: 2025-09-30 21:41:57.577 2 DEBUG nova.compute.manager [None req-389b5ef3-a806-4536-b338-0c492c5484dc - - - - - -] [instance: 3b552e93-79d1-422a-bfb2-240c6d2e3378] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:57 compute-1 nova_compute[192795]: 2025-09-30 21:41:57.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:58 compute-1 nova_compute[192795]: 2025-09-30 21:41:58.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:58 compute-1 nova_compute[192795]: 2025-09-30 21:41:58.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:58 compute-1 nova_compute[192795]: 2025-09-30 21:41:58.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:58 compute-1 nova_compute[192795]: 2025-09-30 21:41:58.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:42:00 compute-1 podman[242698]: 2025-09-30 21:42:00.260018348 +0000 UTC m=+0.076490219 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:42:00 compute-1 podman[242696]: 2025-09-30 21:42:00.293655899 +0000 UTC m=+0.122195653 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 21:42:00 compute-1 podman[242697]: 2025-09-30 21:42:00.308104766 +0000 UTC m=+0.131243805 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:42:01 compute-1 nova_compute[192795]: 2025-09-30 21:42:01.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:02 compute-1 nova_compute[192795]: 2025-09-30 21:42:02.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:02 compute-1 nova_compute[192795]: 2025-09-30 21:42:02.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:03 compute-1 nova_compute[192795]: 2025-09-30 21:42:03.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:05 compute-1 podman[242764]: 2025-09-30 21:42:05.238905162 +0000 UTC m=+0.081750278 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:42:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:05.296 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:05 compute-1 nova_compute[192795]: 2025-09-30 21:42:05.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:05.298 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:42:06 compute-1 nova_compute[192795]: 2025-09-30 21:42:06.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:06 compute-1 nova_compute[192795]: 2025-09-30 21:42:06.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:42:06 compute-1 nova_compute[192795]: 2025-09-30 21:42:06.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:42:06 compute-1 nova_compute[192795]: 2025-09-30 21:42:06.742 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:42:06 compute-1 nova_compute[192795]: 2025-09-30 21:42:06.742 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:07 compute-1 nova_compute[192795]: 2025-09-30 21:42:07.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:08 compute-1 nova_compute[192795]: 2025-09-30 21:42:08.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:10 compute-1 nova_compute[192795]: 2025-09-30 21:42:10.737 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:12 compute-1 podman[242785]: 2025-09-30 21:42:12.220780429 +0000 UTC m=+0.057855877 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 21:42:12 compute-1 podman[242783]: 2025-09-30 21:42:12.220795099 +0000 UTC m=+0.062945332 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, version=9.6)
Sep 30 21:42:12 compute-1 podman[242784]: 2025-09-30 21:42:12.236705617 +0000 UTC m=+0.077666759 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:42:12 compute-1 nova_compute[192795]: 2025-09-30 21:42:12.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:13 compute-1 nova_compute[192795]: 2025-09-30 21:42:13.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:15.085 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:2b:af 10.100.0.2 2001:db8::f816:3eff:feed:2baf'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feed:2baf/64', 'neutron:device_id': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59e9be54-e45c-44fd-b8d8-090d788a8eee, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5f2fed01-3e8a-4c9d-b394-f15b7f60366f) old=Port_Binding(mac=['fa:16:3e:ed:2b:af 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:15.087 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5f2fed01-3e8a-4c9d-b394-f15b7f60366f in datapath 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb updated
Sep 30 21:42:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:15.090 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:42:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:15.091 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3f63f8f6-6a79-41ab-95c2-d19d15c3af7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:15.301 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:17 compute-1 nova_compute[192795]: 2025-09-30 21:42:17.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:18 compute-1 nova_compute[192795]: 2025-09-30 21:42:18.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:22 compute-1 podman[242845]: 2025-09-30 21:42:22.224593134 +0000 UTC m=+0.064590977 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:42:22 compute-1 nova_compute[192795]: 2025-09-30 21:42:22.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:23 compute-1 nova_compute[192795]: 2025-09-30 21:42:23.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:27 compute-1 nova_compute[192795]: 2025-09-30 21:42:27.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:28 compute-1 nova_compute[192795]: 2025-09-30 21:42:28.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:31 compute-1 podman[242867]: 2025-09-30 21:42:31.222088615 +0000 UTC m=+0.058812482 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:42:31 compute-1 podman[242865]: 2025-09-30 21:42:31.22636061 +0000 UTC m=+0.067664601 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:42:31 compute-1 podman[242866]: 2025-09-30 21:42:31.277405771 +0000 UTC m=+0.109798282 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:42:32 compute-1 nova_compute[192795]: 2025-09-30 21:42:32.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:32 compute-1 ovn_controller[94902]: 2025-09-30T21:42:32Z|00539|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 21:42:33 compute-1 nova_compute[192795]: 2025-09-30 21:42:33.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:36 compute-1 podman[242929]: 2025-09-30 21:42:36.222067407 +0000 UTC m=+0.070270981 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:42:37 compute-1 nova_compute[192795]: 2025-09-30 21:42:37.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:38 compute-1 nova_compute[192795]: 2025-09-30 21:42:38.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:38.700 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:38.701 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:38.701 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:39 compute-1 nova_compute[192795]: 2025-09-30 21:42:39.692 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:39 compute-1 nova_compute[192795]: 2025-09-30 21:42:39.693 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:39 compute-1 nova_compute[192795]: 2025-09-30 21:42:39.772 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:42:41 compute-1 nova_compute[192795]: 2025-09-30 21:42:41.813 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:41 compute-1 nova_compute[192795]: 2025-09-30 21:42:41.814 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:41 compute-1 nova_compute[192795]: 2025-09-30 21:42:41.825 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:42:41 compute-1 nova_compute[192795]: 2025-09-30 21:42:41.826 2 INFO nova.compute.claims [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.025 2 DEBUG nova.compute.provider_tree [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.055 2 DEBUG nova.scheduler.client.report [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.087 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.088 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.201 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.201 2 DEBUG nova.network.neutron [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.250 2 INFO nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.276 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.455 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.457 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.458 2 INFO nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Creating image(s)
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.459 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.460 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.462 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.479 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.541 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.542 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.543 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.554 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.608 2 DEBUG nova.policy [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.646 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.647 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.690 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:42.690 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:42.691 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.691 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.692 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.784 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.785 2 DEBUG nova.virt.disk.api [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.785 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.852 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.853 2 DEBUG nova.virt.disk.api [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.853 2 DEBUG nova.objects.instance [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid 2384614e-09de-4607-9336-54877ec23545 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.888 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.889 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Ensure instance console log exists: /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.890 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.890 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:42 compute-1 nova_compute[192795]: 2025-09-30 21:42:42.890 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:43 compute-1 podman[242964]: 2025-09-30 21:42:43.230327893 +0000 UTC m=+0.069237422 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Sep 30 21:42:43 compute-1 podman[242965]: 2025-09-30 21:42:43.23988993 +0000 UTC m=+0.068041190 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:42:43 compute-1 podman[242966]: 2025-09-30 21:42:43.239945251 +0000 UTC m=+0.069140289 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:42:43 compute-1 nova_compute[192795]: 2025-09-30 21:42:43.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:42:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:42:45 compute-1 nova_compute[192795]: 2025-09-30 21:42:45.354 2 DEBUG nova.network.neutron [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Successfully created port: f62fc673-df33-48e7-96fb-bd4f42eca73a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.449 2 DEBUG nova.network.neutron [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Successfully updated port: f62fc673-df33-48e7-96fb-bd4f42eca73a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.473 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.473 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.474 2 DEBUG nova.network.neutron [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.563 2 DEBUG nova.compute.manager [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-changed-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.563 2 DEBUG nova.compute.manager [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Refreshing instance network info cache due to event network-changed-f62fc673-df33-48e7-96fb-bd4f42eca73a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.564 2 DEBUG oslo_concurrency.lockutils [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:47 compute-1 nova_compute[192795]: 2025-09-30 21:42:47.856 2 DEBUG nova.network.neutron [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:42:48 compute-1 nova_compute[192795]: 2025-09-30 21:42:48.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.036 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.037 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.062 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.146 2 DEBUG nova.network.neutron [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updating instance_info_cache with network_info: [{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.197 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.198 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Instance network_info: |[{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.199 2 DEBUG oslo_concurrency.lockutils [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.199 2 DEBUG nova.network.neutron [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Refreshing network info cache for port f62fc673-df33-48e7-96fb-bd4f42eca73a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.202 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Start _get_guest_xml network_info=[{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.208 2 WARNING nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.217 2 DEBUG nova.virt.libvirt.host [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.217 2 DEBUG nova.virt.libvirt.host [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.225 2 DEBUG nova.virt.libvirt.host [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.225 2 DEBUG nova.virt.libvirt.host [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.227 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.227 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.228 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.228 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.228 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.228 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.228 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.229 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.229 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.229 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.229 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.230 2 DEBUG nova.virt.hardware [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.233 2 DEBUG nova.virt.libvirt.vif [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:42:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2055139287',display_name='tempest-TestNetworkAdvancedServerOps-server-2055139287',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2055139287',id=140,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBehqnc6d89NlpWlFkGSrh3zkfip/+VLqupR92Yr35G7qE4Lwo0WTn/PbJgkZacVF8bBGZIqn0TojkTkbuYSZcYRnrojpJav/wMRzr1lB8gfPWFkn7iP96oX0K0AZMrrNg==',key_name='tempest-TestNetworkAdvancedServerOps-1291575231',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-ggzu7oo6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:42:42Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=2384614e-09de-4607-9336-54877ec23545,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.233 2 DEBUG nova.network.os_vif_util [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.234 2 DEBUG nova.network.os_vif_util [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.235 2 DEBUG nova.objects.instance [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2384614e-09de-4607-9336-54877ec23545 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.257 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.258 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.269 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.269 2 INFO nova.compute.claims [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.274 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <uuid>2384614e-09de-4607-9336-54877ec23545</uuid>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <name>instance-0000008c</name>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-2055139287</nova:name>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:42:50</nova:creationTime>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         <nova:port uuid="f62fc673-df33-48e7-96fb-bd4f42eca73a">
Sep 30 21:42:50 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <system>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <entry name="serial">2384614e-09de-4607-9336-54877ec23545</entry>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <entry name="uuid">2384614e-09de-4607-9336-54877ec23545</entry>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </system>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <os>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   </os>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <features>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   </features>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.config"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:c8:eb:10"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <target dev="tapf62fc673-df"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/console.log" append="off"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <video>
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </video>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:42:50 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:42:50 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:42:50 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:42:50 compute-1 nova_compute[192795]: </domain>
Sep 30 21:42:50 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.275 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Preparing to wait for external event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.276 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.276 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.276 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.277 2 DEBUG nova.virt.libvirt.vif [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:42:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2055139287',display_name='tempest-TestNetworkAdvancedServerOps-server-2055139287',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2055139287',id=140,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBehqnc6d89NlpWlFkGSrh3zkfip/+VLqupR92Yr35G7qE4Lwo0WTn/PbJgkZacVF8bBGZIqn0TojkTkbuYSZcYRnrojpJav/wMRzr1lB8gfPWFkn7iP96oX0K0AZMrrNg==',key_name='tempest-TestNetworkAdvancedServerOps-1291575231',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-ggzu7oo6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:42:42Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=2384614e-09de-4607-9336-54877ec23545,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.277 2 DEBUG nova.network.os_vif_util [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.278 2 DEBUG nova.network.os_vif_util [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.278 2 DEBUG os_vif [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.279 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.279 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.286 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf62fc673-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.287 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf62fc673-df, col_values=(('external_ids', {'iface-id': 'f62fc673-df33-48e7-96fb-bd4f42eca73a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:eb:10', 'vm-uuid': '2384614e-09de-4607-9336-54877ec23545'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:50 compute-1 NetworkManager[51724]: <info>  [1759268570.2920] manager: (tapf62fc673-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.300 2 INFO os_vif [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df')
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.401 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.402 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.402 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:c8:eb:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.403 2 INFO nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Using config drive
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.532 2 DEBUG nova.compute.provider_tree [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.548 2 DEBUG nova.scheduler.client.report [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.580 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.581 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.666 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.667 2 DEBUG nova.network.neutron [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.701 2 INFO nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:42:50 compute-1 nova_compute[192795]: 2025-09-30 21:42:50.779 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.003 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.004 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.004 2 INFO nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Creating image(s)
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.005 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "/var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.005 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.006 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.019 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.109 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.110 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.111 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.123 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.202 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.203 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.244 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.245 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.246 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.331 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.333 2 DEBUG nova.virt.disk.api [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Checking if we can resize image /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.333 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.403 2 DEBUG nova.policy [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27859618cb1d493cb1531af26b200b92', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '043721d1d0a2480fa785367fa56c1fa4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.413 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.413 2 DEBUG nova.virt.disk.api [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Cannot resize image /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.414 2 DEBUG nova.objects.instance [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'migration_context' on Instance uuid abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.461 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.462 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Ensure instance console log exists: /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.463 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.463 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.463 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:51.692 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.726 2 INFO nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Creating config drive at /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.config
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.734 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ikx_o2b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.885 2 DEBUG oslo_concurrency.processutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ikx_o2b" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:51 compute-1 kernel: tapf62fc673-df: entered promiscuous mode
Sep 30 21:42:51 compute-1 NetworkManager[51724]: <info>  [1759268571.9837] manager: (tapf62fc673-df): new Tun device (/org/freedesktop/NetworkManager/Devices/265)
Sep 30 21:42:51 compute-1 ovn_controller[94902]: 2025-09-30T21:42:51Z|00540|binding|INFO|Claiming lport f62fc673-df33-48e7-96fb-bd4f42eca73a for this chassis.
Sep 30 21:42:51 compute-1 ovn_controller[94902]: 2025-09-30T21:42:51Z|00541|binding|INFO|f62fc673-df33-48e7-96fb-bd4f42eca73a: Claiming fa:16:3e:c8:eb:10 10.100.0.5
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:51 compute-1 nova_compute[192795]: 2025-09-30 21:42:51.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.009 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:eb:10 10.100.0.5'], port_security=['fa:16:3e:c8:eb:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2384614e-09de-4607-9336-54877ec23545', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baf5ec1a-3e90-4e29-9575-409781d80e84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8f75955c-5947-4d54-b6d6-411d65ede0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50db7480-e4c4-40a4-9fa3-258b3c77b8ab, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f62fc673-df33-48e7-96fb-bd4f42eca73a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.010 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f62fc673-df33-48e7-96fb-bd4f42eca73a in datapath baf5ec1a-3e90-4e29-9575-409781d80e84 bound to our chassis
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.012 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network baf5ec1a-3e90-4e29-9575-409781d80e84
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.024 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e72be7df-fd33-4a53-9a9b-a981c82a021e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.025 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbaf5ec1a-31 in ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.027 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbaf5ec1a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.027 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[718620d9-a30b-4622-92f5-ad5dcb57631a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.028 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cc890155-9517-4ac6-b2f9-4f25864452fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 systemd-udevd[243058]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.048 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc49612-be1f-4954-81e2-04ac34e903ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 NetworkManager[51724]: <info>  [1759268572.0510] device (tapf62fc673-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:42:52 compute-1 NetworkManager[51724]: <info>  [1759268572.0528] device (tapf62fc673-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:52 compute-1 systemd-machined[152783]: New machine qemu-65-instance-0000008c.
Sep 30 21:42:52 compute-1 ovn_controller[94902]: 2025-09-30T21:42:52Z|00542|binding|INFO|Setting lport f62fc673-df33-48e7-96fb-bd4f42eca73a ovn-installed in OVS
Sep 30 21:42:52 compute-1 ovn_controller[94902]: 2025-09-30T21:42:52Z|00543|binding|INFO|Setting lport f62fc673-df33-48e7-96fb-bd4f42eca73a up in Southbound
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.091 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6d42413b-bd1e-4435-b7b3-54eeb877fa79]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 systemd[1]: Started Virtual Machine qemu-65-instance-0000008c.
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.133 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[28d3be06-ac13-4fc0-aaba-7701f4693915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.149 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[818aaaf0-38b4-4d63-baee-c38b89260a14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 NetworkManager[51724]: <info>  [1759268572.1507] manager: (tapbaf5ec1a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/266)
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.195 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0ffda7ee-6e44-4e99-973c-f670f93282d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.199 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d28b85-8524-478a-8c9b-80e4cc1b3eaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 NetworkManager[51724]: <info>  [1759268572.2337] device (tapbaf5ec1a-30): carrier: link connected
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.245 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3555043f-dc94-44ec-abf6-d80f18598026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.270 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b92e04f5-774e-4f76-b78a-3ce2fbab0dd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaf5ec1a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:68:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526790, 'reachable_time': 31024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243092, 'error': None, 'target': 'ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.300 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e7073388-fc48-446e-9279-0d3cab1749bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:681e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526790, 'tstamp': 526790}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243093, 'error': None, 'target': 'ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.331 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8e339eb5-b245-404b-8cf3-bf30f2c4cb18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbaf5ec1a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:68:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526790, 'reachable_time': 31024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243094, 'error': None, 'target': 'ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.389 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[85df70f2-ffd5-46fd-bed1-696dc7892375]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.500 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[393de949-3435-4aba-845f-9becce8e2230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.503 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaf5ec1a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.504 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.505 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbaf5ec1a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:52 compute-1 NetworkManager[51724]: <info>  [1759268572.5090] manager: (tapbaf5ec1a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Sep 30 21:42:52 compute-1 kernel: tapbaf5ec1a-30: entered promiscuous mode
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.517 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbaf5ec1a-30, col_values=(('external_ids', {'iface-id': '025745f8-ca1e-4afd-a51d-a630718f9ff8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:52 compute-1 ovn_controller[94902]: 2025-09-30T21:42:52Z|00544|binding|INFO|Releasing lport 025745f8-ca1e-4afd-a51d-a630718f9ff8 from this chassis (sb_readonly=0)
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.548 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/baf5ec1a-3e90-4e29-9575-409781d80e84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/baf5ec1a-3e90-4e29-9575-409781d80e84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.549 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[db9b060e-a978-445b-85a9-1eaf16b9b03f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.550 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-baf5ec1a-3e90-4e29-9575-409781d80e84
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/baf5ec1a-3e90-4e29-9575-409781d80e84.pid.haproxy
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID baf5ec1a-3e90-4e29-9575-409781d80e84
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:42:52 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:42:52.551 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84', 'env', 'PROCESS_TAG=haproxy-baf5ec1a-3e90-4e29-9575-409781d80e84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/baf5ec1a-3e90-4e29-9575-409781d80e84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:42:52 compute-1 podman[243133]: 2025-09-30 21:42:52.979197128 +0000 UTC m=+0.058531135 container create c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.980 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268572.9800928, 2384614e-09de-4607-9336-54877ec23545 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:42:52 compute-1 nova_compute[192795]: 2025-09-30 21:42:52.981 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] VM Started (Lifecycle Event)
Sep 30 21:42:53 compute-1 systemd[1]: Started libpod-conmon-c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41.scope.
Sep 30 21:42:53 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.030 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.038 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268572.9802706, 2384614e-09de-4607-9336-54877ec23545 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.038 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] VM Paused (Lifecycle Event)
Sep 30 21:42:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a210be76288a1125e9ce4844e2f5f07b6d2a15c97f7a9dbb7dce8373384970b7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:42:53 compute-1 podman[243133]: 2025-09-30 21:42:52.949310374 +0000 UTC m=+0.028644381 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:42:53 compute-1 podman[243133]: 2025-09-30 21:42:53.0515057 +0000 UTC m=+0.130839687 container init c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:42:53 compute-1 podman[243133]: 2025-09-30 21:42:53.058333494 +0000 UTC m=+0.137667481 container start c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.059 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.066 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:42:53 compute-1 neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84[243149]: [NOTICE]   (243168) : New worker (243175) forked
Sep 30 21:42:53 compute-1 neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84[243149]: [NOTICE]   (243168) : Loading success.
Sep 30 21:42:53 compute-1 podman[243146]: 2025-09-30 21:42:53.091584868 +0000 UTC m=+0.069975902 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.092 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.730 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.730 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.806 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.855 2 DEBUG nova.network.neutron [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updated VIF entry in instance network info cache for port f62fc673-df33-48e7-96fb-bd4f42eca73a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.857 2 DEBUG nova.network.neutron [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updating instance_info_cache with network_info: [{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.870 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.871 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.904 2 DEBUG oslo_concurrency.lockutils [req-11f0c020-29b3-4e44-b523-d5604726ad3b req-321d515f-f3af-479a-b809-fb95787bb602 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:42:53 compute-1 nova_compute[192795]: 2025-09-30 21:42:53.971 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.135 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.136 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5579MB free_disk=73.31567764282227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.136 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.136 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.220 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 2384614e-09de-4607-9336-54877ec23545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.221 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.221 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.221 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.268 2 DEBUG nova.network.neutron [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Successfully created port: 4408871f-a0c1-4b9e-85e2-631f65861d3f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.308 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.324 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.361 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:42:54 compute-1 nova_compute[192795]: 2025-09-30 21:42:54.361 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:55 compute-1 nova_compute[192795]: 2025-09-30 21:42:55.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:55 compute-1 nova_compute[192795]: 2025-09-30 21:42:55.357 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:56 compute-1 nova_compute[192795]: 2025-09-30 21:42:56.707 2 DEBUG nova.network.neutron [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Successfully updated port: 4408871f-a0c1-4b9e-85e2-631f65861d3f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:42:56 compute-1 nova_compute[192795]: 2025-09-30 21:42:56.821 2 DEBUG nova.compute.manager [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-changed-4408871f-a0c1-4b9e-85e2-631f65861d3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:56 compute-1 nova_compute[192795]: 2025-09-30 21:42:56.822 2 DEBUG nova.compute.manager [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Refreshing instance network info cache due to event network-changed-4408871f-a0c1-4b9e-85e2-631f65861d3f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:42:56 compute-1 nova_compute[192795]: 2025-09-30 21:42:56.822 2 DEBUG oslo_concurrency.lockutils [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:42:56 compute-1 nova_compute[192795]: 2025-09-30 21:42:56.822 2 DEBUG oslo_concurrency.lockutils [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:42:56 compute-1 nova_compute[192795]: 2025-09-30 21:42:56.822 2 DEBUG nova.network.neutron [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Refreshing network info cache for port 4408871f-a0c1-4b9e-85e2-631f65861d3f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:42:56 compute-1 nova_compute[192795]: 2025-09-30 21:42:56.828 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.024 2 DEBUG nova.network.neutron [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.087 2 DEBUG nova.compute.manager [req-5cfaaaeb-3032-4f03-bb5b-838f2088c575 req-61dc9be8-73eb-45e7-b176-672540678e6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.088 2 DEBUG oslo_concurrency.lockutils [req-5cfaaaeb-3032-4f03-bb5b-838f2088c575 req-61dc9be8-73eb-45e7-b176-672540678e6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.088 2 DEBUG oslo_concurrency.lockutils [req-5cfaaaeb-3032-4f03-bb5b-838f2088c575 req-61dc9be8-73eb-45e7-b176-672540678e6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.088 2 DEBUG oslo_concurrency.lockutils [req-5cfaaaeb-3032-4f03-bb5b-838f2088c575 req-61dc9be8-73eb-45e7-b176-672540678e6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.088 2 DEBUG nova.compute.manager [req-5cfaaaeb-3032-4f03-bb5b-838f2088c575 req-61dc9be8-73eb-45e7-b176-672540678e6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Processing event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.089 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.093 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268577.0928924, 2384614e-09de-4607-9336-54877ec23545 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.093 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] VM Resumed (Lifecycle Event)
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.096 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.099 2 INFO nova.virt.libvirt.driver [-] [instance: 2384614e-09de-4607-9336-54877ec23545] Instance spawned successfully.
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.099 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.145 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.150 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.150 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.150 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.151 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.151 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.152 2 DEBUG nova.virt.libvirt.driver [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.156 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.204 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.237 2 INFO nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Took 14.78 seconds to spawn the instance on the hypervisor.
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.238 2 DEBUG nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.356 2 INFO nova.compute.manager [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Took 15.63 seconds to build instance.
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.390 2 DEBUG oslo_concurrency.lockutils [None req-4f7dfb78-ecd6-46ef-9561-3b22ab26cbd7 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.563 2 DEBUG nova.network.neutron [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.603 2 DEBUG oslo_concurrency.lockutils [req-26b73925-2271-47a1-9b7e-17349e965e6c req-846401bb-f7f9-4326-b175-cbecca95fc2a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.604 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquired lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.604 2 DEBUG nova.network.neutron [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:57 compute-1 nova_compute[192795]: 2025-09-30 21:42:57.888 2 DEBUG nova.network.neutron [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:42:58 compute-1 nova_compute[192795]: 2025-09-30 21:42:58.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:58 compute-1 nova_compute[192795]: 2025-09-30 21:42:58.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:58 compute-1 nova_compute[192795]: 2025-09-30 21:42:58.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:42:59 compute-1 nova_compute[192795]: 2025-09-30 21:42:59.219 2 DEBUG nova.compute.manager [req-709dd5aa-ff64-4aff-a205-2b5e3b068876 req-98652cb0-53c0-4913-bd12-b6e8635c5af5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:59 compute-1 nova_compute[192795]: 2025-09-30 21:42:59.219 2 DEBUG oslo_concurrency.lockutils [req-709dd5aa-ff64-4aff-a205-2b5e3b068876 req-98652cb0-53c0-4913-bd12-b6e8635c5af5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:59 compute-1 nova_compute[192795]: 2025-09-30 21:42:59.220 2 DEBUG oslo_concurrency.lockutils [req-709dd5aa-ff64-4aff-a205-2b5e3b068876 req-98652cb0-53c0-4913-bd12-b6e8635c5af5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:59 compute-1 nova_compute[192795]: 2025-09-30 21:42:59.220 2 DEBUG oslo_concurrency.lockutils [req-709dd5aa-ff64-4aff-a205-2b5e3b068876 req-98652cb0-53c0-4913-bd12-b6e8635c5af5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:59 compute-1 nova_compute[192795]: 2025-09-30 21:42:59.220 2 DEBUG nova.compute.manager [req-709dd5aa-ff64-4aff-a205-2b5e3b068876 req-98652cb0-53c0-4913-bd12-b6e8635c5af5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] No waiting events found dispatching network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:42:59 compute-1 nova_compute[192795]: 2025-09-30 21:42:59.220 2 WARNING nova.compute.manager [req-709dd5aa-ff64-4aff-a205-2b5e3b068876 req-98652cb0-53c0-4913-bd12-b6e8635c5af5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received unexpected event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a for instance with vm_state active and task_state None.
Sep 30 21:42:59 compute-1 nova_compute[192795]: 2025-09-30 21:42:59.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:00 compute-1 unix_chkpwd[243194]: password check failed for user (root)
Sep 30 21:43:00 compute-1 sshd-session[243192]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116  user=root
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.437 2 DEBUG nova.network.neutron [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Updating instance_info_cache with network_info: [{"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.470 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Releasing lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.471 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Instance network_info: |[{"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.474 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Start _get_guest_xml network_info=[{"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.479 2 WARNING nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.490 2 DEBUG nova.virt.libvirt.host [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.492 2 DEBUG nova.virt.libvirt.host [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.501 2 DEBUG nova.virt.libvirt.host [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.503 2 DEBUG nova.virt.libvirt.host [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.504 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.505 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.505 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.506 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.506 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.506 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.507 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.507 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.507 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.507 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.508 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.508 2 DEBUG nova.virt.hardware [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.515 2 DEBUG nova.virt.libvirt.vif [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:42:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-269626595',display_name='tempest-TestNetworkBasicOps-server-269626595',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-269626595',id=141,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPBJwE9lrqDVozBxDct8rHGsLRnwscgct3aneZTZ5QtwO9hhgIC8XlalzJL9uoPkBn0lbZqAsfo0YUpHMuqMbRlPqmYTV0ypgdGX0DmtQYVI/XI4mn4siubboDD7J464Yg==',key_name='tempest-TestNetworkBasicOps-148598438',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-aitxybp5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:42:50Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.516 2 DEBUG nova.network.os_vif_util [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.516 2 DEBUG nova.network.os_vif_util [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:8b:7d,bridge_name='br-int',has_traffic_filtering=True,id=4408871f-a0c1-4b9e-85e2-631f65861d3f,network=Network(bf75606a-b15b-463d-a313-4ac836e894cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4408871f-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.518 2 DEBUG nova.objects.instance [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'pci_devices' on Instance uuid abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.542 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <uuid>abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2</uuid>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <name>instance-0000008d</name>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkBasicOps-server-269626595</nova:name>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:43:00</nova:creationTime>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:user uuid="27859618cb1d493cb1531af26b200b92">tempest-TestNetworkBasicOps-2126023928-project-member</nova:user>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:project uuid="043721d1d0a2480fa785367fa56c1fa4">tempest-TestNetworkBasicOps-2126023928</nova:project>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         <nova:port uuid="4408871f-a0c1-4b9e-85e2-631f65861d3f">
Sep 30 21:43:00 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <system>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <entry name="serial">abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2</entry>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <entry name="uuid">abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2</entry>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </system>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <os>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   </os>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <features>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   </features>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk.config"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:2e:8b:7d"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <target dev="tap4408871f-a0"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/console.log" append="off"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <video>
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </video>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:43:00 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:43:00 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:43:00 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:43:00 compute-1 nova_compute[192795]: </domain>
Sep 30 21:43:00 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.544 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Preparing to wait for external event network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.544 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.544 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.545 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.546 2 DEBUG nova.virt.libvirt.vif [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:42:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-269626595',display_name='tempest-TestNetworkBasicOps-server-269626595',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-269626595',id=141,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPBJwE9lrqDVozBxDct8rHGsLRnwscgct3aneZTZ5QtwO9hhgIC8XlalzJL9uoPkBn0lbZqAsfo0YUpHMuqMbRlPqmYTV0ypgdGX0DmtQYVI/XI4mn4siubboDD7J464Yg==',key_name='tempest-TestNetworkBasicOps-148598438',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-aitxybp5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:42:50Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.546 2 DEBUG nova.network.os_vif_util [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.548 2 DEBUG nova.network.os_vif_util [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:8b:7d,bridge_name='br-int',has_traffic_filtering=True,id=4408871f-a0c1-4b9e-85e2-631f65861d3f,network=Network(bf75606a-b15b-463d-a313-4ac836e894cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4408871f-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.550 2 DEBUG os_vif [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:8b:7d,bridge_name='br-int',has_traffic_filtering=True,id=4408871f-a0c1-4b9e-85e2-631f65861d3f,network=Network(bf75606a-b15b-463d-a313-4ac836e894cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4408871f-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.551 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.551 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.556 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4408871f-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.557 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4408871f-a0, col_values=(('external_ids', {'iface-id': '4408871f-a0c1-4b9e-85e2-631f65861d3f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:8b:7d', 'vm-uuid': 'abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:00 compute-1 NetworkManager[51724]: <info>  [1759268580.5605] manager: (tap4408871f-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.575 2 INFO os_vif [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:8b:7d,bridge_name='br-int',has_traffic_filtering=True,id=4408871f-a0c1-4b9e-85e2-631f65861d3f,network=Network(bf75606a-b15b-463d-a313-4ac836e894cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4408871f-a0')
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.634 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.634 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.635 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No VIF found with MAC fa:16:3e:2e:8b:7d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:43:00 compute-1 nova_compute[192795]: 2025-09-30 21:43:00.635 2 INFO nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Using config drive
Sep 30 21:43:01 compute-1 nova_compute[192795]: 2025-09-30 21:43:01.666 2 INFO nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Creating config drive at /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk.config
Sep 30 21:43:01 compute-1 nova_compute[192795]: 2025-09-30 21:43:01.674 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnfwa4fqu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:01 compute-1 nova_compute[192795]: 2025-09-30 21:43:01.698 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:01 compute-1 nova_compute[192795]: 2025-09-30 21:43:01.805 2 DEBUG oslo_concurrency.processutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnfwa4fqu" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:01 compute-1 kernel: tap4408871f-a0: entered promiscuous mode
Sep 30 21:43:01 compute-1 nova_compute[192795]: 2025-09-30 21:43:01.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:01 compute-1 NetworkManager[51724]: <info>  [1759268581.9502] manager: (tap4408871f-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/269)
Sep 30 21:43:01 compute-1 ovn_controller[94902]: 2025-09-30T21:43:01Z|00545|binding|INFO|Claiming lport 4408871f-a0c1-4b9e-85e2-631f65861d3f for this chassis.
Sep 30 21:43:01 compute-1 ovn_controller[94902]: 2025-09-30T21:43:01Z|00546|binding|INFO|4408871f-a0c1-4b9e-85e2-631f65861d3f: Claiming fa:16:3e:2e:8b:7d 10.100.0.3
Sep 30 21:43:01 compute-1 nova_compute[192795]: 2025-09-30 21:43:01.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:01.965 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:8b:7d 10.100.0.3'], port_security=['fa:16:3e:2e:8b:7d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf75606a-b15b-463d-a313-4ac836e894cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '276c5ccb-38f7-4177-b9ec-e2540af07d07', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e607f567-3922-431a-b906-15ee0ad56230, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=4408871f-a0c1-4b9e-85e2-631f65861d3f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:01.968 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 4408871f-a0c1-4b9e-85e2-631f65861d3f in datapath bf75606a-b15b-463d-a313-4ac836e894cf bound to our chassis
Sep 30 21:43:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:01.970 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bf75606a-b15b-463d-a313-4ac836e894cf
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:01.998 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[53c27266-cf13-4d19-a0aa-5dab686886f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.000 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbf75606a-b1 in ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.006 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbf75606a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.006 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5d831830-7a08-422b-8780-72214c19516b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.007 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec547f4-41c5-483a-b947-6bee96b5b2d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_controller[94902]: 2025-09-30T21:43:02Z|00547|binding|INFO|Setting lport 4408871f-a0c1-4b9e-85e2-631f65861d3f ovn-installed in OVS
Sep 30 21:43:02 compute-1 ovn_controller[94902]: 2025-09-30T21:43:02Z|00548|binding|INFO|Setting lport 4408871f-a0c1-4b9e-85e2-631f65861d3f up in Southbound
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 systemd-udevd[243260]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.026 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[46f13fd2-ea3b-4362-9e0b-cd0e366dcbee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 systemd-machined[152783]: New machine qemu-66-instance-0000008d.
Sep 30 21:43:02 compute-1 podman[243209]: 2025-09-30 21:43:02.033439431 +0000 UTC m=+0.110416427 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:43:02 compute-1 systemd[1]: Started Virtual Machine qemu-66-instance-0000008d.
Sep 30 21:43:02 compute-1 NetworkManager[51724]: <info>  [1759268582.0406] device (tap4408871f-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:43:02 compute-1 NetworkManager[51724]: <info>  [1759268582.0418] device (tap4408871f-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:43:02 compute-1 podman[243207]: 2025-09-30 21:43:02.056332757 +0000 UTC m=+0.136160051 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.build-date=20250923)
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.059 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[14a2418a-5270-43d8-a371-f0edcbe138e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 podman[243208]: 2025-09-30 21:43:02.085605224 +0000 UTC m=+0.163962787 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible)
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.107 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7384149b-6327-4fde-a06e-a48f3b5d71ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 NetworkManager[51724]: <info>  [1759268582.1160] manager: (tapbf75606a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/270)
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.115 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7f9f13-2922-4c30-b36b-8c66f116cefd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.153 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5122c2b3-8b60-4c92-9062-6632b0e03b69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.157 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc25e23-ac89-47d5-8c28-ac3c843d45a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 NetworkManager[51724]: <info>  [1759268582.1848] device (tapbf75606a-b0): carrier: link connected
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.192 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c370e87c-03af-452e-95e9-871bf51ac1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.212 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e90d7912-7460-4651-bf37-d5c4770ba24b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbf75606a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:07:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527785, 'reachable_time': 43984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243316, 'error': None, 'target': 'ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.240 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[98a5337a-f386-44ba-8268-ed3df2cf6187]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9a:712'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527785, 'tstamp': 527785}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243317, 'error': None, 'target': 'ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.263 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b762e47c-d90c-4004-9ba6-0154941fc6be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbf75606a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:07:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527785, 'reachable_time': 43984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243318, 'error': None, 'target': 'ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 sshd-session[243192]: Failed password for root from 80.94.95.116 port 35426 ssh2
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.309 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c1023828-46e2-4ab1-9736-600579ca45b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 NetworkManager[51724]: <info>  [1759268582.3301] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Sep 30 21:43:02 compute-1 NetworkManager[51724]: <info>  [1759268582.3308] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.389 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cb85f080-90c6-476c-995c-b012179fcfae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.391 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf75606a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.391 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.392 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf75606a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:02 compute-1 NetworkManager[51724]: <info>  [1759268582.3946] manager: (tapbf75606a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.413 2 DEBUG nova.compute.manager [req-fb085448-a6a9-40b9-b216-a5895d856a97 req-18ba707a-7285-4b56-9304-ee442f78c79c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.414 2 DEBUG oslo_concurrency.lockutils [req-fb085448-a6a9-40b9-b216-a5895d856a97 req-18ba707a-7285-4b56-9304-ee442f78c79c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.414 2 DEBUG oslo_concurrency.lockutils [req-fb085448-a6a9-40b9-b216-a5895d856a97 req-18ba707a-7285-4b56-9304-ee442f78c79c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.414 2 DEBUG oslo_concurrency.lockutils [req-fb085448-a6a9-40b9-b216-a5895d856a97 req-18ba707a-7285-4b56-9304-ee442f78c79c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.415 2 DEBUG nova.compute.manager [req-fb085448-a6a9-40b9-b216-a5895d856a97 req-18ba707a-7285-4b56-9304-ee442f78c79c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Processing event network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:43:02 compute-1 kernel: tapbf75606a-b0: entered promiscuous mode
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.464 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbf75606a-b0, col_values=(('external_ids', {'iface-id': '8e8a51cc-4f03-40bb-a3ee-964e48da3ea6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.468 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bf75606a-b15b-463d-a313-4ac836e894cf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bf75606a-b15b-463d-a313-4ac836e894cf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.468 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c3afc780-9ad7-42bb-a836-254f024c5470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.469 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-bf75606a-b15b-463d-a313-4ac836e894cf
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/bf75606a-b15b-463d-a313-4ac836e894cf.pid.haproxy
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID bf75606a-b15b-463d-a313-4ac836e894cf
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:43:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:02.471 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf', 'env', 'PROCESS_TAG=haproxy-bf75606a-b15b-463d-a313-4ac836e894cf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bf75606a-b15b-463d-a313-4ac836e894cf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:43:02 compute-1 ovn_controller[94902]: 2025-09-30T21:43:02Z|00549|binding|INFO|Releasing lport 8e8a51cc-4f03-40bb-a3ee-964e48da3ea6 from this chassis (sb_readonly=0)
Sep 30 21:43:02 compute-1 ovn_controller[94902]: 2025-09-30T21:43:02Z|00550|binding|INFO|Releasing lport 025745f8-ca1e-4afd-a51d-a630718f9ff8 from this chassis (sb_readonly=0)
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 nova_compute[192795]: 2025-09-30 21:43:02.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:02 compute-1 podman[243357]: 2025-09-30 21:43:02.889541081 +0000 UTC m=+0.064121465 container create 423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:43:02 compute-1 systemd[1]: Started libpod-conmon-423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128.scope.
Sep 30 21:43:02 compute-1 podman[243357]: 2025-09-30 21:43:02.856904193 +0000 UTC m=+0.031484617 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:43:02 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:43:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ea76e0b16c5ecc5a000f268249515ef47d08706c30941f590a9de3c1fd9abcc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:43:02 compute-1 podman[243357]: 2025-09-30 21:43:02.995672874 +0000 UTC m=+0.170253288 container init 423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:43:03 compute-1 podman[243357]: 2025-09-30 21:43:03.004209403 +0000 UTC m=+0.178789797 container start 423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:43:03 compute-1 neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf[243372]: [NOTICE]   (243376) : New worker (243378) forked
Sep 30 21:43:03 compute-1 neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf[243372]: [NOTICE]   (243376) : Loading success.
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.151 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268583.1504173, abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.152 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] VM Started (Lifecycle Event)
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.155 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.163 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.167 2 INFO nova.virt.libvirt.driver [-] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Instance spawned successfully.
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.168 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:03 compute-1 sshd-session[243192]: Connection closed by authenticating user root 80.94.95.116 port 35426 [preauth]
Sep 30 21:43:03 compute-1 nova_compute[192795]: 2025-09-30 21:43:03.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.355 2 DEBUG nova.compute.manager [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-changed-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.356 2 DEBUG nova.compute.manager [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Refreshing instance network info cache due to event network-changed-f62fc673-df33-48e7-96fb-bd4f42eca73a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.357 2 DEBUG oslo_concurrency.lockutils [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.358 2 DEBUG oslo_concurrency.lockutils [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.358 2 DEBUG nova.network.neutron [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Refreshing network info cache for port f62fc673-df33-48e7-96fb-bd4f42eca73a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.372 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.381 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.393 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.394 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.395 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.396 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.396 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.397 2 DEBUG nova.virt.libvirt.driver [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.434 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.435 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268583.1515806, abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.435 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] VM Paused (Lifecycle Event)
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.490 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.495 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268583.1579635, abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.496 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] VM Resumed (Lifecycle Event)
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.533 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.538 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.574 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.643 2 INFO nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Took 13.64 seconds to spawn the instance on the hypervisor.
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.643 2 DEBUG nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.868 2 INFO nova.compute.manager [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Took 14.70 seconds to build instance.
Sep 30 21:43:04 compute-1 nova_compute[192795]: 2025-09-30 21:43:04.907 2 DEBUG oslo_concurrency.lockutils [None req-c9953bf1-4479-46c2-988a-11648f8868b2 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:05 compute-1 nova_compute[192795]: 2025-09-30 21:43:05.349 2 DEBUG nova.compute.manager [req-6891ad55-6447-4b67-b804-4eeea22893ca req-352718f6-42b4-4d35-9553-c657e9bcf862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:05 compute-1 nova_compute[192795]: 2025-09-30 21:43:05.350 2 DEBUG oslo_concurrency.lockutils [req-6891ad55-6447-4b67-b804-4eeea22893ca req-352718f6-42b4-4d35-9553-c657e9bcf862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:05 compute-1 nova_compute[192795]: 2025-09-30 21:43:05.351 2 DEBUG oslo_concurrency.lockutils [req-6891ad55-6447-4b67-b804-4eeea22893ca req-352718f6-42b4-4d35-9553-c657e9bcf862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:05 compute-1 nova_compute[192795]: 2025-09-30 21:43:05.351 2 DEBUG oslo_concurrency.lockutils [req-6891ad55-6447-4b67-b804-4eeea22893ca req-352718f6-42b4-4d35-9553-c657e9bcf862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:05 compute-1 nova_compute[192795]: 2025-09-30 21:43:05.351 2 DEBUG nova.compute.manager [req-6891ad55-6447-4b67-b804-4eeea22893ca req-352718f6-42b4-4d35-9553-c657e9bcf862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] No waiting events found dispatching network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:05 compute-1 nova_compute[192795]: 2025-09-30 21:43:05.352 2 WARNING nova.compute.manager [req-6891ad55-6447-4b67-b804-4eeea22893ca req-352718f6-42b4-4d35-9553-c657e9bcf862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received unexpected event network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f for instance with vm_state active and task_state None.
Sep 30 21:43:05 compute-1 nova_compute[192795]: 2025-09-30 21:43:05.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:06 compute-1 nova_compute[192795]: 2025-09-30 21:43:06.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:06 compute-1 nova_compute[192795]: 2025-09-30 21:43:06.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:43:06 compute-1 nova_compute[192795]: 2025-09-30 21:43:06.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:43:07 compute-1 podman[243387]: 2025-09-30 21:43:07.260048474 +0000 UTC m=+0.090250056 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:43:07 compute-1 nova_compute[192795]: 2025-09-30 21:43:07.617 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:07 compute-1 nova_compute[192795]: 2025-09-30 21:43:07.724 2 DEBUG nova.network.neutron [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updated VIF entry in instance network info cache for port f62fc673-df33-48e7-96fb-bd4f42eca73a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:07 compute-1 nova_compute[192795]: 2025-09-30 21:43:07.725 2 DEBUG nova.network.neutron [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updating instance_info_cache with network_info: [{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:07 compute-1 nova_compute[192795]: 2025-09-30 21:43:07.764 2 DEBUG oslo_concurrency.lockutils [req-81f2fe16-9a0b-452a-814d-21cfd5966fb6 req-9ba532f6-e656-4ce9-8b83-abb65a4c9ef5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:07 compute-1 nova_compute[192795]: 2025-09-30 21:43:07.765 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:07 compute-1 nova_compute[192795]: 2025-09-30 21:43:07.766 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:43:07 compute-1 nova_compute[192795]: 2025-09-30 21:43:07.766 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2384614e-09de-4607-9336-54877ec23545 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:08 compute-1 nova_compute[192795]: 2025-09-30 21:43:08.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:10 compute-1 ovn_controller[94902]: 2025-09-30T21:43:10Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:eb:10 10.100.0.5
Sep 30 21:43:10 compute-1 ovn_controller[94902]: 2025-09-30T21:43:10Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:eb:10 10.100.0.5
Sep 30 21:43:10 compute-1 nova_compute[192795]: 2025-09-30 21:43:10.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:10 compute-1 nova_compute[192795]: 2025-09-30 21:43:10.870 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Updating instance_info_cache with network_info: [{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:10 compute-1 nova_compute[192795]: 2025-09-30 21:43:10.967 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:10 compute-1 nova_compute[192795]: 2025-09-30 21:43:10.968 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:43:10 compute-1 nova_compute[192795]: 2025-09-30 21:43:10.969 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:11 compute-1 nova_compute[192795]: 2025-09-30 21:43:11.378 2 DEBUG nova.compute.manager [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-changed-4408871f-a0c1-4b9e-85e2-631f65861d3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:11 compute-1 nova_compute[192795]: 2025-09-30 21:43:11.378 2 DEBUG nova.compute.manager [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Refreshing instance network info cache due to event network-changed-4408871f-a0c1-4b9e-85e2-631f65861d3f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:11 compute-1 nova_compute[192795]: 2025-09-30 21:43:11.379 2 DEBUG oslo_concurrency.lockutils [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:11 compute-1 nova_compute[192795]: 2025-09-30 21:43:11.379 2 DEBUG oslo_concurrency.lockutils [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:11 compute-1 nova_compute[192795]: 2025-09-30 21:43:11.379 2 DEBUG nova.network.neutron [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Refreshing network info cache for port 4408871f-a0c1-4b9e-85e2-631f65861d3f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:13 compute-1 nova_compute[192795]: 2025-09-30 21:43:13.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:14 compute-1 podman[243423]: 2025-09-30 21:43:14.220151086 +0000 UTC m=+0.052913773 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:43:14 compute-1 podman[243424]: 2025-09-30 21:43:14.22809184 +0000 UTC m=+0.054671541 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 21:43:14 compute-1 podman[243422]: 2025-09-30 21:43:14.234753169 +0000 UTC m=+0.072748036 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Sep 30 21:43:15 compute-1 nova_compute[192795]: 2025-09-30 21:43:15.339 2 DEBUG nova.network.neutron [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Updated VIF entry in instance network info cache for port 4408871f-a0c1-4b9e-85e2-631f65861d3f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:15 compute-1 nova_compute[192795]: 2025-09-30 21:43:15.341 2 DEBUG nova.network.neutron [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Updating instance_info_cache with network_info: [{"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:15 compute-1 nova_compute[192795]: 2025-09-30 21:43:15.489 2 DEBUG oslo_concurrency.lockutils [req-d676845a-e4ac-4256-ae2d-1c8b0bd3a2cb req-e959d374-8029-47fa-87bb-eda062a9dd43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:15 compute-1 nova_compute[192795]: 2025-09-30 21:43:15.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:15 compute-1 ovn_controller[94902]: 2025-09-30T21:43:15Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:8b:7d 10.100.0.3
Sep 30 21:43:15 compute-1 ovn_controller[94902]: 2025-09-30T21:43:15Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:8b:7d 10.100.0.3
Sep 30 21:43:16 compute-1 nova_compute[192795]: 2025-09-30 21:43:16.223 2 INFO nova.compute.manager [None req-9625992e-62cd-4cea-bc39-489e687ff827 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Get console output
Sep 30 21:43:16 compute-1 nova_compute[192795]: 2025-09-30 21:43:16.229 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:43:18 compute-1 nova_compute[192795]: 2025-09-30 21:43:18.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:19 compute-1 nova_compute[192795]: 2025-09-30 21:43:19.291 2 INFO nova.compute.manager [None req-cbb6e333-9c42-4632-b7eb-f2db5d3e058b 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Get console output
Sep 30 21:43:19 compute-1 nova_compute[192795]: 2025-09-30 21:43:19.297 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:43:20 compute-1 nova_compute[192795]: 2025-09-30 21:43:20.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:21 compute-1 nova_compute[192795]: 2025-09-30 21:43:21.842 2 INFO nova.compute.manager [None req-b41ed9e0-e44d-49a3-b3a2-f4bcb228f493 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Get console output
Sep 30 21:43:21 compute-1 nova_compute[192795]: 2025-09-30 21:43:21.850 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.215 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.216 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.217 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.217 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.218 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.280 2 INFO nova.compute.manager [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Terminating instance
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.318 2 DEBUG nova.compute.manager [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:43:22 compute-1 kernel: tap4408871f-a0 (unregistering): left promiscuous mode
Sep 30 21:43:22 compute-1 NetworkManager[51724]: <info>  [1759268602.3548] device (tap4408871f-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:43:22 compute-1 ovn_controller[94902]: 2025-09-30T21:43:22Z|00551|binding|INFO|Releasing lport 4408871f-a0c1-4b9e-85e2-631f65861d3f from this chassis (sb_readonly=0)
Sep 30 21:43:22 compute-1 ovn_controller[94902]: 2025-09-30T21:43:22Z|00552|binding|INFO|Setting lport 4408871f-a0c1-4b9e-85e2-631f65861d3f down in Southbound
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 ovn_controller[94902]: 2025-09-30T21:43:22Z|00553|binding|INFO|Removing iface tap4408871f-a0 ovn-installed in OVS
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.409 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:8b:7d 10.100.0.3'], port_security=['fa:16:3e:2e:8b:7d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf75606a-b15b-463d-a313-4ac836e894cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '276c5ccb-38f7-4177-b9ec-e2540af07d07', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e607f567-3922-431a-b906-15ee0ad56230, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=4408871f-a0c1-4b9e-85e2-631f65861d3f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.410 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 4408871f-a0c1-4b9e-85e2-631f65861d3f in datapath bf75606a-b15b-463d-a313-4ac836e894cf unbound from our chassis
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.413 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bf75606a-b15b-463d-a313-4ac836e894cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.414 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb53d3b-7926-49d0-b5fb-3284ecd34380]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.414 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf namespace which is not needed anymore
Sep 30 21:43:22 compute-1 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008d.scope: Deactivated successfully.
Sep 30 21:43:22 compute-1 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008d.scope: Consumed 14.013s CPU time.
Sep 30 21:43:22 compute-1 systemd-machined[152783]: Machine qemu-66-instance-0000008d terminated.
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf[243372]: [NOTICE]   (243376) : haproxy version is 2.8.14-c23fe91
Sep 30 21:43:22 compute-1 neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf[243372]: [NOTICE]   (243376) : path to executable is /usr/sbin/haproxy
Sep 30 21:43:22 compute-1 neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf[243372]: [WARNING]  (243376) : Exiting Master process...
Sep 30 21:43:22 compute-1 neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf[243372]: [ALERT]    (243376) : Current worker (243378) exited with code 143 (Terminated)
Sep 30 21:43:22 compute-1 neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf[243372]: [WARNING]  (243376) : All workers exited. Exiting... (0)
Sep 30 21:43:22 compute-1 systemd[1]: libpod-423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128.scope: Deactivated successfully.
Sep 30 21:43:22 compute-1 podman[243519]: 2025-09-30 21:43:22.600444709 +0000 UTC m=+0.063601161 container died 423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.600 2 INFO nova.virt.libvirt.driver [-] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Instance destroyed successfully.
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.601 2 DEBUG nova.objects.instance [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'resources' on Instance uuid abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.640 2 DEBUG nova.virt.libvirt.vif [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:42:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-269626595',display_name='tempest-TestNetworkBasicOps-server-269626595',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-269626595',id=141,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPBJwE9lrqDVozBxDct8rHGsLRnwscgct3aneZTZ5QtwO9hhgIC8XlalzJL9uoPkBn0lbZqAsfo0YUpHMuqMbRlPqmYTV0ypgdGX0DmtQYVI/XI4mn4siubboDD7J464Yg==',key_name='tempest-TestNetworkBasicOps-148598438',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:43:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-aitxybp5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:43:04Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.642 2 DEBUG nova.network.os_vif_util [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "address": "fa:16:3e:2e:8b:7d", "network": {"id": "bf75606a-b15b-463d-a313-4ac836e894cf", "bridge": "br-int", "label": "tempest-network-smoke--2052623570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4408871f-a0", "ovs_interfaceid": "4408871f-a0c1-4b9e-85e2-631f65861d3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.644 2 DEBUG nova.network.os_vif_util [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:8b:7d,bridge_name='br-int',has_traffic_filtering=True,id=4408871f-a0c1-4b9e-85e2-631f65861d3f,network=Network(bf75606a-b15b-463d-a313-4ac836e894cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4408871f-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.645 2 DEBUG os_vif [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:8b:7d,bridge_name='br-int',has_traffic_filtering=True,id=4408871f-a0c1-4b9e-85e2-631f65861d3f,network=Network(bf75606a-b15b-463d-a313-4ac836e894cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4408871f-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.649 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4408871f-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:22 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128-userdata-shm.mount: Deactivated successfully.
Sep 30 21:43:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-7ea76e0b16c5ecc5a000f268249515ef47d08706c30941f590a9de3c1fd9abcc-merged.mount: Deactivated successfully.
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.659 2 INFO os_vif [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:8b:7d,bridge_name='br-int',has_traffic_filtering=True,id=4408871f-a0c1-4b9e-85e2-631f65861d3f,network=Network(bf75606a-b15b-463d-a313-4ac836e894cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4408871f-a0')
Sep 30 21:43:22 compute-1 podman[243519]: 2025-09-30 21:43:22.661803127 +0000 UTC m=+0.124959519 container cleanup 423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.661 2 INFO nova.virt.libvirt.driver [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Deleting instance files /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2_del
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.663 2 INFO nova.virt.libvirt.driver [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Deletion of /var/lib/nova/instances/abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2_del complete
Sep 30 21:43:22 compute-1 systemd[1]: libpod-conmon-423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128.scope: Deactivated successfully.
Sep 30 21:43:22 compute-1 podman[243563]: 2025-09-30 21:43:22.75568121 +0000 UTC m=+0.057976658 container remove 423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.764 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ea92592c-7c9a-4ca2-8cb9-efc7b100122a]: (4, ('Tue Sep 30 09:43:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf (423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128)\n423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128\nTue Sep 30 09:43:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf (423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128)\n423a20d81adbc33c8865f6501082da6ee940f2b0aefcc763106439d1f12bb128\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.766 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5f9db14c-a104-4627-ada5-6ecc9ffabab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.767 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf75606a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:22 compute-1 kernel: tapbf75606a-b0: left promiscuous mode
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.784 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6ade5f76-4358-4df2-8d90-65fb6c12be22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.813 2 INFO nova.compute.manager [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Took 0.49 seconds to destroy the instance on the hypervisor.
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.814 2 DEBUG oslo.service.loopingcall [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.815 2 DEBUG nova.compute.manager [-] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:43:22 compute-1 nova_compute[192795]: 2025-09-30 21:43:22.815 2 DEBUG nova.network.neutron [-] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.815 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d50c4c1e-f5ef-4081-b561-6ed43e454f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.817 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cd30919c-e6a7-4013-a58c-6cde5ec82854]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.835 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c92205ad-d498-4a8d-be10-1a07bcc3ce33]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527776, 'reachable_time': 39768, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243578, 'error': None, 'target': 'ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.838 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bf75606a-b15b-463d-a313-4ac836e894cf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:43:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:22.838 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[ac2e31c6-c7e3-4ca7-b2c5-5531d2533afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:22 compute-1 systemd[1]: run-netns-ovnmeta\x2dbf75606a\x2db15b\x2d463d\x2da313\x2d4ac836e894cf.mount: Deactivated successfully.
Sep 30 21:43:23 compute-1 nova_compute[192795]: 2025-09-30 21:43:23.073 2 DEBUG nova.compute.manager [req-6a6fead3-acbb-472f-b7d7-ae2846046442 req-dbdb9606-c746-4cff-ad83-2d344b09e478 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-vif-unplugged-4408871f-a0c1-4b9e-85e2-631f65861d3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:23 compute-1 nova_compute[192795]: 2025-09-30 21:43:23.074 2 DEBUG oslo_concurrency.lockutils [req-6a6fead3-acbb-472f-b7d7-ae2846046442 req-dbdb9606-c746-4cff-ad83-2d344b09e478 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:23 compute-1 nova_compute[192795]: 2025-09-30 21:43:23.075 2 DEBUG oslo_concurrency.lockutils [req-6a6fead3-acbb-472f-b7d7-ae2846046442 req-dbdb9606-c746-4cff-ad83-2d344b09e478 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:23 compute-1 nova_compute[192795]: 2025-09-30 21:43:23.075 2 DEBUG oslo_concurrency.lockutils [req-6a6fead3-acbb-472f-b7d7-ae2846046442 req-dbdb9606-c746-4cff-ad83-2d344b09e478 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:23 compute-1 nova_compute[192795]: 2025-09-30 21:43:23.075 2 DEBUG nova.compute.manager [req-6a6fead3-acbb-472f-b7d7-ae2846046442 req-dbdb9606-c746-4cff-ad83-2d344b09e478 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] No waiting events found dispatching network-vif-unplugged-4408871f-a0c1-4b9e-85e2-631f65861d3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:23 compute-1 nova_compute[192795]: 2025-09-30 21:43:23.076 2 DEBUG nova.compute.manager [req-6a6fead3-acbb-472f-b7d7-ae2846046442 req-dbdb9606-c746-4cff-ad83-2d344b09e478 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-vif-unplugged-4408871f-a0c1-4b9e-85e2-631f65861d3f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:43:23 compute-1 podman[243580]: 2025-09-30 21:43:23.247035827 +0000 UTC m=+0.083345951 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:43:23 compute-1 nova_compute[192795]: 2025-09-30 21:43:23.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.443 2 DEBUG nova.network.neutron [-] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.513 2 DEBUG nova.compute.manager [req-43882129-2524-4482-b5cf-b9a81b0c01f6 req-a74baf16-f45c-4283-ba7e-a81d91ae1e71 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-vif-deleted-4408871f-a0c1-4b9e-85e2-631f65861d3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.514 2 INFO nova.compute.manager [req-43882129-2524-4482-b5cf-b9a81b0c01f6 req-a74baf16-f45c-4283-ba7e-a81d91ae1e71 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Neutron deleted interface 4408871f-a0c1-4b9e-85e2-631f65861d3f; detaching it from the instance and deleting it from the info cache
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.514 2 DEBUG nova.network.neutron [req-43882129-2524-4482-b5cf-b9a81b0c01f6 req-a74baf16-f45c-4283-ba7e-a81d91ae1e71 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.518 2 INFO nova.compute.manager [-] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Took 1.70 seconds to deallocate network for instance.
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.548 2 DEBUG nova.compute.manager [req-43882129-2524-4482-b5cf-b9a81b0c01f6 req-a74baf16-f45c-4283-ba7e-a81d91ae1e71 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Detach interface failed, port_id=4408871f-a0c1-4b9e-85e2-631f65861d3f, reason: Instance abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.618 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.619 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.742 2 DEBUG nova.compute.provider_tree [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.869 2 DEBUG nova.scheduler.client.report [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.926 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.935 2 DEBUG oslo_concurrency.lockutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.935 2 DEBUG oslo_concurrency.lockutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquired lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.936 2 DEBUG nova.network.neutron [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:43:24 compute-1 nova_compute[192795]: 2025-09-30 21:43:24.976 2 INFO nova.scheduler.client.report [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Deleted allocations for instance abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2
Sep 30 21:43:25 compute-1 nova_compute[192795]: 2025-09-30 21:43:25.140 2 DEBUG oslo_concurrency.lockutils [None req-eb3a4e0f-5921-43f6-9803-66575ee82681 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:25 compute-1 nova_compute[192795]: 2025-09-30 21:43:25.267 2 DEBUG nova.compute.manager [req-f253d09a-5229-491c-be8b-01c872cfd822 req-ca360fd3-99b7-48fe-a7d9-9891e0ccb5c9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received event network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:25 compute-1 nova_compute[192795]: 2025-09-30 21:43:25.267 2 DEBUG oslo_concurrency.lockutils [req-f253d09a-5229-491c-be8b-01c872cfd822 req-ca360fd3-99b7-48fe-a7d9-9891e0ccb5c9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:25 compute-1 nova_compute[192795]: 2025-09-30 21:43:25.268 2 DEBUG oslo_concurrency.lockutils [req-f253d09a-5229-491c-be8b-01c872cfd822 req-ca360fd3-99b7-48fe-a7d9-9891e0ccb5c9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:25 compute-1 nova_compute[192795]: 2025-09-30 21:43:25.268 2 DEBUG oslo_concurrency.lockutils [req-f253d09a-5229-491c-be8b-01c872cfd822 req-ca360fd3-99b7-48fe-a7d9-9891e0ccb5c9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:25 compute-1 nova_compute[192795]: 2025-09-30 21:43:25.268 2 DEBUG nova.compute.manager [req-f253d09a-5229-491c-be8b-01c872cfd822 req-ca360fd3-99b7-48fe-a7d9-9891e0ccb5c9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] No waiting events found dispatching network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:25 compute-1 nova_compute[192795]: 2025-09-30 21:43:25.268 2 WARNING nova.compute.manager [req-f253d09a-5229-491c-be8b-01c872cfd822 req-ca360fd3-99b7-48fe-a7d9-9891e0ccb5c9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Received unexpected event network-vif-plugged-4408871f-a0c1-4b9e-85e2-631f65861d3f for instance with vm_state deleted and task_state None.
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.037 2 DEBUG nova.network.neutron [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updating instance_info_cache with network_info: [{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.058 2 DEBUG oslo_concurrency.lockutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Releasing lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.261 2 DEBUG nova.virt.libvirt.driver [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.262 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Creating file /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/4ba166626225470981c54bd19fc0e7a3.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.262 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/4ba166626225470981c54bd19fc0e7a3.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.806 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/4ba166626225470981c54bd19fc0e7a3.tmp" returned: 1 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.807 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/4ba166626225470981c54bd19fc0e7a3.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.808 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Creating directory /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:43:27 compute-1 nova_compute[192795]: 2025-09-30 21:43:27.809 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:28 compute-1 nova_compute[192795]: 2025-09-30 21:43:28.055 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:28 compute-1 nova_compute[192795]: 2025-09-30 21:43:28.063 2 DEBUG nova.virt.libvirt.driver [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:43:28 compute-1 nova_compute[192795]: 2025-09-30 21:43:28.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-1 kernel: tapf62fc673-df (unregistering): left promiscuous mode
Sep 30 21:43:30 compute-1 NetworkManager[51724]: <info>  [1759268610.2178] device (tapf62fc673-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:43:30 compute-1 ovn_controller[94902]: 2025-09-30T21:43:30Z|00554|binding|INFO|Releasing lport f62fc673-df33-48e7-96fb-bd4f42eca73a from this chassis (sb_readonly=0)
Sep 30 21:43:30 compute-1 ovn_controller[94902]: 2025-09-30T21:43:30Z|00555|binding|INFO|Setting lport f62fc673-df33-48e7-96fb-bd4f42eca73a down in Southbound
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-1 ovn_controller[94902]: 2025-09-30T21:43:30Z|00556|binding|INFO|Removing iface tapf62fc673-df ovn-installed in OVS
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.239 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:eb:10 10.100.0.5'], port_security=['fa:16:3e:c8:eb:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2384614e-09de-4607-9336-54877ec23545', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baf5ec1a-3e90-4e29-9575-409781d80e84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8f75955c-5947-4d54-b6d6-411d65ede0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.226'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50db7480-e4c4-40a4-9fa3-258b3c77b8ab, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f62fc673-df33-48e7-96fb-bd4f42eca73a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.243 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f62fc673-df33-48e7-96fb-bd4f42eca73a in datapath baf5ec1a-3e90-4e29-9575-409781d80e84 unbound from our chassis
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.245 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network baf5ec1a-3e90-4e29-9575-409781d80e84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.251 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[08d85941-14e8-4602-9735-76b7ebb75e98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.253 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84 namespace which is not needed anymore
Sep 30 21:43:30 compute-1 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Sep 30 21:43:30 compute-1 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008c.scope: Consumed 15.122s CPU time.
Sep 30 21:43:30 compute-1 systemd-machined[152783]: Machine qemu-65-instance-0000008c terminated.
Sep 30 21:43:30 compute-1 neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84[243149]: [NOTICE]   (243168) : haproxy version is 2.8.14-c23fe91
Sep 30 21:43:30 compute-1 neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84[243149]: [NOTICE]   (243168) : path to executable is /usr/sbin/haproxy
Sep 30 21:43:30 compute-1 neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84[243149]: [WARNING]  (243168) : Exiting Master process...
Sep 30 21:43:30 compute-1 neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84[243149]: [ALERT]    (243168) : Current worker (243175) exited with code 143 (Terminated)
Sep 30 21:43:30 compute-1 neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84[243149]: [WARNING]  (243168) : All workers exited. Exiting... (0)
Sep 30 21:43:30 compute-1 systemd[1]: libpod-c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41.scope: Deactivated successfully.
Sep 30 21:43:30 compute-1 podman[243629]: 2025-09-30 21:43:30.41071618 +0000 UTC m=+0.052049460 container died c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:43:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-a210be76288a1125e9ce4844e2f5f07b6d2a15c97f7a9dbb7dce8373384970b7-merged.mount: Deactivated successfully.
Sep 30 21:43:30 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41-userdata-shm.mount: Deactivated successfully.
Sep 30 21:43:30 compute-1 podman[243629]: 2025-09-30 21:43:30.453442098 +0000 UTC m=+0.094775378 container cleanup c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:43:30 compute-1 systemd[1]: libpod-conmon-c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41.scope: Deactivated successfully.
Sep 30 21:43:30 compute-1 podman[243666]: 2025-09-30 21:43:30.528309291 +0000 UTC m=+0.051196078 container remove c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.537 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5761426a-69b7-4c74-a9ef-9c054576daf2]: (4, ('Tue Sep 30 09:43:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84 (c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41)\nc2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41\nTue Sep 30 09:43:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84 (c2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41)\nc2fff8ffbcc3984e349aa65bf355b540c085b43cb8f0b3ef5235cbd33478eb41\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.538 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4dafe4-3127-47ef-b535-e6fe2383716e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.540 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaf5ec1a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-1 kernel: tapbaf5ec1a-30: left promiscuous mode
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.565 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d636d9e2-f068-451b-b2b3-24e6f464b8e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.599 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[38979340-a947-4164-b87b-cb42ae3c8839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.601 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[642ab8c1-9ade-4ce8-aa34-17dca5b2d8f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.622 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[27d0101b-458f-4bda-968a-cc5a2fa4990d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526779, 'reachable_time': 35625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243695, 'error': None, 'target': 'ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.624 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-baf5ec1a-3e90-4e29-9575-409781d80e84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:43:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:30.624 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[6625843e-dc0a-4d46-a4ad-ba373f218010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:30 compute-1 systemd[1]: run-netns-ovnmeta\x2dbaf5ec1a\x2d3e90\x2d4e29\x2d9575\x2d409781d80e84.mount: Deactivated successfully.
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.962 2 DEBUG nova.compute.manager [req-feabcee9-42ce-4884-83e5-307812054f88 req-75b0afd5-cee6-4bb8-93b9-a632bfd169f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-vif-unplugged-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.963 2 DEBUG oslo_concurrency.lockutils [req-feabcee9-42ce-4884-83e5-307812054f88 req-75b0afd5-cee6-4bb8-93b9-a632bfd169f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.963 2 DEBUG oslo_concurrency.lockutils [req-feabcee9-42ce-4884-83e5-307812054f88 req-75b0afd5-cee6-4bb8-93b9-a632bfd169f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.964 2 DEBUG oslo_concurrency.lockutils [req-feabcee9-42ce-4884-83e5-307812054f88 req-75b0afd5-cee6-4bb8-93b9-a632bfd169f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.964 2 DEBUG nova.compute.manager [req-feabcee9-42ce-4884-83e5-307812054f88 req-75b0afd5-cee6-4bb8-93b9-a632bfd169f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] No waiting events found dispatching network-vif-unplugged-f62fc673-df33-48e7-96fb-bd4f42eca73a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:30 compute-1 nova_compute[192795]: 2025-09-30 21:43:30.965 2 WARNING nova.compute.manager [req-feabcee9-42ce-4884-83e5-307812054f88 req-75b0afd5-cee6-4bb8-93b9-a632bfd169f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received unexpected event network-vif-unplugged-f62fc673-df33-48e7-96fb-bd4f42eca73a for instance with vm_state active and task_state resize_migrating.
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.082 2 INFO nova.virt.libvirt.driver [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Instance shutdown successfully after 3 seconds.
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.090 2 INFO nova.virt.libvirt.driver [-] [instance: 2384614e-09de-4607-9336-54877ec23545] Instance destroyed successfully.
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.091 2 DEBUG nova.virt.libvirt.vif [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:42:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2055139287',display_name='tempest-TestNetworkAdvancedServerOps-server-2055139287',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2055139287',id=140,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBehqnc6d89NlpWlFkGSrh3zkfip/+VLqupR92Yr35G7qE4Lwo0WTn/PbJgkZacVF8bBGZIqn0TojkTkbuYSZcYRnrojpJav/wMRzr1lB8gfPWFkn7iP96oX0K0AZMrrNg==',key_name='tempest-TestNetworkAdvancedServerOps-1291575231',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:42:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-ggzu7oo6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:43:24Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=2384614e-09de-4607-9336-54877ec23545,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1531280698", "vif_mac": "fa:16:3e:c8:eb:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.092 2 DEBUG nova.network.os_vif_util [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converting VIF {"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1531280698", "vif_mac": "fa:16:3e:c8:eb:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.094 2 DEBUG nova.network.os_vif_util [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.095 2 DEBUG os_vif [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf62fc673-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.106 2 INFO os_vif [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df')
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.113 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.181 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.183 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.283 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.289 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Copying file /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk to 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:43:31 compute-1 nova_compute[192795]: 2025-09-30 21:43:31.291 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:32 compute-1 podman[243706]: 2025-09-30 21:43:32.236596673 +0000 UTC m=+0.059841230 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:43:32 compute-1 podman[243704]: 2025-09-30 21:43:32.257490364 +0000 UTC m=+0.093514144 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:43:32 compute-1 podman[243705]: 2025-09-30 21:43:32.259680473 +0000 UTC m=+0.092048695 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:43:32 compute-1 nova_compute[192795]: 2025-09-30 21:43:32.824 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "scp -r /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk" returned: 0 in 1.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:32 compute-1 nova_compute[192795]: 2025-09-30 21:43:32.826 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Copying file /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:43:32 compute-1 nova_compute[192795]: 2025-09-30 21:43:32.827 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk.config 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.071 2 DEBUG nova.compute.manager [req-21221f3a-c84f-49dd-a504-9b37aa380d23 req-5b98b60f-d17e-483e-8692-eca147b5e094 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.072 2 DEBUG oslo_concurrency.lockutils [req-21221f3a-c84f-49dd-a504-9b37aa380d23 req-5b98b60f-d17e-483e-8692-eca147b5e094 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.072 2 DEBUG oslo_concurrency.lockutils [req-21221f3a-c84f-49dd-a504-9b37aa380d23 req-5b98b60f-d17e-483e-8692-eca147b5e094 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.072 2 DEBUG oslo_concurrency.lockutils [req-21221f3a-c84f-49dd-a504-9b37aa380d23 req-5b98b60f-d17e-483e-8692-eca147b5e094 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.073 2 DEBUG nova.compute.manager [req-21221f3a-c84f-49dd-a504-9b37aa380d23 req-5b98b60f-d17e-483e-8692-eca147b5e094 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] No waiting events found dispatching network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.073 2 WARNING nova.compute.manager [req-21221f3a-c84f-49dd-a504-9b37aa380d23 req-5b98b60f-d17e-483e-8692-eca147b5e094 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received unexpected event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a for instance with vm_state active and task_state resize_migrating.
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.153 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "scp -C -r /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk.config 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.config" returned: 0 in 0.326s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.155 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Copying file /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.155 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk.info 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.393 2 DEBUG oslo_concurrency.processutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "scp -C -r /var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545_resize/disk.info 192.168.122.102:/var/lib/nova/instances/2384614e-09de-4607-9336-54877ec23545/disk.info" returned: 0 in 0.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.602 2 DEBUG neutronclient.v2_0.client [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port f62fc673-df33-48e7-96fb-bd4f42eca73a for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.766 2 DEBUG oslo_concurrency.lockutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.766 2 DEBUG oslo_concurrency.lockutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:33 compute-1 nova_compute[192795]: 2025-09-30 21:43:33.767 2 DEBUG oslo_concurrency.lockutils [None req-8f9ce74e-2c92-4fd0-827c-17655e299cca 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:34 compute-1 nova_compute[192795]: 2025-09-30 21:43:34.654 2 DEBUG nova.compute.manager [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-changed-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:34 compute-1 nova_compute[192795]: 2025-09-30 21:43:34.656 2 DEBUG nova.compute.manager [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Refreshing instance network info cache due to event network-changed-f62fc673-df33-48e7-96fb-bd4f42eca73a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:34 compute-1 nova_compute[192795]: 2025-09-30 21:43:34.656 2 DEBUG oslo_concurrency.lockutils [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:34 compute-1 nova_compute[192795]: 2025-09-30 21:43:34.657 2 DEBUG oslo_concurrency.lockutils [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:34 compute-1 nova_compute[192795]: 2025-09-30 21:43:34.657 2 DEBUG nova.network.neutron [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Refreshing network info cache for port f62fc673-df33-48e7-96fb-bd4f42eca73a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:34 compute-1 nova_compute[192795]: 2025-09-30 21:43:34.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:34 compute-1 nova_compute[192795]: 2025-09-30 21:43:34.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:35.701 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:35 compute-1 nova_compute[192795]: 2025-09-30 21:43:35.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:35 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:35.703 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:43:36 compute-1 nova_compute[192795]: 2025-09-30 21:43:36.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:36 compute-1 nova_compute[192795]: 2025-09-30 21:43:36.695 2 DEBUG nova.network.neutron [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updated VIF entry in instance network info cache for port f62fc673-df33-48e7-96fb-bd4f42eca73a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:36 compute-1 nova_compute[192795]: 2025-09-30 21:43:36.695 2 DEBUG nova.network.neutron [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updating instance_info_cache with network_info: [{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:36 compute-1 nova_compute[192795]: 2025-09-30 21:43:36.733 2 DEBUG oslo_concurrency.lockutils [req-25724a71-ed3f-45d7-91b3-c647599da004 req-d07a72d4-62b8-44d4-8c0d-e1b736b57f6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.565 2 DEBUG nova.compute.manager [req-7aeb5fea-4659-49da-a872-ae21bfe3f7ed req-54e992af-7241-4a87-abfe-a01b745e371c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.566 2 DEBUG oslo_concurrency.lockutils [req-7aeb5fea-4659-49da-a872-ae21bfe3f7ed req-54e992af-7241-4a87-abfe-a01b745e371c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.566 2 DEBUG oslo_concurrency.lockutils [req-7aeb5fea-4659-49da-a872-ae21bfe3f7ed req-54e992af-7241-4a87-abfe-a01b745e371c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.567 2 DEBUG oslo_concurrency.lockutils [req-7aeb5fea-4659-49da-a872-ae21bfe3f7ed req-54e992af-7241-4a87-abfe-a01b745e371c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.567 2 DEBUG nova.compute.manager [req-7aeb5fea-4659-49da-a872-ae21bfe3f7ed req-54e992af-7241-4a87-abfe-a01b745e371c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] No waiting events found dispatching network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.567 2 WARNING nova.compute.manager [req-7aeb5fea-4659-49da-a872-ae21bfe3f7ed req-54e992af-7241-4a87-abfe-a01b745e371c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received unexpected event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a for instance with vm_state active and task_state resize_finish.
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.597 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268602.5960817, abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.598 2 INFO nova.compute.manager [-] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] VM Stopped (Lifecycle Event)
Sep 30 21:43:37 compute-1 nova_compute[192795]: 2025-09-30 21:43:37.621 2 DEBUG nova.compute.manager [None req-d7ff3182-17f1-4df6-9a76-c94f2487c05d - - - - - -] [instance: abfc95fd-ab00-4512-9c1d-9bb8d0cbfaa2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:38 compute-1 podman[243778]: 2025-09-30 21:43:38.216792198 +0000 UTC m=+0.057263410 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:43:38 compute-1 nova_compute[192795]: 2025-09-30 21:43:38.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:38.701 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:38.702 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:38.702 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:39 compute-1 nova_compute[192795]: 2025-09-30 21:43:39.698 2 DEBUG nova.compute.manager [req-6ef14915-dfe7-4184-bdcd-2e6d192a73bb req-8df2c8b8-f8c3-4a30-9264-049f213b92d8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:39 compute-1 nova_compute[192795]: 2025-09-30 21:43:39.699 2 DEBUG oslo_concurrency.lockutils [req-6ef14915-dfe7-4184-bdcd-2e6d192a73bb req-8df2c8b8-f8c3-4a30-9264-049f213b92d8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:39 compute-1 nova_compute[192795]: 2025-09-30 21:43:39.699 2 DEBUG oslo_concurrency.lockutils [req-6ef14915-dfe7-4184-bdcd-2e6d192a73bb req-8df2c8b8-f8c3-4a30-9264-049f213b92d8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:39 compute-1 nova_compute[192795]: 2025-09-30 21:43:39.699 2 DEBUG oslo_concurrency.lockutils [req-6ef14915-dfe7-4184-bdcd-2e6d192a73bb req-8df2c8b8-f8c3-4a30-9264-049f213b92d8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2384614e-09de-4607-9336-54877ec23545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:39 compute-1 nova_compute[192795]: 2025-09-30 21:43:39.699 2 DEBUG nova.compute.manager [req-6ef14915-dfe7-4184-bdcd-2e6d192a73bb req-8df2c8b8-f8c3-4a30-9264-049f213b92d8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] No waiting events found dispatching network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:39 compute-1 nova_compute[192795]: 2025-09-30 21:43:39.700 2 WARNING nova.compute.manager [req-6ef14915-dfe7-4184-bdcd-2e6d192a73bb req-8df2c8b8-f8c3-4a30-9264-049f213b92d8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Received unexpected event network-vif-plugged-f62fc673-df33-48e7-96fb-bd4f42eca73a for instance with vm_state resized and task_state None.
Sep 30 21:43:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:43:39.707 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:41 compute-1 nova_compute[192795]: 2025-09-30 21:43:41.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:41 compute-1 nova_compute[192795]: 2025-09-30 21:43:41.407 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "2384614e-09de-4607-9336-54877ec23545" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:41 compute-1 nova_compute[192795]: 2025-09-30 21:43:41.407 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:41 compute-1 nova_compute[192795]: 2025-09-30 21:43:41.408 2 DEBUG nova.compute.manager [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Going to confirm migration 17 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Sep 30 21:43:41 compute-1 nova_compute[192795]: 2025-09-30 21:43:41.575 2 DEBUG nova.objects.instance [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'info_cache' on Instance uuid 2384614e-09de-4607-9336-54877ec23545 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:42 compute-1 nova_compute[192795]: 2025-09-30 21:43:42.491 2 DEBUG neutronclient.v2_0.client [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port f62fc673-df33-48e7-96fb-bd4f42eca73a for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:43:42 compute-1 nova_compute[192795]: 2025-09-30 21:43:42.492 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:42 compute-1 nova_compute[192795]: 2025-09-30 21:43:42.492 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:42 compute-1 nova_compute[192795]: 2025-09-30 21:43:42.492 2 DEBUG nova.network.neutron [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:43:43 compute-1 nova_compute[192795]: 2025-09-30 21:43:43.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.262 2 DEBUG nova.network.neutron [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 2384614e-09de-4607-9336-54877ec23545] Updating instance_info_cache with network_info: [{"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.492 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-2384614e-09de-4607-9336-54877ec23545" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.493 2 DEBUG nova.objects.instance [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid 2384614e-09de-4607-9336-54877ec23545 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.533 2 DEBUG nova.virt.libvirt.vif [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:42:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2055139287',display_name='tempest-TestNetworkAdvancedServerOps-server-2055139287',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2055139287',id=140,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBehqnc6d89NlpWlFkGSrh3zkfip/+VLqupR92Yr35G7qE4Lwo0WTn/PbJgkZacVF8bBGZIqn0TojkTkbuYSZcYRnrojpJav/wMRzr1lB8gfPWFkn7iP96oX0K0AZMrrNg==',key_name='tempest-TestNetworkAdvancedServerOps-1291575231',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:43:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-ggzu7oo6',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:43:39Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=2384614e-09de-4607-9336-54877ec23545,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.534 2 DEBUG nova.network.os_vif_util [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "address": "fa:16:3e:c8:eb:10", "network": {"id": "baf5ec1a-3e90-4e29-9575-409781d80e84", "bridge": "br-int", "label": "tempest-network-smoke--1531280698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62fc673-df", "ovs_interfaceid": "f62fc673-df33-48e7-96fb-bd4f42eca73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.534 2 DEBUG nova.network.os_vif_util [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.535 2 DEBUG os_vif [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.536 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf62fc673-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.537 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.538 2 INFO os_vif [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:eb:10,bridge_name='br-int',has_traffic_filtering=True,id=f62fc673-df33-48e7-96fb-bd4f42eca73a,network=Network(baf5ec1a-3e90-4e29-9575-409781d80e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62fc673-df')
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.539 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.539 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.744 2 DEBUG nova.compute.provider_tree [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:43:44 compute-1 nova_compute[192795]: 2025-09-30 21:43:44.988 2 DEBUG nova.scheduler.client.report [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:43:45 compute-1 nova_compute[192795]: 2025-09-30 21:43:45.113 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:45 compute-1 podman[243800]: 2025-09-30 21:43:45.223212294 +0000 UTC m=+0.058657358 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:43:45 compute-1 podman[243801]: 2025-09-30 21:43:45.228200548 +0000 UTC m=+0.058402170 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:43:45 compute-1 podman[243799]: 2025-09-30 21:43:45.237102957 +0000 UTC m=+0.079262401 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Sep 30 21:43:45 compute-1 nova_compute[192795]: 2025-09-30 21:43:45.277 2 INFO nova.scheduler.client.report [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocation for migration 2a259fd3-1548-4c49-b9d9-ffe5ef4a30f1
Sep 30 21:43:45 compute-1 nova_compute[192795]: 2025-09-30 21:43:45.367 2 DEBUG oslo_concurrency.lockutils [None req-d5ec9637-cfa1-4b41-90dc-f3f984c613e3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "2384614e-09de-4607-9336-54877ec23545" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.959s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:45 compute-1 nova_compute[192795]: 2025-09-30 21:43:45.505 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268610.5044694, 2384614e-09de-4607-9336-54877ec23545 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:45 compute-1 nova_compute[192795]: 2025-09-30 21:43:45.506 2 INFO nova.compute.manager [-] [instance: 2384614e-09de-4607-9336-54877ec23545] VM Stopped (Lifecycle Event)
Sep 30 21:43:45 compute-1 nova_compute[192795]: 2025-09-30 21:43:45.524 2 DEBUG nova.compute.manager [None req-43bbe360-7702-404a-8eae-f2053a5b9e0d - - - - - -] [instance: 2384614e-09de-4607-9336-54877ec23545] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:46 compute-1 nova_compute[192795]: 2025-09-30 21:43:46.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:48 compute-1 nova_compute[192795]: 2025-09-30 21:43:48.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:51 compute-1 nova_compute[192795]: 2025-09-30 21:43:51.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:53 compute-1 nova_compute[192795]: 2025-09-30 21:43:53.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:54 compute-1 podman[243860]: 2025-09-30 21:43:54.212322879 +0000 UTC m=+0.059727175 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible)
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.759 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.760 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.760 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.760 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.944 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.945 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5688MB free_disk=73.31673431396484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.946 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:55 compute-1 nova_compute[192795]: 2025-09-30 21:43:55.946 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:56 compute-1 nova_compute[192795]: 2025-09-30 21:43:56.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:56 compute-1 nova_compute[192795]: 2025-09-30 21:43:56.176 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:43:56 compute-1 nova_compute[192795]: 2025-09-30 21:43:56.177 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:43:56 compute-1 nova_compute[192795]: 2025-09-30 21:43:56.195 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:43:56 compute-1 nova_compute[192795]: 2025-09-30 21:43:56.354 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:43:56 compute-1 nova_compute[192795]: 2025-09-30 21:43:56.623 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:43:56 compute-1 nova_compute[192795]: 2025-09-30 21:43:56.624 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:58 compute-1 nova_compute[192795]: 2025-09-30 21:43:58.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:58 compute-1 nova_compute[192795]: 2025-09-30 21:43:58.624 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:58 compute-1 nova_compute[192795]: 2025-09-30 21:43:58.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:58 compute-1 nova_compute[192795]: 2025-09-30 21:43:58.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:44:00 compute-1 nova_compute[192795]: 2025-09-30 21:44:00.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:01 compute-1 nova_compute[192795]: 2025-09-30 21:44:01.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:03 compute-1 podman[243884]: 2025-09-30 21:44:03.216402438 +0000 UTC m=+0.055433232 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:44:03 compute-1 podman[243882]: 2025-09-30 21:44:03.219739527 +0000 UTC m=+0.064056482 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:44:03 compute-1 podman[243883]: 2025-09-30 21:44:03.246232939 +0000 UTC m=+0.087110742 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:44:03 compute-1 nova_compute[192795]: 2025-09-30 21:44:03.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:03 compute-1 nova_compute[192795]: 2025-09-30 21:44:03.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:04 compute-1 nova_compute[192795]: 2025-09-30 21:44:04.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:06 compute-1 nova_compute[192795]: 2025-09-30 21:44:06.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:08 compute-1 nova_compute[192795]: 2025-09-30 21:44:08.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:08 compute-1 nova_compute[192795]: 2025-09-30 21:44:08.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:08 compute-1 nova_compute[192795]: 2025-09-30 21:44:08.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:44:08 compute-1 nova_compute[192795]: 2025-09-30 21:44:08.946 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:44:08 compute-1 nova_compute[192795]: 2025-09-30 21:44:08.947 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:09 compute-1 podman[243950]: 2025-09-30 21:44:09.239202899 +0000 UTC m=+0.075266254 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:44:11 compute-1 nova_compute[192795]: 2025-09-30 21:44:11.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:12 compute-1 nova_compute[192795]: 2025-09-30 21:44:12.942 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:13 compute-1 nova_compute[192795]: 2025-09-30 21:44:13.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:16 compute-1 nova_compute[192795]: 2025-09-30 21:44:16.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:16 compute-1 podman[243972]: 2025-09-30 21:44:16.21748892 +0000 UTC m=+0.054655561 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:44:16 compute-1 podman[243970]: 2025-09-30 21:44:16.217571422 +0000 UTC m=+0.063329234 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Sep 30 21:44:16 compute-1 podman[243971]: 2025-09-30 21:44:16.247187328 +0000 UTC m=+0.084523254 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:44:18 compute-1 nova_compute[192795]: 2025-09-30 21:44:18.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:21 compute-1 nova_compute[192795]: 2025-09-30 21:44:21.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:22 compute-1 nova_compute[192795]: 2025-09-30 21:44:22.928 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:22 compute-1 nova_compute[192795]: 2025-09-30 21:44:22.928 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:22 compute-1 nova_compute[192795]: 2025-09-30 21:44:22.981 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.109 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.109 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.120 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.121 2 INFO nova.compute.claims [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.282 2 DEBUG nova.compute.provider_tree [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.295 2 DEBUG nova.scheduler.client.report [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.315 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.316 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.377 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.378 2 DEBUG nova.network.neutron [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.395 2 INFO nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.442 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.563 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.565 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.565 2 INFO nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Creating image(s)
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.566 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.566 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.566 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.578 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.631 2 DEBUG nova.policy [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.672 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.673 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.674 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.684 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.745 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.746 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.788 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.789 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.790 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.866 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.867 2 DEBUG nova.virt.disk.api [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.868 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.927 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.928 2 DEBUG nova.virt.disk.api [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.929 2 DEBUG nova.objects.instance [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.973 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.974 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Ensure instance console log exists: /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.974 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.975 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:23 compute-1 nova_compute[192795]: 2025-09-30 21:44:23.975 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:25 compute-1 podman[244047]: 2025-09-30 21:44:25.216053038 +0000 UTC m=+0.054271229 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:44:25 compute-1 nova_compute[192795]: 2025-09-30 21:44:25.234 2 DEBUG nova.network.neutron [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Successfully created port: c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.022 2 DEBUG nova.network.neutron [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Successfully updated port: c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.036 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.037 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.037 2 DEBUG nova.network.neutron [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.168 2 DEBUG nova.compute.manager [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.169 2 DEBUG nova.compute.manager [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing instance network info cache due to event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.169 2 DEBUG oslo_concurrency.lockutils [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:26 compute-1 nova_compute[192795]: 2025-09-30 21:44:26.268 2 DEBUG nova.network.neutron [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.880 2 DEBUG nova.network.neutron [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.902 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.903 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance network_info: |[{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.903 2 DEBUG oslo_concurrency.lockutils [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.903 2 DEBUG nova.network.neutron [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.906 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Start _get_guest_xml network_info=[{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.910 2 WARNING nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.917 2 DEBUG nova.virt.libvirt.host [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.918 2 DEBUG nova.virt.libvirt.host [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.921 2 DEBUG nova.virt.libvirt.host [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.922 2 DEBUG nova.virt.libvirt.host [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.923 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.923 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.923 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.924 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.924 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.924 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.924 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.924 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.925 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.925 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.925 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.925 2 DEBUG nova.virt.hardware [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.929 2 DEBUG nova.virt.libvirt.vif [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:23Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.929 2 DEBUG nova.network.os_vif_util [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.930 2 DEBUG nova.network.os_vif_util [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.931 2 DEBUG nova.objects.instance [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.962 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <uuid>7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</uuid>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <name>instance-00000091</name>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1387484208</nova:name>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:44:27</nova:creationTime>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         <nova:port uuid="c00d1cce-5707-41ba-9ca0-2aeecbf662d8">
Sep 30 21:44:27 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <system>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <entry name="serial">7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</entry>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <entry name="uuid">7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</entry>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </system>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <os>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   </os>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <features>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   </features>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:e7:65:6e"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <target dev="tapc00d1cce-57"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/console.log" append="off"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <video>
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </video>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:44:27 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:44:27 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:44:27 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:44:27 compute-1 nova_compute[192795]: </domain>
Sep 30 21:44:27 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.964 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Preparing to wait for external event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.964 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.964 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.964 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.965 2 DEBUG nova.virt.libvirt.vif [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:23Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.965 2 DEBUG nova.network.os_vif_util [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.966 2 DEBUG nova.network.os_vif_util [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.966 2 DEBUG os_vif [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.967 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.967 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.970 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc00d1cce-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.970 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc00d1cce-57, col_values=(('external_ids', {'iface-id': 'c00d1cce-5707-41ba-9ca0-2aeecbf662d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:65:6e', 'vm-uuid': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:27 compute-1 NetworkManager[51724]: <info>  [1759268667.9741] manager: (tapc00d1cce-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:27 compute-1 nova_compute[192795]: 2025-09-30 21:44:27.980 2 INFO os_vif [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57')
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.023 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.023 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.023 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:e7:65:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.024 2 INFO nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Using config drive
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.514 2 INFO nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Creating config drive at /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.520 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwkay1uab execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.647 2 DEBUG oslo_concurrency.processutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwkay1uab" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:28 compute-1 kernel: tapc00d1cce-57: entered promiscuous mode
Sep 30 21:44:28 compute-1 NetworkManager[51724]: <info>  [1759268668.7159] manager: (tapc00d1cce-57): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Sep 30 21:44:28 compute-1 ovn_controller[94902]: 2025-09-30T21:44:28Z|00557|binding|INFO|Claiming lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for this chassis.
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:28 compute-1 ovn_controller[94902]: 2025-09-30T21:44:28Z|00558|binding|INFO|c00d1cce-5707-41ba-9ca0-2aeecbf662d8: Claiming fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.735 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.736 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 bound to our chassis
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.737 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:44:28 compute-1 systemd-udevd[244086]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.750 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[83e8c366-0083-403f-8b07-344ebbcf6e1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.751 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap033e9c33-71 in ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.754 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap033e9c33-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.754 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[69a67f2e-053c-4ac0-9935-935356e0b677]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.755 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8701cd-36b2-4140-a825-7b010c53e536]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 NetworkManager[51724]: <info>  [1759268668.7602] device (tapc00d1cce-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:44:28 compute-1 NetworkManager[51724]: <info>  [1759268668.7613] device (tapc00d1cce-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:44:28 compute-1 systemd-machined[152783]: New machine qemu-67-instance-00000091.
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.770 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[ed085181-dc5f-4711-a57c-f2609f107bfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:28 compute-1 ovn_controller[94902]: 2025-09-30T21:44:28Z|00559|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 ovn-installed in OVS
Sep 30 21:44:28 compute-1 ovn_controller[94902]: 2025-09-30T21:44:28Z|00560|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 up in Southbound
Sep 30 21:44:28 compute-1 nova_compute[192795]: 2025-09-30 21:44:28.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:28 compute-1 systemd[1]: Started Virtual Machine qemu-67-instance-00000091.
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.795 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c217cb2-a5df-407a-aa4d-a3b5c1014023]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.825 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b7555610-23a9-4155-905e-7a2653b01f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 NetworkManager[51724]: <info>  [1759268668.8329] manager: (tap033e9c33-70): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.832 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7f8522-3282-48ef-89d8-07aa4f7feec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.869 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[15f93652-d3d8-4a3b-88da-4b30146a879d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.875 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[db89c891-38a1-4b65-ae89-c2cbd838a8a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 NetworkManager[51724]: <info>  [1759268668.8981] device (tap033e9c33-70): carrier: link connected
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.903 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5f679c23-39b4-4437-8332-815820eb5b28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.921 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[df98dd99-a30c-4937-9457-87bbb8744963]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap033e9c33-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536457, 'reachable_time': 19969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244120, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.939 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2748a1a0-4998-4ebb-88b8-af32b8929866]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:16ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536457, 'tstamp': 536457}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244121, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.957 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ab0ab7-21e0-4456-b1ce-d87f1317fb8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap033e9c33-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536457, 'reachable_time': 19969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244122, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:28 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:28.990 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[411ebe01-6af2-44fc-8226-3dbff83e100a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.053 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2ae71745-999c-4187-badf-de924a967c5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.055 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap033e9c33-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.055 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.056 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap033e9c33-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:29 compute-1 NetworkManager[51724]: <info>  [1759268669.0591] manager: (tap033e9c33-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Sep 30 21:44:29 compute-1 kernel: tap033e9c33-70: entered promiscuous mode
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.063 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap033e9c33-70, col_values=(('external_ids', {'iface-id': 'c20141c7-d465-400d-879d-d68081c3646d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:29 compute-1 ovn_controller[94902]: 2025-09-30T21:44:29Z|00561|binding|INFO|Releasing lport c20141c7-d465-400d-879d-d68081c3646d from this chassis (sb_readonly=0)
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.065 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.066 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8fee63d2-54d1-48a1-b0f5-8db0d51fcce8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.067 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:44:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:29.068 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'env', 'PROCESS_TAG=haproxy-033e9c33-7065-4faf-8a4b-e2705c450c67', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/033e9c33-7065-4faf-8a4b-e2705c450c67.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:29 compute-1 podman[244161]: 2025-09-30 21:44:29.469235468 +0000 UTC m=+0.085301973 container create 94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:44:29 compute-1 podman[244161]: 2025-09-30 21:44:29.412952376 +0000 UTC m=+0.029018971 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:44:29 compute-1 systemd[1]: Started libpod-conmon-94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e.scope.
Sep 30 21:44:29 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:44:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b061a1f63e9c2820b5ae17850d435996d788002035a552cf008bf731b886d7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:44:29 compute-1 podman[244161]: 2025-09-30 21:44:29.571864346 +0000 UTC m=+0.187930941 container init 94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:44:29 compute-1 podman[244161]: 2025-09-30 21:44:29.578415163 +0000 UTC m=+0.194481698 container start 94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:44:29 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244178]: [NOTICE]   (244182) : New worker (244184) forked
Sep 30 21:44:29 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244178]: [NOTICE]   (244182) : Loading success.
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.611 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268669.610128, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.613 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Started (Lifecycle Event)
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.643 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.650 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268669.6104043, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.651 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Paused (Lifecycle Event)
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.679 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.684 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.698 2 DEBUG nova.compute.manager [req-6513ad93-09a5-420d-9772-0dc513f2901a req-ab8f03a5-170e-4cd0-bd76-5d12998ee933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.699 2 DEBUG oslo_concurrency.lockutils [req-6513ad93-09a5-420d-9772-0dc513f2901a req-ab8f03a5-170e-4cd0-bd76-5d12998ee933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.699 2 DEBUG oslo_concurrency.lockutils [req-6513ad93-09a5-420d-9772-0dc513f2901a req-ab8f03a5-170e-4cd0-bd76-5d12998ee933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.700 2 DEBUG oslo_concurrency.lockutils [req-6513ad93-09a5-420d-9772-0dc513f2901a req-ab8f03a5-170e-4cd0-bd76-5d12998ee933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.701 2 DEBUG nova.compute.manager [req-6513ad93-09a5-420d-9772-0dc513f2901a req-ab8f03a5-170e-4cd0-bd76-5d12998ee933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Processing event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.707 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.711 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.716 2 INFO nova.virt.libvirt.driver [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance spawned successfully.
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.717 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.720 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.722 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268669.7111802, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.722 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Resumed (Lifecycle Event)
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.750 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.754 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.755 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.756 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.757 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.757 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.758 2 DEBUG nova.virt.libvirt.driver [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.767 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.805 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.847 2 INFO nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Took 6.28 seconds to spawn the instance on the hypervisor.
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.848 2 DEBUG nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.940 2 INFO nova.compute.manager [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Took 6.88 seconds to build instance.
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.949 2 DEBUG nova.network.neutron [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated VIF entry in instance network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.950 2 DEBUG nova.network.neutron [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.978 2 DEBUG oslo_concurrency.lockutils [req-2c5318da-90e0-4e1b-9225-3bdec6c98d03 req-60641343-a18f-47e6-be17-c6e5af2b620d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:29 compute-1 nova_compute[192795]: 2025-09-30 21:44:29.979 2 DEBUG oslo_concurrency.lockutils [None req-9c7a921c-1f66-4cb1-81c2-5767ca22a485 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:31 compute-1 nova_compute[192795]: 2025-09-30 21:44:31.842 2 DEBUG nova.compute.manager [req-302c19d7-b417-42af-954f-24b2a21aae2a req-e7358e43-c472-4228-ba38-deb96b7a95bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:31 compute-1 nova_compute[192795]: 2025-09-30 21:44:31.842 2 DEBUG oslo_concurrency.lockutils [req-302c19d7-b417-42af-954f-24b2a21aae2a req-e7358e43-c472-4228-ba38-deb96b7a95bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:31 compute-1 nova_compute[192795]: 2025-09-30 21:44:31.842 2 DEBUG oslo_concurrency.lockutils [req-302c19d7-b417-42af-954f-24b2a21aae2a req-e7358e43-c472-4228-ba38-deb96b7a95bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:31 compute-1 nova_compute[192795]: 2025-09-30 21:44:31.843 2 DEBUG oslo_concurrency.lockutils [req-302c19d7-b417-42af-954f-24b2a21aae2a req-e7358e43-c472-4228-ba38-deb96b7a95bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:31 compute-1 nova_compute[192795]: 2025-09-30 21:44:31.843 2 DEBUG nova.compute.manager [req-302c19d7-b417-42af-954f-24b2a21aae2a req-e7358e43-c472-4228-ba38-deb96b7a95bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:31 compute-1 nova_compute[192795]: 2025-09-30 21:44:31.843 2 WARNING nova.compute.manager [req-302c19d7-b417-42af-954f-24b2a21aae2a req-e7358e43-c472-4228-ba38-deb96b7a95bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state None.
Sep 30 21:44:32 compute-1 nova_compute[192795]: 2025-09-30 21:44:32.963 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:32 compute-1 nova_compute[192795]: 2025-09-30 21:44:32.964 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:32 compute-1 nova_compute[192795]: 2025-09-30 21:44:32.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:32 compute-1 nova_compute[192795]: 2025-09-30 21:44:32.995 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.140 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.141 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.148 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.148 2 INFO nova.compute.claims [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:44:33 compute-1 NetworkManager[51724]: <info>  [1759268673.2818] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Sep 30 21:44:33 compute-1 NetworkManager[51724]: <info>  [1759268673.2828] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.329 2 DEBUG nova.compute.provider_tree [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.348 2 DEBUG nova.scheduler.client.report [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.372 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.373 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:44:33 compute-1 ovn_controller[94902]: 2025-09-30T21:44:33Z|00562|binding|INFO|Releasing lport c20141c7-d465-400d-879d-d68081c3646d from this chassis (sb_readonly=0)
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.529 2 DEBUG nova.compute.manager [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.529 2 DEBUG nova.compute.manager [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing instance network info cache due to event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.529 2 DEBUG oslo_concurrency.lockutils [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.529 2 DEBUG oslo_concurrency.lockutils [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.530 2 DEBUG nova.network.neutron [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.605 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.605 2 DEBUG nova.network.neutron [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.830 2 INFO nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.847 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.959 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.961 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.961 2 INFO nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Creating image(s)
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.961 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "/var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.962 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.962 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:33 compute-1 nova_compute[192795]: 2025-09-30 21:44:33.978 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.014 2 DEBUG nova.policy [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.061 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.062 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.063 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.074 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.138 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.139 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.184 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.185 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.185 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.250 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.251 2 DEBUG nova.virt.disk.api [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Checking if we can resize image /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.251 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:34 compute-1 podman[244198]: 2025-09-30 21:44:34.259592865 +0000 UTC m=+0.081613554 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_id=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Sep 30 21:44:34 compute-1 podman[244203]: 2025-09-30 21:44:34.267802586 +0000 UTC m=+0.093871114 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:44:34 compute-1 podman[244202]: 2025-09-30 21:44:34.30701349 +0000 UTC m=+0.136620452 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.333 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.334 2 DEBUG nova.virt.disk.api [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Cannot resize image /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.334 2 DEBUG nova.objects.instance [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'migration_context' on Instance uuid f5164a8d-e5aa-4bb7-9075-73debca4d516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.355 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.355 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Ensure instance console log exists: /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.356 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.356 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:34 compute-1 nova_compute[192795]: 2025-09-30 21:44:34.356 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:35 compute-1 nova_compute[192795]: 2025-09-30 21:44:35.641 2 DEBUG nova.network.neutron [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Successfully created port: 4bdd817d-4233-447a-a80b-3afc021a638a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:44:35 compute-1 nova_compute[192795]: 2025-09-30 21:44:35.774 2 DEBUG nova.network.neutron [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated VIF entry in instance network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:44:35 compute-1 nova_compute[192795]: 2025-09-30 21:44:35.775 2 DEBUG nova.network.neutron [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:35 compute-1 nova_compute[192795]: 2025-09-30 21:44:35.795 2 DEBUG oslo_concurrency.lockutils [req-2216419a-f2b0-42ec-807f-4d90fedc9875 req-ec499002-0216-4c5a-845b-8abb97fa93d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.819 2 DEBUG nova.network.neutron [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Successfully updated port: 4bdd817d-4233-447a-a80b-3afc021a638a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.834 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.835 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquired lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.835 2 DEBUG nova.network.neutron [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.913 2 DEBUG nova.compute.manager [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-changed-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.913 2 DEBUG nova.compute.manager [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Refreshing instance network info cache due to event network-changed-4bdd817d-4233-447a-a80b-3afc021a638a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.913 2 DEBUG oslo_concurrency.lockutils [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:37 compute-1 nova_compute[192795]: 2025-09-30 21:44:37.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:38 compute-1 nova_compute[192795]: 2025-09-30 21:44:38.501 2 DEBUG nova.network.neutron [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:44:38 compute-1 nova_compute[192795]: 2025-09-30 21:44:38.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:38.702 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:38.703 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:38.704 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:40 compute-1 podman[244277]: 2025-09-30 21:44:40.226525795 +0000 UTC m=+0.066730045 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.496 2 DEBUG nova.network.neutron [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updating instance_info_cache with network_info: [{"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.522 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Releasing lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.523 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Instance network_info: |[{"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.523 2 DEBUG oslo_concurrency.lockutils [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.523 2 DEBUG nova.network.neutron [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Refreshing network info cache for port 4bdd817d-4233-447a-a80b-3afc021a638a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.526 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Start _get_guest_xml network_info=[{"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.531 2 WARNING nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.535 2 DEBUG nova.virt.libvirt.host [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.536 2 DEBUG nova.virt.libvirt.host [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.542 2 DEBUG nova.virt.libvirt.host [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.542 2 DEBUG nova.virt.libvirt.host [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.544 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.544 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.545 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.545 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.545 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.545 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.546 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.546 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.546 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.546 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.547 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.547 2 DEBUG nova.virt.hardware [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.551 2 DEBUG nova.virt.libvirt.vif [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=147,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1/fVTmGYuOOwxSGbH9N8TvZbqR3p/LXcDidDQZkj1x/r1cncivgRIPmN5OLuNtDCDFO3TM2NYeeoJUUNE27RLr42R4vAhyNzhABLK9jtr/4dNw2wyPqxeNtjp2/KxusQ==',key_name='tempest-TestSecurityGroupsBasicOps-1980363703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-fmqlieta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:33Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=f5164a8d-e5aa-4bb7-9075-73debca4d516,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.551 2 DEBUG nova.network.os_vif_util [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.552 2 DEBUG nova.network.os_vif_util [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:b2:a2,bridge_name='br-int',has_traffic_filtering=True,id=4bdd817d-4233-447a-a80b-3afc021a638a,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bdd817d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.553 2 DEBUG nova.objects.instance [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid f5164a8d-e5aa-4bb7-9075-73debca4d516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.572 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <uuid>f5164a8d-e5aa-4bb7-9075-73debca4d516</uuid>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <name>instance-00000093</name>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737</nova:name>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:44:40</nova:creationTime>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:user uuid="c33a752ef8234bba917ace1e73763490">tempest-TestSecurityGroupsBasicOps-2108116341-project-member</nova:user>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:project uuid="1ff42902541948f7a6df344fac87c2b7">tempest-TestSecurityGroupsBasicOps-2108116341</nova:project>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         <nova:port uuid="4bdd817d-4233-447a-a80b-3afc021a638a">
Sep 30 21:44:40 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <system>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <entry name="serial">f5164a8d-e5aa-4bb7-9075-73debca4d516</entry>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <entry name="uuid">f5164a8d-e5aa-4bb7-9075-73debca4d516</entry>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </system>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <os>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   </os>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <features>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   </features>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.config"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:8e:b2:a2"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <target dev="tap4bdd817d-42"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/console.log" append="off"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <video>
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </video>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:44:40 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:44:40 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:44:40 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:44:40 compute-1 nova_compute[192795]: </domain>
Sep 30 21:44:40 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.573 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Preparing to wait for external event network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.573 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.574 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.574 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.575 2 DEBUG nova.virt.libvirt.vif [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=147,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1/fVTmGYuOOwxSGbH9N8TvZbqR3p/LXcDidDQZkj1x/r1cncivgRIPmN5OLuNtDCDFO3TM2NYeeoJUUNE27RLr42R4vAhyNzhABLK9jtr/4dNw2wyPqxeNtjp2/KxusQ==',key_name='tempest-TestSecurityGroupsBasicOps-1980363703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-fmqlieta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:33Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=f5164a8d-e5aa-4bb7-9075-73debca4d516,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.575 2 DEBUG nova.network.os_vif_util [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.575 2 DEBUG nova.network.os_vif_util [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:b2:a2,bridge_name='br-int',has_traffic_filtering=True,id=4bdd817d-4233-447a-a80b-3afc021a638a,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bdd817d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.576 2 DEBUG os_vif [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:b2:a2,bridge_name='br-int',has_traffic_filtering=True,id=4bdd817d-4233-447a-a80b-3afc021a638a,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bdd817d-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.577 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.577 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.580 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bdd817d-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4bdd817d-42, col_values=(('external_ids', {'iface-id': '4bdd817d-4233-447a-a80b-3afc021a638a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:b2:a2', 'vm-uuid': 'f5164a8d-e5aa-4bb7-9075-73debca4d516'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:40 compute-1 NetworkManager[51724]: <info>  [1759268680.5832] manager: (tap4bdd817d-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.590 2 INFO os_vif [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:b2:a2,bridge_name='br-int',has_traffic_filtering=True,id=4bdd817d-4233-447a-a80b-3afc021a638a,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bdd817d-42')
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.649 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.650 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.650 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No VIF found with MAC fa:16:3e:8e:b2:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:44:40 compute-1 nova_compute[192795]: 2025-09-30 21:44:40.651 2 INFO nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Using config drive
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.478 2 INFO nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Creating config drive at /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.config
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.484 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp400cpdz2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.618 2 DEBUG oslo_concurrency.processutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp400cpdz2" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:41 compute-1 kernel: tap4bdd817d-42: entered promiscuous mode
Sep 30 21:44:41 compute-1 NetworkManager[51724]: <info>  [1759268681.6927] manager: (tap4bdd817d-42): new Tun device (/org/freedesktop/NetworkManager/Devices/281)
Sep 30 21:44:41 compute-1 ovn_controller[94902]: 2025-09-30T21:44:41Z|00563|binding|INFO|Claiming lport 4bdd817d-4233-447a-a80b-3afc021a638a for this chassis.
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:41 compute-1 ovn_controller[94902]: 2025-09-30T21:44:41Z|00564|binding|INFO|4bdd817d-4233-447a-a80b-3afc021a638a: Claiming fa:16:3e:8e:b2:a2 10.100.0.13
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.715 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:b2:a2 10.100.0.13'], port_security=['fa:16:3e:8e:b2:a2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02e17e31-bee3-4214-8c0d-d336d8499304 9c7bc5b7-aa16-4110-aeee-11b59189f128', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea4dfefc-3776-4359-bdd5-ef1ed99a61d3, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=4bdd817d-4233-447a-a80b-3afc021a638a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.717 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 4bdd817d-4233-447a-a80b-3afc021a638a in datapath cad5d4b4-0147-4d5b-8e82-ad8835d4110a bound to our chassis
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.719 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cad5d4b4-0147-4d5b-8e82-ad8835d4110a
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:41 compute-1 ovn_controller[94902]: 2025-09-30T21:44:41Z|00565|binding|INFO|Setting lport 4bdd817d-4233-447a-a80b-3afc021a638a ovn-installed in OVS
Sep 30 21:44:41 compute-1 ovn_controller[94902]: 2025-09-30T21:44:41Z|00566|binding|INFO|Setting lport 4bdd817d-4233-447a-a80b-3afc021a638a up in Southbound
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.733 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[110805bb-7a05-46ec-ac30-4c58b41fe723]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.734 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcad5d4b4-01 in ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:44:41 compute-1 systemd-udevd[244339]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.742 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcad5d4b4-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.743 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6a80486f-7985-4c02-969d-45ea546202b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.744 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4b6b8e-e6cc-456b-b0fc-902f9c0a56d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 systemd-machined[152783]: New machine qemu-68-instance-00000093.
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.757 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[c10a9540-30b3-4923-8468-14767c845acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 systemd[1]: Started Virtual Machine qemu-68-instance-00000093.
Sep 30 21:44:41 compute-1 NetworkManager[51724]: <info>  [1759268681.7597] device (tap4bdd817d-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:44:41 compute-1 NetworkManager[51724]: <info>  [1759268681.7609] device (tap4bdd817d-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.781 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2a2b1c-b1a4-4d2c-b513-0440ef3b24d5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.818 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[62a561ac-b9f0-4ab7-a89a-00159ad8cf09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 NetworkManager[51724]: <info>  [1759268681.8268] manager: (tapcad5d4b4-00): new Veth device (/org/freedesktop/NetworkManager/Devices/282)
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.825 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8d056e96-91c7-4022-a36a-aa138ea0c725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.862 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b3151934-7de6-4aa6-80d7-dd46a78189f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.865 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0f059bc6-0311-4b81-97aa-2e97cff0191e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 NetworkManager[51724]: <info>  [1759268681.8919] device (tapcad5d4b4-00): carrier: link connected
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.898 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6c057d-d73b-4661-b454-80cbbf13003f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.919 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0bdb227d-fda4-4541-a300-d272f9997873]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcad5d4b4-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d6:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537756, 'reachable_time': 22011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244372, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.938 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1b770c10-80b0-4566-aae4-dd90d887b2b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:d629'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537756, 'tstamp': 537756}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244373, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:41.961 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[08fb9788-b9de-41cb-bc62-0096715a7b06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcad5d4b4-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d6:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537756, 'reachable_time': 22011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244374, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.990 2 DEBUG nova.compute.manager [req-d40e71f1-3b93-47ad-ba9b-555883abab16 req-8b8bc75a-3e8b-41ac-aaba-6082ba454568 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.991 2 DEBUG oslo_concurrency.lockutils [req-d40e71f1-3b93-47ad-ba9b-555883abab16 req-8b8bc75a-3e8b-41ac-aaba-6082ba454568 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.991 2 DEBUG oslo_concurrency.lockutils [req-d40e71f1-3b93-47ad-ba9b-555883abab16 req-8b8bc75a-3e8b-41ac-aaba-6082ba454568 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.991 2 DEBUG oslo_concurrency.lockutils [req-d40e71f1-3b93-47ad-ba9b-555883abab16 req-8b8bc75a-3e8b-41ac-aaba-6082ba454568 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:41 compute-1 nova_compute[192795]: 2025-09-30 21:44:41.991 2 DEBUG nova.compute.manager [req-d40e71f1-3b93-47ad-ba9b-555883abab16 req-8b8bc75a-3e8b-41ac-aaba-6082ba454568 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Processing event network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.000 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[43d4e4e7-9787-451b-bd65-6fb413fe319d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:42 compute-1 ovn_controller[94902]: 2025-09-30T21:44:42Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:44:42 compute-1 ovn_controller[94902]: 2025-09-30T21:44:42Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.072 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7a232c-666f-47b2-9fbd-757e5c5cab2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.074 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcad5d4b4-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.074 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.075 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcad5d4b4-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:42 compute-1 kernel: tapcad5d4b4-00: entered promiscuous mode
Sep 30 21:44:42 compute-1 NetworkManager[51724]: <info>  [1759268682.0790] manager: (tapcad5d4b4-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.081 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcad5d4b4-00, col_values=(('external_ids', {'iface-id': '775a2c33-e2fa-46d8-9fd2-dfbcafa64142'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:42 compute-1 ovn_controller[94902]: 2025-09-30T21:44:42Z|00567|binding|INFO|Releasing lport 775a2c33-e2fa-46d8-9fd2-dfbcafa64142 from this chassis (sb_readonly=0)
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.086 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.087 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[712b835f-3dbb-47ba-9151-8b9f3073e901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.088 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-cad5d4b4-0147-4d5b-8e82-ad8835d4110a
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.pid.haproxy
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID cad5d4b4-0147-4d5b-8e82-ad8835d4110a
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:44:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:42.090 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'env', 'PROCESS_TAG=haproxy-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.369 2 DEBUG nova.network.neutron [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updated VIF entry in instance network info cache for port 4bdd817d-4233-447a-a80b-3afc021a638a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.370 2 DEBUG nova.network.neutron [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updating instance_info_cache with network_info: [{"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.385 2 DEBUG oslo_concurrency.lockutils [req-5e97ece3-2c2f-4171-ab9a-eb60e748c967 req-262f4542-368f-49a6-b11b-496ec2f0c16e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:42 compute-1 podman[244413]: 2025-09-30 21:44:42.484655126 +0000 UTC m=+0.084857852 container create 20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:44:42 compute-1 podman[244413]: 2025-09-30 21:44:42.427872739 +0000 UTC m=+0.028075505 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:44:42 compute-1 systemd[1]: Started libpod-conmon-20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f.scope.
Sep 30 21:44:42 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:44:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/907e9edb2952446468daf1ef0f4effe456fb57d51374fccd9899efcc3e2743d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:44:42 compute-1 podman[244413]: 2025-09-30 21:44:42.585566268 +0000 UTC m=+0.185769024 container init 20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:44:42 compute-1 podman[244413]: 2025-09-30 21:44:42.592187276 +0000 UTC m=+0.192390002 container start 20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:44:42 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [NOTICE]   (244432) : New worker (244434) forked
Sep 30 21:44:42 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [NOTICE]   (244432) : Loading success.
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.802 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268682.8008544, f5164a8d-e5aa-4bb7-9075-73debca4d516 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.802 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] VM Started (Lifecycle Event)
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.807 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.814 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.819 2 INFO nova.virt.libvirt.driver [-] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Instance spawned successfully.
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.821 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.828 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.834 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.858 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.860 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.861 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.862 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.863 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.864 2 DEBUG nova.virt.libvirt.driver [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.873 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.874 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268682.800974, f5164a8d-e5aa-4bb7-9075-73debca4d516 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.874 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] VM Paused (Lifecycle Event)
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.911 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.916 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268682.813605, f5164a8d-e5aa-4bb7-9075-73debca4d516 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.916 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] VM Resumed (Lifecycle Event)
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.952 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.957 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.986 2 INFO nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Took 9.03 seconds to spawn the instance on the hypervisor.
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.987 2 DEBUG nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:42 compute-1 nova_compute[192795]: 2025-09-30 21:44:42.994 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:44:43 compute-1 nova_compute[192795]: 2025-09-30 21:44:43.081 2 INFO nova.compute.manager [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Took 10.02 seconds to build instance.
Sep 30 21:44:43 compute-1 nova_compute[192795]: 2025-09-30 21:44:43.104 2 DEBUG oslo_concurrency.lockutils [None req-f2026ad8-a600-4c45-9bbb-b480cf3298c1 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:43 compute-1 nova_compute[192795]: 2025-09-30 21:44:43.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.025 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000093', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1ff42902541948f7a6df344fac87c2b7', 'user_id': 'c33a752ef8234bba917ace1e73763490', 'hostId': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.028 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000091', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'hostId': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.030 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f5164a8d-e5aa-4bb7-9075-73debca4d516 / tap4bdd817d-42 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.031 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.034 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 / tapc00d1cce-57 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.034 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fe96e1c-679d-4229-944b-56a3dc39050c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.028749', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad444664-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '8ce0e18d1f17261446b89a470fc0838ed99ea6266aba882062f3a3009c28ad32'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.028749', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad44b176-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': '1c08ba1cf83b073df40febfc16ee82380141505edf5b2c196adc47e1a9288b1b'}]}, 'timestamp': '2025-09-30 21:44:44.034639', '_unique_id': 'e3467a5a19534a4b9e5f77796d07e173'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.035 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.037 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.037 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85600d74-2922-451b-8501-4d193c4acab5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.037599', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad453308-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '64c40b52d6802e0c0ea770272ee2b21c541d37c0d213f80cb832b087929a6b59'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.037599', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad453f7e-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': '709defa29e6ac875695d24d5626a37a8fe2519f73f0ccd691912236da8b58bcf'}]}, 'timestamp': '2025-09-30 21:44:44.038267', '_unique_id': '7d23879e6fe44c95921f881396bdaeb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.038 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.057 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.read.requests volume: 745 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.057 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.077 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.078 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0858881b-5ac4-4bef-b5bd-54025edeab83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 745, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.040086', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad48317a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': 'acf7057a1f39fbf21e760dcc5b48c37812ffcda25584bdd09b172e220453eb33'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': 
'1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.040086', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad483d8c-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': '82edc3a7ea3f152da2c83358042b6ae01126cc710a97ef5ee40d4b99fae1439e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.040086', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad4b62fa-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': 'ae29b9505a80e2775eb0ce48bf92320e9a687ff31cdeeeac6cf701bc75e45edc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.040086', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad4b6fde-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': '18f61e003d868ad10aab4dd829fbab9462e335ba73ece1fb673fbc4f7c2d05e4'}]}, 'timestamp': '2025-09-30 21:44:44.078834', '_unique_id': 
'adcc4cf8f16d4dbab5a8704ba282d75c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.079 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.090 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.091 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.100 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.100 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fe69e7e-638d-4ae7-8fb5-a56f6525e40a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.081110', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad4d5376-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.81390083, 'message_signature': '16406d77d47869d8881d693afcc2891a5b58ea38b52de483f44a3e82b88cf8b1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 
'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.081110', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad4d6384-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.81390083, 'message_signature': '380eed5bd14593bf67a28846ad13566d19a49e928a5b1ad1412de4b3253a9fc6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.081110', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad4ecf9e-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.824381483, 'message_signature': '4cc613075fa8d816a3de1c3fa1aa5d4a9615b1d85b4a1e92fea1861bd9ea24eb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.081110', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad4edc82-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.824381483, 'message_signature': '283dfc1947adf2d2a7128af8bc54011c443331e85ec731bc74bb1ec1954cdc84'}]}, 'timestamp': '2025-09-30 21:44:44.101276', '_unique_id': 'ef5afe65225348f2a6f0cc6ab5b5ac9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection 
refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.102 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.103 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.104 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.104 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.write.latency volume: 2843515947 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.104 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d19615b-9690-469e-837e-023c7d8beb80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.103941', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad4f525c-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': 'ee6d26f8b0c36071d9bb60741bd149d23585418be19570e3d825eacf17f459b3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': 
'1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.103941', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad4f5ff4-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': 'a1fa8c159a57da368a2bbc5d6a26502623de34ad1259e0014e59be21f08ee64f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2843515947, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.103941', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad4f6c4c-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': 'b910a16a6a3f45c98a09a46cab970aee92506531c7e1de1a855f8b7ded8e5f9d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.103941', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad4f78b8-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': 'c38c592ed65ca0e2dcaf50345722cc71a03f97baf06afd6c680799bb24c3d6fc'}]}, 'timestamp': '2025-09-30 21:44:44.105346', '_unique_id': '433d41f087a24cf78e2d12dcc8623dc2'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.106 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.107 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.read.latency volume: 405657758 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.107 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.read.latency volume: 2903628 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 nova_compute[192795]: 2025-09-30 21:44:44.107 2 DEBUG nova.compute.manager [req-ae71e02f-3ab5-46b1-9129-1e98746598f9 req-8364d279-a6a5-493d-a694-8198d1e93682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:44 compute-1 nova_compute[192795]: 2025-09-30 21:44:44.107 2 DEBUG oslo_concurrency.lockutils [req-ae71e02f-3ab5-46b1-9129-1e98746598f9 req-8364d279-a6a5-493d-a694-8198d1e93682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.108 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.read.latency volume: 625080959 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 nova_compute[192795]: 2025-09-30 21:44:44.108 2 DEBUG oslo_concurrency.lockutils [req-ae71e02f-3ab5-46b1-9129-1e98746598f9 req-8364d279-a6a5-493d-a694-8198d1e93682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:44 compute-1 nova_compute[192795]: 2025-09-30 21:44:44.108 2 DEBUG oslo_concurrency.lockutils [req-ae71e02f-3ab5-46b1-9129-1e98746598f9 req-8364d279-a6a5-493d-a694-8198d1e93682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.108 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.read.latency volume: 116424140 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 nova_compute[192795]: 2025-09-30 21:44:44.108 2 DEBUG nova.compute.manager [req-ae71e02f-3ab5-46b1-9129-1e98746598f9 req-8364d279-a6a5-493d-a694-8198d1e93682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] No waiting events found dispatching network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:44 compute-1 nova_compute[192795]: 2025-09-30 21:44:44.108 2 WARNING nova.compute.manager [req-ae71e02f-3ab5-46b1-9129-1e98746598f9 req-8364d279-a6a5-493d-a694-8198d1e93682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received unexpected event network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a for instance with vm_state active and task_state None.
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62ad8e6c-40f6-4d3b-9337-cc028756e82d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 405657758, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.107383', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad4fd83a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': '5eb01fed0999a13a5f2dd577a72a4732848e08b57ef1d210cbc2dfe38f54a1b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2903628, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': 
'1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.107383', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad4fe474-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': 'c22626b3eec91a21b8de775e08a9fe9cf84af944e869060a7cacd06acdf6cb5a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 625080959, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.107383', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad4ff0a4-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': '2f9c0f5fa61c8f41d0f6ec66755b4ea279bf048f44ce5d77ae3372310b21352e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 116424140, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.107383', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad4ffd74-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': 'e460768f9a91dd5af991f9011de91887725e7dd358704ce84121e906bc879665'}]}, 'timestamp': '2025-09-30 21:44:44.108663', '_unique_id': 
'0bdf8af6bab248f6a0bbb954fe8627c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.109 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.110 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.111 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.incoming.bytes volume: 2142 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6cedd36-cb52-4743-90cb-54c1a77ac2fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.110704', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad505a62-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '09f0ea1e63d153cfcc253a5206aa54c2d9fc82340ffd21be8196fc337cc4468c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 2142, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.110704', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad506872-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': '1bd0197779a093af1d3c9ba0d8d93dfe498a26ef2662bc89e2cb507f335f7cd3'}]}, 'timestamp': '2025-09-30 21:44:44.111439', '_unique_id': '8f2be9ef2d974104a3bafe604ff95d15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.112 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.113 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.113 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96d3a4f2-a8fc-46af-b7c5-dea7bd326b34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.113346', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad50c290-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '5a00790538594f4b11a0f37121e88f3a043bada049d6e1be1865910092afe8ff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.113346', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad50cf2e-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': 'e12866d2e6f96a007da4ce99c0ce151a6d31ecc2dc43f99e237857a2d92df1b3'}]}, 'timestamp': '2025-09-30 21:44:44.114035', '_unique_id': '52d91df5fc494d06be9d9aaf3cc0511f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.115 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.130 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/cpu volume: 1230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.143 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/cpu volume: 11640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91e5b70c-ad98-466b-92bd-99595171f80d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1230000000, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'timestamp': '2025-09-30T21:44:44.115964', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ad537558-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.863504534, 'message_signature': '469bb3d200f59bea0c23abe2d870fdcfe93a06f18d717a29bf1af9de2bd264b0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11640000000, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 
'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'timestamp': '2025-09-30T21:44:44.115964', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ad554aa4-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.875695571, 'message_signature': '17369cf1fe88e0e0b14ed8ddfe659f42dce0133dbc61f941b6d38f49afafcc65'}]}, 'timestamp': '2025-09-30 21:44:44.143463', '_unique_id': 'b9c4774af5114ae9ae7edd3a1c970e2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.144 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.146 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.146 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.146 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.147 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d732a31-2f52-4098-8e3c-0dca29edeec0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.146177', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad55c560-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.81390083, 'message_signature': '2d3c22d94923d12c401f8ed4e956945adcb22035984f76a13583e2b90d66ae96'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': 
None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.146177', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad55d1c2-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.81390083, 'message_signature': '3072ddcb772f10e7f92dad47a03e0f9d07e282cda93235e33af3f127419f0018'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.146177', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad55dcda-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.824381483, 'message_signature': 'd4b76db45cf6114dc51b1dd8bf5faf5eb29f718eb409dae7a02048420559e5aa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.146177', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad55e7ca-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.824381483, 'message_signature': 'f424b7235592771da5700de4107271282fbe3554669dfb62100ecdaf4a6bde95'}]}, 'timestamp': '2025-09-30 21:44:44.147435', '_unique_id': 'aa72c687753f4f5793474b554aaaa90b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.148 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.149 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.149 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ee18fef-14ec-48ca-8b15-6266f83b3c12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.149370', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad5640c6-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '6b655e9cac4898b5774be0b0957b14788670e084ae2f453954baba711dacdb78'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.149370', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad564f08-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': '28d87eab31ae9adf93e11157a8fe613c556aae1ee28588b45a5772d458a5ec4f'}]}, 'timestamp': '2025-09-30 21:44:44.150078', '_unique_id': '0abae0933a884de68f7c7d7b5b6a1cda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.150 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.152 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.read.bytes volume: 23291392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.152 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.152 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.153 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '451c3af7-c089-44df-926c-3d2210dd5c94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23291392, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.152136', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad56ae8a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': '3dd1d00d44d1dbf9d9d0e9dfa176049fff7b4f83bdfdbe664d5bd2c0ce0cc7c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.152136', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad56ba56-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': '92ebefb3856649f4c7d6cecd82b03c36736f4517b8cd72fe9c2668778e939fe5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.152136', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad56c640-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': '79edf8bd66ea3f19ae8cf04fe91a5133e00e6b0fbf8206fc813f984496152678'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.152136', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad56d158-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': 'fb73ddc8c7af2de5f63898c38e3a441d484b8b7390449aeb3c482126075e3536'}]}, 'timestamp': '2025-09-30 21:44:44.153433', '_unique_id': '4383c67bd6d34da283a99d0741833984'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.154 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.155 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.155 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fedd789d-d1af-40e1-9921-52498c377c41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.155325', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad5728e2-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '84b82a69b1f72fe88f148323509a04ad1a455f4bb66c3880981760bb724dbb4a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.155325', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad573526-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': 'b6c636ff7480adc5704e0255eec8d215ef117109eeebf1b8f07c095257dbf348'}]}, 'timestamp': '2025-09-30 21:44:44.155972', '_unique_id': 'f49ee809e5a44cd9acbb31d9159343f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.156 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.157 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.158 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>]
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.158 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.158 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26bdc449-a150-4edb-afc5-38ef11763b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.158533', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad57a650-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': 'c40fe497f348280de55c95ffab483e81f258ca5fd7d6a3c66841e9a0a1fbea88'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.158533', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad57b2ee-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': 'a3511e6b4be190f9ce46e59ac061a29d0e0b745703a663bef54d54af3fa7463b'}]}, 'timestamp': '2025-09-30 21:44:44.159194', '_unique_id': '57e187ab0d754a17926ee2cb3f1762b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.159 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.160 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.161 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.161 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f5164a8d-e5aa-4bb7-9075-73debca4d516: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.161 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/memory.usage volume: 40.41015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '156aa864-5a9e-4876-82a5-b100d8251bdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.41015625, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'timestamp': '2025-09-30T21:44:44.161100', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ad5816b2-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.875695571, 'message_signature': '8bab28e4b2da579795e40c8266973e162533afad26409abf4645b0de05c7b11c'}]}, 'timestamp': '2025-09-30 21:44:44.161748', '_unique_id': '4f2b8a5fd2d14f61812ebb75389a5380'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.162 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.163 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34485a13-34d3-4159-94a1-a6805affbd30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.163672', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad586ee6-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '9d639a1373e7d11a8a878732b903069b132191d6a94cb9f60e6e7de2c821f1dc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 1284, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.163672', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad587b7a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': '5b52d68a4d7424259bdd8121d3ca6baf93d7d5748a8c695e0ba2c35eb50194a8'}]}, 'timestamp': '2025-09-30 21:44:44.164349', '_unique_id': '885257df96c8484da2a09f6289cbdee1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.164 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.166 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.166 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e04ee53d-9e15-48a2-ae65-571fc67f687c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.166364', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad58d980-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '4c2484b19ad6a9ae66fd3a3713cd7044ff299e82d5b154cba7cd39a16e9e26a0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 16, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.166364', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad58e768-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': '67d6f9dd680ae324cfd3ef7dedd783d63cac2bd63a17562bfb3bce91cf8ef1a1'}]}, 'timestamp': '2025-09-30 21:44:44.167097', '_unique_id': 'a4f1d0cc37fc40d2b6a43fc803941055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.167 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.169 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.169 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.169 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.write.bytes volume: 72781824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.170 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68a52a45-e4a7-475c-8178-2947adf5f2ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.169123', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad594726-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': '90d0baaf2bb2f1158dec9991b139467ebdf78420d1cc2e3619ec23c1cf7f1371'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.169123', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad5954a0-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': '737ff0521fc5d218ebb961af013bff744d6b30effce6fadb3d32ee6a2ee3de0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72781824, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.169123', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad59604e-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': '99c5c8f32d4e68ee91848f16e31b2533aa965048c829c1ddd4b79a028708ab0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.169123', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad596cb0-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': '0ca709b558e37b318a155d6c75a554081155bd88b741bd1deb8224f009e05e3f'}]}, 'timestamp': '2025-09-30 21:44:44.170489', '_unique_id': '9c3b3b3e32d74c838e6479ac37ef3cd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.171 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.172 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.172 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c18fa1ad-b029-4877-bee4-70d965ba892c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-00000093-f5164a8d-e5aa-4bb7-9075-73debca4d516-tap4bdd817d-42', 'timestamp': '2025-09-30T21:44:44.172149', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'tap4bdd817d-42', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8e:b2:a2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bdd817d-42'}, 'message_id': 'ad59b990-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.761513463, 'message_signature': '33b01affc0dfe469584788b33e0769a790568e9feafc89fd4791a45d5ead0fec'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000091-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-tapc00d1cce-57', 'timestamp': '2025-09-30T21:44:44.172149', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'tapc00d1cce-57', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e7:65:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc00d1cce-57'}, 'message_id': 'ad59c3c2-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.764726809, 'message_signature': 'c59eb5f5cc8fa05d61e15f291193e5cc8887de19fe21815d9a79eaffe9b40265'}]}, 'timestamp': '2025-09-30 21:44:44.172694', '_unique_id': 'd21f80b476e14749be94eae94877b7b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.173 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.174 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.174 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>]
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.174 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.174 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.175 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.175 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4bdebc3-7262-4aa3-8a47-d9e111218471', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.174560', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad5a1886-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': 'c5eb78d4604f5d9b5835ec9422dd376d2604dab3766eb18b32cdc5b3e19009af'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': 
'1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.174560', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad5a21dc-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.772854587, 'message_signature': '30e77136b43d8c220d1e3f50f4b275a1126c3f238ddb31dbbecd39ee316a9f78'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.174560', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad5a2ca4-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': '2265635015997a6662344c5a571fd36821d06aafc156715da16e1b3083ff2bbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.174560', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad5a36e0-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.790592803, 'message_signature': 'b1b7fe54d1cbd5d076cd18777a087a53f1c571a59eddbcba2a25c980c0f933a9'}]}, 'timestamp': '2025-09-30 21:44:44.175633', '_unique_id': '9a18c1260c744d6ea77ee1fd5cdf5e88'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.176 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.177 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.177 12 DEBUG ceilometer.compute.pollsters [-] f5164a8d-e5aa-4bb7-9075-73debca4d516/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.177 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 DEBUG ceilometer.compute.pollsters [-] 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31d03cb1-aaf0-4091-9720-06834929322c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-vda', 'timestamp': '2025-09-30T21:44:44.177160', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad5a8424-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.81390083, 'message_signature': '92ec0f59b0fee9fa7c3ccd5ffa73ed41387629b3b028145fead19c24121a40e7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 
'project_name': None, 'resource_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516-sda', 'timestamp': '2025-09-30T21:44:44.177160', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737', 'name': 'instance-00000093', 'instance_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'instance_type': 'm1.nano', 'host': '6653a52b812f680bca8f0eb19b0e5cc5fbb9b55d2d6c25117fc2073c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad5a8df2-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.81390083, 'message_signature': '002e649e315daca9f3e1e4a3e5f68107b5834c63dfbf327ffe282469264da8e9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-vda', 'timestamp': '2025-09-30T21:44:44.177160', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad5a96f8-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.824381483, 'message_signature': '361252e0b3222f6289ddd673272be702bd274e582a1abf8f7bfeadcd378dde75'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-sda', 'timestamp': '2025-09-30T21:44:44.177160', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1387484208', 'name': 'instance-00000091', 'instance_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad5a9fcc-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5379.824381483, 'message_signature': 'd21be2695465b44ead05f187b4820366a0f2d03763ec5a6c8eca9042b7a71584'}]}, 'timestamp': '2025-09-30 21:44:44.178346', '_unique_id': '62af673336904f378c32aefbc0bdddf7'}: kombu.exceptions.OperationalError: [Errno 111] Connection 
refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.178 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.179 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.180 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>]
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.180 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:44:44.180 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1387484208>]
Sep 30 21:44:45 compute-1 nova_compute[192795]: 2025-09-30 21:44:45.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:46.861 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:46.862 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:44:46 compute-1 nova_compute[192795]: 2025-09-30 21:44:46.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:47 compute-1 podman[244443]: 2025-09-30 21:44:47.239015617 +0000 UTC m=+0.066551089 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Sep 30 21:44:47 compute-1 podman[244445]: 2025-09-30 21:44:47.25323842 +0000 UTC m=+0.070577258 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 21:44:47 compute-1 podman[244444]: 2025-09-30 21:44:47.266800804 +0000 UTC m=+0.090188655 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:44:47 compute-1 nova_compute[192795]: 2025-09-30 21:44:47.512 2 INFO nova.compute.manager [None req-ca1bb9a9-ebc8-42b9-880d-4607a9715fd9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Get console output
Sep 30 21:44:47 compute-1 nova_compute[192795]: 2025-09-30 21:44:47.522 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:44:48 compute-1 nova_compute[192795]: 2025-09-30 21:44:48.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:49 compute-1 nova_compute[192795]: 2025-09-30 21:44:49.600 2 DEBUG nova.compute.manager [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-changed-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:49 compute-1 nova_compute[192795]: 2025-09-30 21:44:49.600 2 DEBUG nova.compute.manager [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Refreshing instance network info cache due to event network-changed-4bdd817d-4233-447a-a80b-3afc021a638a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:49 compute-1 nova_compute[192795]: 2025-09-30 21:44:49.601 2 DEBUG oslo_concurrency.lockutils [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:49 compute-1 nova_compute[192795]: 2025-09-30 21:44:49.601 2 DEBUG oslo_concurrency.lockutils [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:49 compute-1 nova_compute[192795]: 2025-09-30 21:44:49.601 2 DEBUG nova.network.neutron [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Refreshing network info cache for port 4bdd817d-4233-447a-a80b-3afc021a638a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:49 compute-1 nova_compute[192795]: 2025-09-30 21:44:49.687 2 INFO nova.compute.manager [None req-538c472b-23e4-4922-afc6-618709de0001 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Get console output
Sep 30 21:44:49 compute-1 nova_compute[192795]: 2025-09-30 21:44:49.693 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:44:50 compute-1 nova_compute[192795]: 2025-09-30 21:44:50.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:50 compute-1 nova_compute[192795]: 2025-09-30 21:44:50.941 2 DEBUG nova.network.neutron [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updated VIF entry in instance network info cache for port 4bdd817d-4233-447a-a80b-3afc021a638a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:44:50 compute-1 nova_compute[192795]: 2025-09-30 21:44:50.941 2 DEBUG nova.network.neutron [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updating instance_info_cache with network_info: [{"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:50 compute-1 nova_compute[192795]: 2025-09-30 21:44:50.965 2 DEBUG oslo_concurrency.lockutils [req-0c99191e-445b-411d-8aa1-462042364f4f req-07b86151-19e3-486e-a34d-1641ba668b99 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:51.865 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:52 compute-1 nova_compute[192795]: 2025-09-30 21:44:52.862 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:52 compute-1 nova_compute[192795]: 2025-09-30 21:44:52.862 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:52 compute-1 nova_compute[192795]: 2025-09-30 21:44:52.862 2 DEBUG nova.network.neutron [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:44:53 compute-1 nova_compute[192795]: 2025-09-30 21:44:53.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.115 2 DEBUG nova.network.neutron [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.152 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.326 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.327 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Creating file /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/cdbfaa030b694c65835c3a7c9e18333c.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.327 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/cdbfaa030b694c65835c3a7c9e18333c.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.826 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/cdbfaa030b694c65835c3a7c9e18333c.tmp" returned: 1 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.827 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/cdbfaa030b694c65835c3a7c9e18333c.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.828 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Creating directory /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:44:54 compute-1 nova_compute[192795]: 2025-09-30 21:44:54.828 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:55 compute-1 nova_compute[192795]: 2025-09-30 21:44:55.054 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:55 compute-1 nova_compute[192795]: 2025-09-30 21:44:55.059 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:44:55 compute-1 nova_compute[192795]: 2025-09-30 21:44:55.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:56 compute-1 podman[244524]: 2025-09-30 21:44:56.258635552 +0000 UTC m=+0.094617533 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:44:56 compute-1 nova_compute[192795]: 2025-09-30 21:44:56.701 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:56 compute-1 nova_compute[192795]: 2025-09-30 21:44:56.701 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:56 compute-1 ovn_controller[94902]: 2025-09-30T21:44:56Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8e:b2:a2 10.100.0.13
Sep 30 21:44:56 compute-1 ovn_controller[94902]: 2025-09-30T21:44:56Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:b2:a2 10.100.0.13
Sep 30 21:44:56 compute-1 nova_compute[192795]: 2025-09-30 21:44:56.884 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:56 compute-1 nova_compute[192795]: 2025-09-30 21:44:56.884 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:56 compute-1 nova_compute[192795]: 2025-09-30 21:44:56.884 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:56 compute-1 nova_compute[192795]: 2025-09-30 21:44:56.885 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:44:56 compute-1 nova_compute[192795]: 2025-09-30 21:44:56.979 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.056 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.057 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.112 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.118 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.178 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.180 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:57 compute-1 kernel: tapc00d1cce-57 (unregistering): left promiscuous mode
Sep 30 21:44:57 compute-1 NetworkManager[51724]: <info>  [1759268697.2612] device (tapc00d1cce-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.264 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00568|binding|INFO|Releasing lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 from this chassis (sb_readonly=0)
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00569|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 down in Southbound
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00570|binding|INFO|Removing iface tapc00d1cce-57 ovn-installed in OVS
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.288 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.289 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 unbound from our chassis
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.291 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 033e9c33-7065-4faf-8a4b-e2705c450c67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.292 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[371a190b-0c97-4344-a3f9-44a9a92d3d5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.293 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 namespace which is not needed anymore
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:57 compute-1 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000091.scope: Deactivated successfully.
Sep 30 21:44:57 compute-1 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000091.scope: Consumed 13.915s CPU time.
Sep 30 21:44:57 compute-1 systemd-machined[152783]: Machine qemu-67-instance-00000091 terminated.
Sep 30 21:44:57 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244178]: [NOTICE]   (244182) : haproxy version is 2.8.14-c23fe91
Sep 30 21:44:57 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244178]: [NOTICE]   (244182) : path to executable is /usr/sbin/haproxy
Sep 30 21:44:57 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244178]: [WARNING]  (244182) : Exiting Master process...
Sep 30 21:44:57 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244178]: [ALERT]    (244182) : Current worker (244184) exited with code 143 (Terminated)
Sep 30 21:44:57 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244178]: [WARNING]  (244182) : All workers exited. Exiting... (0)
Sep 30 21:44:57 compute-1 systemd[1]: libpod-94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e.scope: Deactivated successfully.
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.457 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.459 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5289MB free_disk=73.24279022216797GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.459 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.459 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:57 compute-1 podman[244581]: 2025-09-30 21:44:57.463975607 +0000 UTC m=+0.069543329 container died 94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:44:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e-userdata-shm.mount: Deactivated successfully.
Sep 30 21:44:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-9b061a1f63e9c2820b5ae17850d435996d788002035a552cf008bf731b886d7d-merged.mount: Deactivated successfully.
Sep 30 21:44:57 compute-1 podman[244581]: 2025-09-30 21:44:57.505473893 +0000 UTC m=+0.111041615 container cleanup 94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:44:57 compute-1 systemd[1]: libpod-conmon-94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e.scope: Deactivated successfully.
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.524 2 INFO nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating resource usage from migration db92c3ea-df74-490b-9ee5-b72fdac10a6a
Sep 30 21:44:57 compute-1 kernel: tapc00d1cce-57: entered promiscuous mode
Sep 30 21:44:57 compute-1 NetworkManager[51724]: <info>  [1759268697.5256] manager: (tapc00d1cce-57): new Tun device (/org/freedesktop/NetworkManager/Devices/284)
Sep 30 21:44:57 compute-1 kernel: tapc00d1cce-57 (unregistering): left promiscuous mode
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00571|binding|INFO|Claiming lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for this chassis.
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00572|binding|INFO|c00d1cce-5707-41ba-9ca0-2aeecbf662d8: Claiming fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.579 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00573|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 ovn-installed in OVS
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00574|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 up in Southbound
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:57 compute-1 podman[244613]: 2025-09-30 21:44:57.608836441 +0000 UTC m=+0.062088130 container remove 94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.618 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e277d77-bfd5-4676-89d4-768798a12946]: (4, ('Tue Sep 30 09:44:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 (94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e)\n94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e\nTue Sep 30 09:44:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 (94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e)\n94b180efa3e51b46b943042adba01f53ce45f68a718a059371371f3edbb9718e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.620 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[79ea73bf-c3c7-40c0-ab86-9a28ff79e4a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.622 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap033e9c33-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:57 compute-1 kernel: tap033e9c33-70: left promiscuous mode
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.643 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance f5164a8d-e5aa-4bb7-9075-73debca4d516 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.644 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Migration db92c3ea-df74-490b-9ee5-b72fdac10a6a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.644 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.644 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00575|binding|INFO|Releasing lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 from this chassis (sb_readonly=0)
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00576|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 down in Southbound
Sep 30 21:44:57 compute-1 ovn_controller[94902]: 2025-09-30T21:44:57Z|00577|binding|INFO|Removing iface tapc00d1cce-57 ovn-installed in OVS
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.649 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8753393b-218f-4c8f-92a4-9a049ea9d59b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.651 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.683 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[41769515-0492-4984-9833-72fd8fef760f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.685 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[da52b5b0-a8ab-4676-9ba3-0a0a649e84ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.697 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.704 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8405681d-9cf1-457a-832b-697f0b26a384]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536449, 'reachable_time': 23156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244638, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.709 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.709 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[cf306051-0350-4634-9f25-ad852b5922bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.710 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 unbound from our chassis
Sep 30 21:44:57 compute-1 systemd[1]: run-netns-ovnmeta\x2d033e9c33\x2d7065\x2d4faf\x2d8a4b\x2de2705c450c67.mount: Deactivated successfully.
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.712 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 033e9c33-7065-4faf-8a4b-e2705c450c67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.713 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cb22da23-c91b-4369-a90b-757eed1bda2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.714 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 unbound from our chassis
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.714 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.715 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 033e9c33-7065-4faf-8a4b-e2705c450c67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:44:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:44:57.716 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[56e385b2-9914-47f7-a7c8-cae4c21624bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.746 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:44:57 compute-1 nova_compute[192795]: 2025-09-30 21:44:57.747 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.084 2 INFO nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance shutdown successfully after 3 seconds.
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.093 2 INFO nova.virt.libvirt.driver [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance destroyed successfully.
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.095 2 DEBUG nova.virt.libvirt.vif [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:44:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:44:52Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2107888420", "vif_mac": "fa:16:3e:e7:65:6e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.095 2 DEBUG nova.network.os_vif_util [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2107888420", "vif_mac": "fa:16:3e:e7:65:6e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.100 2 DEBUG nova.network.os_vif_util [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.101 2 DEBUG os_vif [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.105 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc00d1cce-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.115 2 INFO os_vif [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57')
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.120 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.163 2 DEBUG nova.compute.manager [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.164 2 DEBUG oslo_concurrency.lockutils [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.164 2 DEBUG oslo_concurrency.lockutils [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.164 2 DEBUG oslo_concurrency.lockutils [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.165 2 DEBUG nova.compute.manager [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.165 2 WARNING nova.compute.manager [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.216 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.217 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.283 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.285 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Copying file /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk to 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.286 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.912 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "scp -r /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk" returned: 0 in 0.626s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.912 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Copying file /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:44:58 compute-1 nova_compute[192795]: 2025-09-30 21:44:58.913 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk.config 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:59 compute-1 nova_compute[192795]: 2025-09-30 21:44:59.150 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "scp -C -r /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk.config 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config" returned: 0 in 0.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:59 compute-1 nova_compute[192795]: 2025-09-30 21:44:59.151 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Copying file /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:44:59 compute-1 nova_compute[192795]: 2025-09-30 21:44:59.151 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk.info 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:59 compute-1 nova_compute[192795]: 2025-09-30 21:44:59.410 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "scp -C -r /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_resize/disk.info 192.168.122.100:/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:59 compute-1 nova_compute[192795]: 2025-09-30 21:44:59.740 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:59 compute-1 nova_compute[192795]: 2025-09-30 21:44:59.740 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:59 compute-1 nova_compute[192795]: 2025-09-30 21:44:59.740 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.328 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.328 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.328 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.329 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.329 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.329 2 WARNING nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.329 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.329 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.329 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.330 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.330 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.330 2 WARNING nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.469 2 DEBUG neutronclient.v2_0.client [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.599 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.600 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:00 compute-1 nova_compute[192795]: 2025-09-30 21:45:00.600 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:01 compute-1 anacron[105422]: Job `cron.weekly' started
Sep 30 21:45:01 compute-1 anacron[105422]: Job `cron.weekly' terminated
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.466 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.466 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.466 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.466 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.466 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.467 2 WARNING nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.467 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.467 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.467 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.467 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.467 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.467 2 WARNING nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.468 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.468 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.468 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.468 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.468 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.468 2 WARNING nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:45:02 compute-1 nova_compute[192795]: 2025-09-30 21:45:02.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:03 compute-1 nova_compute[192795]: 2025-09-30 21:45:03.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:03 compute-1 nova_compute[192795]: 2025-09-30 21:45:03.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:04 compute-1 nova_compute[192795]: 2025-09-30 21:45:04.552 2 DEBUG nova.compute.manager [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:04 compute-1 nova_compute[192795]: 2025-09-30 21:45:04.553 2 DEBUG nova.compute.manager [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing instance network info cache due to event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:04 compute-1 nova_compute[192795]: 2025-09-30 21:45:04.553 2 DEBUG oslo_concurrency.lockutils [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:04 compute-1 nova_compute[192795]: 2025-09-30 21:45:04.553 2 DEBUG oslo_concurrency.lockutils [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:04 compute-1 nova_compute[192795]: 2025-09-30 21:45:04.553 2 DEBUG nova.network.neutron [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:05 compute-1 podman[244656]: 2025-09-30 21:45:05.238276005 +0000 UTC m=+0.061808182 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:45:05 compute-1 podman[244654]: 2025-09-30 21:45:05.239111497 +0000 UTC m=+0.070273500 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:45:05 compute-1 podman[244655]: 2025-09-30 21:45:05.261411196 +0000 UTC m=+0.092543078 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:45:05 compute-1 nova_compute[192795]: 2025-09-30 21:45:05.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:05 compute-1 nova_compute[192795]: 2025-09-30 21:45:05.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:06 compute-1 nova_compute[192795]: 2025-09-30 21:45:06.755 2 DEBUG nova.network.neutron [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated VIF entry in instance network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:06 compute-1 nova_compute[192795]: 2025-09-30 21:45:06.756 2 DEBUG nova.network.neutron [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:06 compute-1 nova_compute[192795]: 2025-09-30 21:45:06.782 2 DEBUG oslo_concurrency.lockutils [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.925 2 DEBUG nova.compute.manager [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.925 2 DEBUG oslo_concurrency.lockutils [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.926 2 DEBUG oslo_concurrency.lockutils [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.926 2 DEBUG oslo_concurrency.lockutils [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.926 2 DEBUG nova.compute.manager [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:08 compute-1 nova_compute[192795]: 2025-09-30 21:45:08.926 2 WARNING nova.compute.manager [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_finish.
Sep 30 21:45:10 compute-1 nova_compute[192795]: 2025-09-30 21:45:10.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:10 compute-1 nova_compute[192795]: 2025-09-30 21:45:10.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:45:10 compute-1 nova_compute[192795]: 2025-09-30 21:45:10.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:45:10 compute-1 podman[244721]: 2025-09-30 21:45:10.806482927 +0000 UTC m=+0.071809441 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:45:10 compute-1 nova_compute[192795]: 2025-09-30 21:45:10.924 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:10 compute-1 nova_compute[192795]: 2025-09-30 21:45:10.924 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:10 compute-1 nova_compute[192795]: 2025-09-30 21:45:10.924 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:45:10 compute-1 nova_compute[192795]: 2025-09-30 21:45:10.925 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f5164a8d-e5aa-4bb7-9075-73debca4d516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:11 compute-1 nova_compute[192795]: 2025-09-30 21:45:11.085 2 DEBUG nova.compute.manager [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:11 compute-1 nova_compute[192795]: 2025-09-30 21:45:11.086 2 DEBUG oslo_concurrency.lockutils [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:11 compute-1 nova_compute[192795]: 2025-09-30 21:45:11.086 2 DEBUG oslo_concurrency.lockutils [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:11 compute-1 nova_compute[192795]: 2025-09-30 21:45:11.086 2 DEBUG oslo_concurrency.lockutils [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:11 compute-1 nova_compute[192795]: 2025-09-30 21:45:11.086 2 DEBUG nova.compute.manager [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:11 compute-1 nova_compute[192795]: 2025-09-30 21:45:11.086 2 WARNING nova.compute.manager [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.622 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268697.6215465, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.623 2 INFO nova.compute.manager [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Stopped (Lifecycle Event)
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.644 2 DEBUG nova.compute.manager [None req-1b10e3ef-ad7b-42e5-9c77-f0ded30dac0d - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.647 2 DEBUG nova.compute.manager [None req-1b10e3ef-ad7b-42e5-9c77-f0ded30dac0d - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.669 2 INFO nova.compute.manager [None req-1b10e3ef-ad7b-42e5-9c77-f0ded30dac0d - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.835 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updating instance_info_cache with network_info: [{"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.891 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.891 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:45:12 compute-1 nova_compute[192795]: 2025-09-30 21:45:12.891 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.190 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.191 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.191 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.191 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.191 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.192 2 WARNING nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.192 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.192 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.192 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.193 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.193 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.193 2 WARNING nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.473 2 INFO nova.compute.manager [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Swapping old allocation on dict_keys(['e551d5b4-e9f6-409e-b2a1-508a20c11333']) held by migration db92c3ea-df74-490b-9ee5-b72fdac10a6a for instance
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.502 2 DEBUG nova.scheduler.client.report [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Overwriting current allocation {'allocations': {'fe423b93-de5a-41f7-97d1-9622ea46af54': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 75}}, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'consumer_generation': 1} on consumer 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:13 compute-1 nova_compute[192795]: 2025-09-30 21:45:13.764 2 INFO nova.network.neutron [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:45:15 compute-1 nova_compute[192795]: 2025-09-30 21:45:15.298 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:15 compute-1 nova_compute[192795]: 2025-09-30 21:45:15.299 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:15 compute-1 nova_compute[192795]: 2025-09-30 21:45:15.299 2 DEBUG nova.network.neutron [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:45:15 compute-1 nova_compute[192795]: 2025-09-30 21:45:15.433 2 DEBUG nova.compute.manager [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:15 compute-1 nova_compute[192795]: 2025-09-30 21:45:15.433 2 DEBUG nova.compute.manager [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing instance network info cache due to event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:15 compute-1 nova_compute[192795]: 2025-09-30 21:45:15.434 2 DEBUG oslo_concurrency.lockutils [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.572 2 DEBUG nova.network.neutron [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.599 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.600 2 DEBUG nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.605 2 DEBUG oslo_concurrency.lockutils [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.605 2 DEBUG nova.network.neutron [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.616 2 DEBUG nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Start _get_guest_xml network_info=[{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.622 2 WARNING nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.634 2 DEBUG nova.virt.libvirt.host [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.635 2 DEBUG nova.virt.libvirt.host [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.639 2 DEBUG nova.virt.libvirt.host [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.640 2 DEBUG nova.virt.libvirt.host [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.642 2 DEBUG nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.643 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.644 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.644 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.645 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.646 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.646 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.647 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.648 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.648 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.648 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.649 2 DEBUG nova.virt.hardware [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.650 2 DEBUG nova.objects.instance [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.685 2 DEBUG oslo_concurrency.processutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.773 2 DEBUG oslo_concurrency.processutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.776 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.777 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.779 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.781 2 DEBUG nova.virt.libvirt.vif [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:45:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:45:12Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.782 2 DEBUG nova.network.os_vif_util [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.784 2 DEBUG nova.network.os_vif_util [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.789 2 DEBUG nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <uuid>7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</uuid>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <name>instance-00000091</name>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1387484208</nova:name>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:45:16</nova:creationTime>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         <nova:port uuid="c00d1cce-5707-41ba-9ca0-2aeecbf662d8">
Sep 30 21:45:16 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <system>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <entry name="serial">7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</entry>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <entry name="uuid">7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</entry>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </system>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <os>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   </os>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <features>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   </features>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:e7:65:6e"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <target dev="tapc00d1cce-57"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/console.log" append="off"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <video>
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </video>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:45:16 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:45:16 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:45:16 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:45:16 compute-1 nova_compute[192795]: </domain>
Sep 30 21:45:16 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.792 2 DEBUG nova.compute.manager [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Preparing to wait for external event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.792 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.793 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.793 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.794 2 DEBUG nova.virt.libvirt.vif [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:45:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:45:12Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.795 2 DEBUG nova.network.os_vif_util [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.796 2 DEBUG nova.network.os_vif_util [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.796 2 DEBUG os_vif [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.798 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.798 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.803 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc00d1cce-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.804 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc00d1cce-57, col_values=(('external_ids', {'iface-id': 'c00d1cce-5707-41ba-9ca0-2aeecbf662d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:65:6e', 'vm-uuid': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 NetworkManager[51724]: <info>  [1759268716.8070] manager: (tapc00d1cce-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.812 2 INFO os_vif [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57')
Sep 30 21:45:16 compute-1 kernel: tapc00d1cce-57: entered promiscuous mode
Sep 30 21:45:16 compute-1 NetworkManager[51724]: <info>  [1759268716.9113] manager: (tapc00d1cce-57): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 ovn_controller[94902]: 2025-09-30T21:45:16Z|00578|binding|INFO|Claiming lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for this chassis.
Sep 30 21:45:16 compute-1 ovn_controller[94902]: 2025-09-30T21:45:16Z|00579|binding|INFO|c00d1cce-5707-41ba-9ca0-2aeecbf662d8: Claiming fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.924 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '12', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.925 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 bound to our chassis
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.928 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:45:16 compute-1 ovn_controller[94902]: 2025-09-30T21:45:16Z|00580|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 ovn-installed in OVS
Sep 30 21:45:16 compute-1 ovn_controller[94902]: 2025-09-30T21:45:16Z|00581|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 up in Southbound
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 systemd-udevd[244759]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.943 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1c7685-d9d7-42e2-8ba2-29ad01b6b9cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.944 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap033e9c33-71 in ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:45:16 compute-1 nova_compute[192795]: 2025-09-30 21:45:16.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.948 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap033e9c33-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.949 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[81363680-b86a-4775-b759-461010ee6619]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.949 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b09e47d5-4c46-441a-b966-77c7f2ba579a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:16 compute-1 NetworkManager[51724]: <info>  [1759268716.9566] device (tapc00d1cce-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:45:16 compute-1 NetworkManager[51724]: <info>  [1759268716.9573] device (tapc00d1cce-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.959 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[35e6d2d2-1330-4ceb-a77f-b20199e0b678]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:16 compute-1 systemd-machined[152783]: New machine qemu-69-instance-00000091.
Sep 30 21:45:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:16.986 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea50a46-9113-40ec-b5f8-787c9345b8c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:16 compute-1 systemd[1]: Started Virtual Machine qemu-69-instance-00000091.
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.015 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[18b3aac2-59fc-4690-b33d-1296969ed848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.020 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2371f0d3-42ca-4bf3-a4cc-495ec57f75d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 NetworkManager[51724]: <info>  [1759268717.0225] manager: (tap033e9c33-70): new Veth device (/org/freedesktop/NetworkManager/Devices/287)
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.059 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[682106cd-7483-4caf-b8e1-b997ca2581e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.063 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[64756a49-7ede-4a13-9bde-c6a481dcaed4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 NetworkManager[51724]: <info>  [1759268717.0884] device (tap033e9c33-70): carrier: link connected
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.095 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2555f161-4062-4123-90c8-f6fbfae5a8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.115 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea92910-7a26-4ab5-bedf-f710459d7dc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap033e9c33-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541276, 'reachable_time': 16690, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244794, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.136 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a6d1c3df-c345-41ac-b05a-9c90417ebd70]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:16ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 541276, 'tstamp': 541276}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244795, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.161 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[610bdbe0-d5b9-442d-8d3d-487addbdfaff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap033e9c33-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541276, 'reachable_time': 16690, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244796, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.207 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[692b1b8a-cbad-4cc1-b614-5fb1814cd315]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.277 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[084c4090-9b1e-484b-ba87-b39e4873adcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.278 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap033e9c33-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.278 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.278 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap033e9c33-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.314 2 DEBUG nova.compute.manager [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.315 2 DEBUG oslo_concurrency.lockutils [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.315 2 DEBUG oslo_concurrency.lockutils [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:17 compute-1 kernel: tap033e9c33-70: entered promiscuous mode
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.316 2 DEBUG oslo_concurrency.lockutils [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.317 2 DEBUG nova.compute.manager [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Processing event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:17 compute-1 NetworkManager[51724]: <info>  [1759268717.3190] manager: (tap033e9c33-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.319 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap033e9c33-70, col_values=(('external_ids', {'iface-id': 'c20141c7-d465-400d-879d-d68081c3646d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:17 compute-1 ovn_controller[94902]: 2025-09-30T21:45:17Z|00582|binding|INFO|Releasing lport c20141c7-d465-400d-879d-d68081c3646d from this chassis (sb_readonly=0)
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.326 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.327 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c795603f-7465-4867-92e7-72faf28e4111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.328 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:45:17 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:17.329 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'env', 'PROCESS_TAG=haproxy-033e9c33-7065-4faf-8a4b-e2705c450c67', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/033e9c33-7065-4faf-8a4b-e2705c450c67.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.642 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268717.6418738, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.643 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Started (Lifecycle Event)
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.645 2 DEBUG nova.compute.manager [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.651 2 INFO nova.virt.libvirt.driver [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance running successfully.
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.651 2 DEBUG nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.682 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.686 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.713 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.714 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268717.642113, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.714 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Paused (Lifecycle Event)
Sep 30 21:45:17 compute-1 podman[244834]: 2025-09-30 21:45:17.734595501 +0000 UTC m=+0.060762315 container create 789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.743 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.747 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268717.6483972, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.747 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Resumed (Lifecycle Event)
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.767 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.769 2 INFO nova.compute.manager [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance to original state: 'active'
Sep 30 21:45:17 compute-1 systemd[1]: Started libpod-conmon-789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446.scope.
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.780 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:45:17 compute-1 podman[244834]: 2025-09-30 21:45:17.703006451 +0000 UTC m=+0.029173315 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.804 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:45:17 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:45:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c526a82aa8f310078154fc24bec8f743377057c3ba3ac8e618e8f46eed79ae0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:45:17 compute-1 podman[244834]: 2025-09-30 21:45:17.825166614 +0000 UTC m=+0.151333458 container init 789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:45:17 compute-1 podman[244834]: 2025-09-30 21:45:17.832358188 +0000 UTC m=+0.158525002 container start 789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:45:17 compute-1 podman[244851]: 2025-09-30 21:45:17.837092605 +0000 UTC m=+0.058686138 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:45:17 compute-1 podman[244847]: 2025-09-30 21:45:17.846093497 +0000 UTC m=+0.068823991 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:45:17 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [NOTICE]   (244909) : New worker (244919) forked
Sep 30 21:45:17 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [NOTICE]   (244909) : Loading success.
Sep 30 21:45:17 compute-1 podman[244852]: 2025-09-30 21:45:17.868086088 +0000 UTC m=+0.072203151 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.981 2 DEBUG nova.network.neutron [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated VIF entry in instance network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:17 compute-1 nova_compute[192795]: 2025-09-30 21:45:17.981 2 DEBUG nova.network.neutron [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:18 compute-1 nova_compute[192795]: 2025-09-30 21:45:18.003 2 DEBUG oslo_concurrency.lockutils [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:18 compute-1 nova_compute[192795]: 2025-09-30 21:45:18.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:18 compute-1 nova_compute[192795]: 2025-09-30 21:45:18.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:18 compute-1 nova_compute[192795]: 2025-09-30 21:45:18.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:45:18 compute-1 nova_compute[192795]: 2025-09-30 21:45:18.725 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:45:19 compute-1 nova_compute[192795]: 2025-09-30 21:45:19.444 2 DEBUG nova.compute.manager [req-d5c6fb77-07f5-486e-a876-280ebde62489 req-fd945947-e190-4308-a373-ce1dd011d6b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:19 compute-1 nova_compute[192795]: 2025-09-30 21:45:19.445 2 DEBUG oslo_concurrency.lockutils [req-d5c6fb77-07f5-486e-a876-280ebde62489 req-fd945947-e190-4308-a373-ce1dd011d6b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:19 compute-1 nova_compute[192795]: 2025-09-30 21:45:19.445 2 DEBUG oslo_concurrency.lockutils [req-d5c6fb77-07f5-486e-a876-280ebde62489 req-fd945947-e190-4308-a373-ce1dd011d6b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:19 compute-1 nova_compute[192795]: 2025-09-30 21:45:19.445 2 DEBUG oslo_concurrency.lockutils [req-d5c6fb77-07f5-486e-a876-280ebde62489 req-fd945947-e190-4308-a373-ce1dd011d6b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:19 compute-1 nova_compute[192795]: 2025-09-30 21:45:19.446 2 DEBUG nova.compute.manager [req-d5c6fb77-07f5-486e-a876-280ebde62489 req-fd945947-e190-4308-a373-ce1dd011d6b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:19 compute-1 nova_compute[192795]: 2025-09-30 21:45:19.446 2 WARNING nova.compute.manager [req-d5c6fb77-07f5-486e-a876-280ebde62489 req-fd945947-e190-4308-a373-ce1dd011d6b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state None.
Sep 30 21:45:21 compute-1 nova_compute[192795]: 2025-09-30 21:45:21.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:22 compute-1 nova_compute[192795]: 2025-09-30 21:45:22.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:23 compute-1 nova_compute[192795]: 2025-09-30 21:45:23.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:26 compute-1 nova_compute[192795]: 2025-09-30 21:45:26.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:27 compute-1 podman[244929]: 2025-09-30 21:45:27.205691019 +0000 UTC m=+0.053087128 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:45:28 compute-1 nova_compute[192795]: 2025-09-30 21:45:28.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:30 compute-1 ovn_controller[94902]: 2025-09-30T21:45:30Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:45:31 compute-1 nova_compute[192795]: 2025-09-30 21:45:31.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:33 compute-1 nova_compute[192795]: 2025-09-30 21:45:33.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:36 compute-1 podman[244961]: 2025-09-30 21:45:36.206077667 +0000 UTC m=+0.046287725 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:45:36 compute-1 podman[244959]: 2025-09-30 21:45:36.215159981 +0000 UTC m=+0.063118857 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:45:36 compute-1 podman[244960]: 2025-09-30 21:45:36.266157162 +0000 UTC m=+0.110137971 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Sep 30 21:45:36 compute-1 nova_compute[192795]: 2025-09-30 21:45:36.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:38 compute-1 nova_compute[192795]: 2025-09-30 21:45:38.408 2 INFO nova.compute.manager [None req-e7ff2b48-2238-4688-840f-7b61b45be131 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Get console output
Sep 30 21:45:38 compute-1 nova_compute[192795]: 2025-09-30 21:45:38.414 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:45:38 compute-1 nova_compute[192795]: 2025-09-30 21:45:38.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:38.703 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:38.704 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:38.705 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:41 compute-1 podman[245024]: 2025-09-30 21:45:41.224468675 +0000 UTC m=+0.068750039 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.333 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.334 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.334 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.334 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.334 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.353 2 INFO nova.compute.manager [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Terminating instance
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.363 2 DEBUG nova.compute.manager [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:45:41 compute-1 kernel: tapc00d1cce-57 (unregistering): left promiscuous mode
Sep 30 21:45:41 compute-1 NetworkManager[51724]: <info>  [1759268741.3975] device (tapc00d1cce-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 ovn_controller[94902]: 2025-09-30T21:45:41Z|00583|binding|INFO|Releasing lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 from this chassis (sb_readonly=0)
Sep 30 21:45:41 compute-1 ovn_controller[94902]: 2025-09-30T21:45:41Z|00584|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 down in Southbound
Sep 30 21:45:41 compute-1 ovn_controller[94902]: 2025-09-30T21:45:41Z|00585|binding|INFO|Removing iface tapc00d1cce-57 ovn-installed in OVS
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.415 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '14', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.417 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 unbound from our chassis
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.419 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 033e9c33-7065-4faf-8a4b-e2705c450c67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.420 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3c77c404-c033-4c36-b26b-84db91d44037]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.421 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 namespace which is not needed anymore
Sep 30 21:45:41 compute-1 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000091.scope: Deactivated successfully.
Sep 30 21:45:41 compute-1 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000091.scope: Consumed 13.661s CPU time.
Sep 30 21:45:41 compute-1 systemd-machined[152783]: Machine qemu-69-instance-00000091 terminated.
Sep 30 21:45:41 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [NOTICE]   (244909) : haproxy version is 2.8.14-c23fe91
Sep 30 21:45:41 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [NOTICE]   (244909) : path to executable is /usr/sbin/haproxy
Sep 30 21:45:41 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [WARNING]  (244909) : Exiting Master process...
Sep 30 21:45:41 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [WARNING]  (244909) : Exiting Master process...
Sep 30 21:45:41 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [ALERT]    (244909) : Current worker (244919) exited with code 143 (Terminated)
Sep 30 21:45:41 compute-1 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[244862]: [WARNING]  (244909) : All workers exited. Exiting... (0)
Sep 30 21:45:41 compute-1 systemd[1]: libpod-789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446.scope: Deactivated successfully.
Sep 30 21:45:41 compute-1 podman[245066]: 2025-09-30 21:45:41.570543826 +0000 UTC m=+0.045639488 container died 789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 21:45:41 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446-userdata-shm.mount: Deactivated successfully.
Sep 30 21:45:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-c526a82aa8f310078154fc24bec8f743377057c3ba3ac8e618e8f46eed79ae0a-merged.mount: Deactivated successfully.
Sep 30 21:45:41 compute-1 podman[245066]: 2025-09-30 21:45:41.613282914 +0000 UTC m=+0.088378566 container cleanup 789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:45:41 compute-1 systemd[1]: libpod-conmon-789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446.scope: Deactivated successfully.
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.621 2 DEBUG nova.compute.manager [req-83e30dd7-267b-414e-aa18-f34503484684 req-6e768dec-bc8b-4578-90eb-94ec076d4fff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.621 2 DEBUG oslo_concurrency.lockutils [req-83e30dd7-267b-414e-aa18-f34503484684 req-6e768dec-bc8b-4578-90eb-94ec076d4fff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.621 2 DEBUG oslo_concurrency.lockutils [req-83e30dd7-267b-414e-aa18-f34503484684 req-6e768dec-bc8b-4578-90eb-94ec076d4fff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.622 2 DEBUG oslo_concurrency.lockutils [req-83e30dd7-267b-414e-aa18-f34503484684 req-6e768dec-bc8b-4578-90eb-94ec076d4fff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.622 2 DEBUG nova.compute.manager [req-83e30dd7-267b-414e-aa18-f34503484684 req-6e768dec-bc8b-4578-90eb-94ec076d4fff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.622 2 DEBUG nova.compute.manager [req-83e30dd7-267b-414e-aa18-f34503484684 req-6e768dec-bc8b-4578-90eb-94ec076d4fff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.626 2 INFO nova.virt.libvirt.driver [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance destroyed successfully.
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.626 2 DEBUG nova.objects.instance [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.645 2 DEBUG nova.virt.libvirt.vif [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:45:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:45:17Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.645 2 DEBUG nova.network.os_vif_util [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.647 2 DEBUG nova.network.os_vif_util [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.647 2 DEBUG os_vif [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.649 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc00d1cce-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.654 2 INFO os_vif [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57')
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.654 2 INFO nova.virt.libvirt.driver [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Deleting instance files /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_del
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.660 2 INFO nova.virt.libvirt.driver [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Deletion of /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_del complete
Sep 30 21:45:41 compute-1 podman[245114]: 2025-09-30 21:45:41.680250004 +0000 UTC m=+0.043982462 container remove 789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.686 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd67e19-84d9-4242-82a6-3cfe57b806dc]: (4, ('Tue Sep 30 09:45:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 (789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446)\n789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446\nTue Sep 30 09:45:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 (789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446)\n789f87c74a8ddf0939ecf73d60506694aa78a53d8d7c38d7f275f9720e32f446\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.688 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[efd8972d-9aa5-4d9c-ac52-591a1d6331b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.689 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap033e9c33-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 kernel: tap033e9c33-70: left promiscuous mode
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.724 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d0db75-65f4-49e7-940a-f5deddaa31f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.734 2 INFO nova.compute.manager [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.735 2 DEBUG oslo.service.loopingcall [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.736 2 DEBUG nova.compute.manager [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.736 2 DEBUG nova.network.neutron [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.761 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8637ecdc-c9b1-45ad-958d-dc6dfde9e71b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.763 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e53b752d-0205-4905-a1c3-44c719bd9a77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.778 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c697e623-1761-4a0f-a862-c17abe87bbe1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541268, 'reachable_time': 38625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245130, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.781 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:45:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:41.782 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[3f5763bf-dbc9-4cb3-a11c-895637fb4831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:41 compute-1 systemd[1]: run-netns-ovnmeta\x2d033e9c33\x2d7065\x2d4faf\x2d8a4b\x2de2705c450c67.mount: Deactivated successfully.
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.808 2 DEBUG nova.compute.manager [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.808 2 DEBUG nova.compute.manager [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing instance network info cache due to event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.809 2 DEBUG oslo_concurrency.lockutils [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.809 2 DEBUG oslo_concurrency.lockutils [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:41 compute-1 nova_compute[192795]: 2025-09-30 21:45:41.810 2 DEBUG nova.network.neutron [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.667 2 DEBUG nova.network.neutron [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.763 2 DEBUG nova.compute.manager [req-972e7778-3bdb-4784-b6f4-7087ec62945a req-b70d8004-6c9e-49d6-8335-02b045c3e951 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-deleted-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.764 2 INFO nova.compute.manager [req-972e7778-3bdb-4784-b6f4-7087ec62945a req-b70d8004-6c9e-49d6-8335-02b045c3e951 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Neutron deleted interface c00d1cce-5707-41ba-9ca0-2aeecbf662d8; detaching it from the instance and deleting it from the info cache
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.764 2 DEBUG nova.network.neutron [req-972e7778-3bdb-4784-b6f4-7087ec62945a req-b70d8004-6c9e-49d6-8335-02b045c3e951 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.766 2 INFO nova.compute.manager [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Took 1.03 seconds to deallocate network for instance.
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.792 2 DEBUG nova.compute.manager [req-972e7778-3bdb-4784-b6f4-7087ec62945a req-b70d8004-6c9e-49d6-8335-02b045c3e951 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Detach interface failed, port_id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8, reason: Instance 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.821 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.822 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.822 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.822 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.822 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.845 2 INFO nova.compute.manager [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Terminating instance
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.859 2 DEBUG nova.compute.manager [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:45:42 compute-1 kernel: tap4bdd817d-42 (unregistering): left promiscuous mode
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.884 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:42 compute-1 NetworkManager[51724]: <info>  [1759268742.8850] device (tap4bdd817d-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.885 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:42 compute-1 ovn_controller[94902]: 2025-09-30T21:45:42Z|00586|binding|INFO|Releasing lport 4bdd817d-4233-447a-a80b-3afc021a638a from this chassis (sb_readonly=0)
Sep 30 21:45:42 compute-1 ovn_controller[94902]: 2025-09-30T21:45:42Z|00587|binding|INFO|Setting lport 4bdd817d-4233-447a-a80b-3afc021a638a down in Southbound
Sep 30 21:45:42 compute-1 ovn_controller[94902]: 2025-09-30T21:45:42Z|00588|binding|INFO|Removing iface tap4bdd817d-42 ovn-installed in OVS
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:42.932 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:b2:a2 10.100.0.13'], port_security=['fa:16:3e:8e:b2:a2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f5164a8d-e5aa-4bb7-9075-73debca4d516', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02e17e31-bee3-4214-8c0d-d336d8499304 9c7bc5b7-aa16-4110-aeee-11b59189f128', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea4dfefc-3776-4359-bdd5-ef1ed99a61d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=4bdd817d-4233-447a-a80b-3afc021a638a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:42.933 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 4bdd817d-4233-447a-a80b-3afc021a638a in datapath cad5d4b4-0147-4d5b-8e82-ad8835d4110a unbound from our chassis
Sep 30 21:45:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:42.935 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cad5d4b4-0147-4d5b-8e82-ad8835d4110a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:42.936 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f36bd33c-e6a0-4667-a2d6-8e49c8c8650e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:42.936 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a namespace which is not needed anymore
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:42 compute-1 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000093.scope: Deactivated successfully.
Sep 30 21:45:42 compute-1 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000093.scope: Consumed 15.912s CPU time.
Sep 30 21:45:42 compute-1 systemd-machined[152783]: Machine qemu-68-instance-00000093 terminated.
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.964 2 DEBUG nova.compute.manager [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-changed-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.964 2 DEBUG nova.compute.manager [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Refreshing instance network info cache due to event network-changed-4bdd817d-4233-447a-a80b-3afc021a638a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.964 2 DEBUG oslo_concurrency.lockutils [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.964 2 DEBUG oslo_concurrency.lockutils [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:42 compute-1 nova_compute[192795]: 2025-09-30 21:45:42.964 2 DEBUG nova.network.neutron [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Refreshing network info cache for port 4bdd817d-4233-447a-a80b-3afc021a638a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:43 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [NOTICE]   (244432) : haproxy version is 2.8.14-c23fe91
Sep 30 21:45:43 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [NOTICE]   (244432) : path to executable is /usr/sbin/haproxy
Sep 30 21:45:43 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [WARNING]  (244432) : Exiting Master process...
Sep 30 21:45:43 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [WARNING]  (244432) : Exiting Master process...
Sep 30 21:45:43 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [ALERT]    (244432) : Current worker (244434) exited with code 143 (Terminated)
Sep 30 21:45:43 compute-1 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[244428]: [WARNING]  (244432) : All workers exited. Exiting... (0)
Sep 30 21:45:43 compute-1 systemd[1]: libpod-20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f.scope: Deactivated successfully.
Sep 30 21:45:43 compute-1 podman[245153]: 2025-09-30 21:45:43.067251472 +0000 UTC m=+0.047807385 container died 20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:45:43 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f-userdata-shm.mount: Deactivated successfully.
Sep 30 21:45:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-907e9edb2952446468daf1ef0f4effe456fb57d51374fccd9899efcc3e2743d7-merged.mount: Deactivated successfully.
Sep 30 21:45:43 compute-1 podman[245153]: 2025-09-30 21:45:43.105814838 +0000 UTC m=+0.086370731 container cleanup 20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:45:43 compute-1 systemd[1]: libpod-conmon-20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f.scope: Deactivated successfully.
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.124 2 INFO nova.virt.libvirt.driver [-] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Instance destroyed successfully.
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.126 2 DEBUG nova.objects.instance [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'resources' on Instance uuid f5164a8d-e5aa-4bb7-9075-73debca4d516 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.167 2 DEBUG nova.compute.provider_tree [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:43 compute-1 podman[245198]: 2025-09-30 21:45:43.16798623 +0000 UTC m=+0.038191408 container remove 20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.175 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d36a917d-826c-4364-81c6-dcd55ac7cacd]: (4, ('Tue Sep 30 09:45:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a (20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f)\n20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f\nTue Sep 30 09:45:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a (20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f)\n20f397aa5773ee14fe6f22eef07f04d1aefea00f7aec28852d2af24cd449cd0f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.177 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[30c9dc39-ccbb-4636-97b3-f3d12bf1c7ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.178 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcad5d4b4-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.186 2 DEBUG nova.virt.libvirt.vif [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-675813737',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=147,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1/fVTmGYuOOwxSGbH9N8TvZbqR3p/LXcDidDQZkj1x/r1cncivgRIPmN5OLuNtDCDFO3TM2NYeeoJUUNE27RLr42R4vAhyNzhABLK9jtr/4dNw2wyPqxeNtjp2/KxusQ==',key_name='tempest-TestSecurityGroupsBasicOps-1980363703',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:44:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-fmqlieta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:44:43Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=f5164a8d-e5aa-4bb7-9075-73debca4d516,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.187 2 DEBUG nova.network.os_vif_util [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.187 2 DEBUG nova.network.os_vif_util [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:b2:a2,bridge_name='br-int',has_traffic_filtering=True,id=4bdd817d-4233-447a-a80b-3afc021a638a,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bdd817d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.187 2 DEBUG os_vif [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:b2:a2,bridge_name='br-int',has_traffic_filtering=True,id=4bdd817d-4233-447a-a80b-3afc021a638a,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bdd817d-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bdd817d-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:43 compute-1 kernel: tapcad5d4b4-00: left promiscuous mode
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.192 2 DEBUG nova.scheduler.client.report [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.200 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ab211479-e514-4feb-b3cf-330c41dfdf34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.202 2 INFO os_vif [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:b2:a2,bridge_name='br-int',has_traffic_filtering=True,id=4bdd817d-4233-447a-a80b-3afc021a638a,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bdd817d-42')
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.202 2 INFO nova.virt.libvirt.driver [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Deleting instance files /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516_del
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.203 2 INFO nova.virt.libvirt.driver [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Deletion of /var/lib/nova/instances/f5164a8d-e5aa-4bb7-9075-73debca4d516_del complete
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.230 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[85558789-6c99-46f7-83bb-b83025de3957]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.232 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3b1a6f-de46-4c2d-8aae-ed921a708006]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.247 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[28e6bfa7-30f4-4378-9475-cbee25ab8fe7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537748, 'reachable_time': 28565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245215, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.249 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:45:43 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:43.249 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[699fd005-ad54-43e6-8d07-5e8776f5ef76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:43 compute-1 systemd[1]: run-netns-ovnmeta\x2dcad5d4b4\x2d0147\x2d4d5b\x2d8e82\x2dad8835d4110a.mount: Deactivated successfully.
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.290 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.370 2 INFO nova.scheduler.client.report [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocations for instance 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.396 2 INFO nova.compute.manager [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Took 0.54 seconds to destroy the instance on the hypervisor.
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.397 2 DEBUG oslo.service.loopingcall [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.397 2 DEBUG nova.compute.manager [-] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.397 2 DEBUG nova.network.neutron [-] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.519 2 DEBUG oslo_concurrency.lockutils [None req-458b3525-e5f5-48c0-b49f-bde63e7b8349 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.900 2 DEBUG nova.compute.manager [req-5db1e8e4-65e0-44a2-b048-f985811f5084 req-fec5bc04-62af-4e66-bb89-de6b6f57ead8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.900 2 DEBUG oslo_concurrency.lockutils [req-5db1e8e4-65e0-44a2-b048-f985811f5084 req-fec5bc04-62af-4e66-bb89-de6b6f57ead8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.900 2 DEBUG oslo_concurrency.lockutils [req-5db1e8e4-65e0-44a2-b048-f985811f5084 req-fec5bc04-62af-4e66-bb89-de6b6f57ead8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.901 2 DEBUG oslo_concurrency.lockutils [req-5db1e8e4-65e0-44a2-b048-f985811f5084 req-fec5bc04-62af-4e66-bb89-de6b6f57ead8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.901 2 DEBUG nova.compute.manager [req-5db1e8e4-65e0-44a2-b048-f985811f5084 req-fec5bc04-62af-4e66-bb89-de6b6f57ead8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.901 2 WARNING nova.compute.manager [req-5db1e8e4-65e0-44a2-b048-f985811f5084 req-fec5bc04-62af-4e66-bb89-de6b6f57ead8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state deleted and task_state None.
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.961 2 DEBUG nova.network.neutron [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated VIF entry in instance network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.962 2 DEBUG nova.network.neutron [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:43 compute-1 nova_compute[192795]: 2025-09-30 21:45:43.997 2 DEBUG oslo_concurrency.lockutils [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.788 2 DEBUG nova.network.neutron [-] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.821 2 INFO nova.compute.manager [-] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Took 1.42 seconds to deallocate network for instance.
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.858 2 DEBUG nova.compute.manager [req-6335dc88-515f-4aa7-a91e-8a6bcec4a35d req-2e8d9d33-9ade-4c97-aa6a-d5a47f301f67 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-vif-deleted-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.902 2 DEBUG nova.compute.manager [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-vif-unplugged-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.902 2 DEBUG oslo_concurrency.lockutils [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.902 2 DEBUG oslo_concurrency.lockutils [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.903 2 DEBUG oslo_concurrency.lockutils [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.903 2 DEBUG nova.compute.manager [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] No waiting events found dispatching network-vif-unplugged-4bdd817d-4233-447a-a80b-3afc021a638a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.903 2 DEBUG nova.compute.manager [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-vif-unplugged-4bdd817d-4233-447a-a80b-3afc021a638a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.903 2 DEBUG nova.compute.manager [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received event network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.904 2 DEBUG oslo_concurrency.lockutils [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.904 2 DEBUG oslo_concurrency.lockutils [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.904 2 DEBUG oslo_concurrency.lockutils [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.904 2 DEBUG nova.compute.manager [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] No waiting events found dispatching network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:44 compute-1 nova_compute[192795]: 2025-09-30 21:45:44.904 2 WARNING nova.compute.manager [req-c144fe7c-02b2-43b2-8382-86d974dd2486 req-ad1ffe67-bc68-49b7-a730-09ed280d96d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Received unexpected event network-vif-plugged-4bdd817d-4233-447a-a80b-3afc021a638a for instance with vm_state active and task_state deleting.
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.010 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.011 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.054 2 DEBUG nova.compute.provider_tree [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.081 2 DEBUG nova.scheduler.client.report [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.102 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.139 2 INFO nova.scheduler.client.report [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Deleted allocations for instance f5164a8d-e5aa-4bb7-9075-73debca4d516
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.266 2 DEBUG oslo_concurrency.lockutils [None req-e943ba4d-22a5-4ac2-94ec-368d8af0692d c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "f5164a8d-e5aa-4bb7-9075-73debca4d516" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.346 2 DEBUG nova.network.neutron [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updated VIF entry in instance network info cache for port 4bdd817d-4233-447a-a80b-3afc021a638a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.347 2 DEBUG nova.network.neutron [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Updating instance_info_cache with network_info: [{"id": "4bdd817d-4233-447a-a80b-3afc021a638a", "address": "fa:16:3e:8e:b2:a2", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bdd817d-42", "ovs_interfaceid": "4bdd817d-4233-447a-a80b-3afc021a638a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:45 compute-1 nova_compute[192795]: 2025-09-30 21:45:45.468 2 DEBUG oslo_concurrency.lockutils [req-9df13b65-e37a-4fdf-9cc7-3f7501158352 req-399867ef-ea94-4655-84d8-089f3cfc5daa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f5164a8d-e5aa-4bb7-9075-73debca4d516" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:46.947 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:46 compute-1 nova_compute[192795]: 2025-09-30 21:45:46.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:46 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:46.949 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:45:47 compute-1 nova_compute[192795]: 2025-09-30 21:45:47.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:47 compute-1 nova_compute[192795]: 2025-09-30 21:45:47.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:48 compute-1 nova_compute[192795]: 2025-09-30 21:45:48.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:48 compute-1 podman[245217]: 2025-09-30 21:45:48.223923016 +0000 UTC m=+0.069719675 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Sep 30 21:45:48 compute-1 podman[245219]: 2025-09-30 21:45:48.245056254 +0000 UTC m=+0.081960935 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Sep 30 21:45:48 compute-1 podman[245218]: 2025-09-30 21:45:48.247281963 +0000 UTC m=+0.088489129 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:45:48 compute-1 nova_compute[192795]: 2025-09-30 21:45:48.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:45:50.952 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:53 compute-1 nova_compute[192795]: 2025-09-30 21:45:53.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:53 compute-1 nova_compute[192795]: 2025-09-30 21:45:53.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:56 compute-1 nova_compute[192795]: 2025-09-30 21:45:56.625 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268741.6245568, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:56 compute-1 nova_compute[192795]: 2025-09-30 21:45:56.626 2 INFO nova.compute.manager [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Stopped (Lifecycle Event)
Sep 30 21:45:56 compute-1 nova_compute[192795]: 2025-09-30 21:45:56.744 2 DEBUG nova.compute.manager [None req-9f31e3f4-3b5d-4cb7-8e13-f66275452291 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.705 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.705 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.737 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.738 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:45:57 compute-1 podman[245281]: 2025-09-30 21:45:57.868038354 +0000 UTC m=+0.070530617 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.902 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.903 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5709MB free_disk=73.30108261108398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.903 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.904 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.974 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.974 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:45:57 compute-1 nova_compute[192795]: 2025-09-30 21:45:57.992 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.005 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.023 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.024 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.123 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268743.1217513, f5164a8d-e5aa-4bb7-9075-73debca4d516 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.123 2 INFO nova.compute.manager [-] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] VM Stopped (Lifecycle Event)
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.140 2 DEBUG nova.compute.manager [None req-b84ae11d-2768-4509-886e-23dfdeef9757 - - - - - -] [instance: f5164a8d-e5aa-4bb7-9075-73debca4d516] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:58 compute-1 nova_compute[192795]: 2025-09-30 21:45:58.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:00 compute-1 nova_compute[192795]: 2025-09-30 21:46:00.011 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:00 compute-1 nova_compute[192795]: 2025-09-30 21:46:00.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:00 compute-1 nova_compute[192795]: 2025-09-30 21:46:00.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:46:01 compute-1 nova_compute[192795]: 2025-09-30 21:46:01.973 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:01 compute-1 nova_compute[192795]: 2025-09-30 21:46:01.974 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:01 compute-1 nova_compute[192795]: 2025-09-30 21:46:01.988 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.091 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.091 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.097 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.097 2 INFO nova.compute.claims [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.236 2 DEBUG nova.compute.provider_tree [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.249 2 DEBUG nova.scheduler.client.report [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.269 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.270 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.312 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.312 2 DEBUG nova.network.neutron [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.324 2 INFO nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.338 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.424 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.426 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.426 2 INFO nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Creating image(s)
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.427 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "/var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.427 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.428 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.440 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.515 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.516 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.517 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.529 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.587 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.588 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.610 2 DEBUG nova.policy [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.625 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.626 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.626 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.691 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.692 2 DEBUG nova.virt.disk.api [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Checking if we can resize image /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.693 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.746 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.747 2 DEBUG nova.virt.disk.api [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Cannot resize image /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.748 2 DEBUG nova.objects.instance [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'migration_context' on Instance uuid d2339d12-50dd-4f1a-9ae0-62f630de86dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.763 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.764 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Ensure instance console log exists: /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.764 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.765 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:02 compute-1 nova_compute[192795]: 2025-09-30 21:46:02.765 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:03 compute-1 nova_compute[192795]: 2025-09-30 21:46:03.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:03 compute-1 nova_compute[192795]: 2025-09-30 21:46:03.533 2 DEBUG nova.network.neutron [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Successfully created port: e1340afc-6c56-42ae-947b-30329a3d37bb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:46:03 compute-1 nova_compute[192795]: 2025-09-30 21:46:03.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.561 2 DEBUG nova.network.neutron [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Successfully updated port: e1340afc-6c56-42ae-947b-30329a3d37bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.578 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.579 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquired lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.579 2 DEBUG nova.network.neutron [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.690 2 DEBUG nova.compute.manager [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-changed-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.690 2 DEBUG nova.compute.manager [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Refreshing instance network info cache due to event network-changed-e1340afc-6c56-42ae-947b-30329a3d37bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.690 2 DEBUG oslo_concurrency.lockutils [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:04 compute-1 nova_compute[192795]: 2025-09-30 21:46:04.761 2 DEBUG nova.network.neutron [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:46:05 compute-1 nova_compute[192795]: 2025-09-30 21:46:05.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.121 2 DEBUG nova.network.neutron [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updating instance_info_cache with network_info: [{"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.148 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Releasing lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.149 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Instance network_info: |[{"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.149 2 DEBUG oslo_concurrency.lockutils [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.150 2 DEBUG nova.network.neutron [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Refreshing network info cache for port e1340afc-6c56-42ae-947b-30329a3d37bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.154 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Start _get_guest_xml network_info=[{"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.164 2 WARNING nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.178 2 DEBUG nova.virt.libvirt.host [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.180 2 DEBUG nova.virt.libvirt.host [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.184 2 DEBUG nova.virt.libvirt.host [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.185 2 DEBUG nova.virt.libvirt.host [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.186 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.187 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.187 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.187 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.188 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.188 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.188 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.189 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.189 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.189 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.189 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.190 2 DEBUG nova.virt.hardware [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.194 2 DEBUG nova.virt.libvirt.vif [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:46:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1307223704',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1307223704',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=151,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH3ScEbiEXmcush+M0FQmOZ+6rNVimr/3Q2KBWRu3bFnwZQjf2et8DcC4gPSfxdcIivp4cpu6A4+jivTW6/B6vuApjM4LJ1qfMVZ9mXCxg5nOjRfTE7oMZoXxkel1KnKAw==',key_name='tempest-TestSecurityGroupsBasicOps-1978918738',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-0x3a985k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:46:02Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=d2339d12-50dd-4f1a-9ae0-62f630de86dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.195 2 DEBUG nova.network.os_vif_util [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.196 2 DEBUG nova.network.os_vif_util [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:74:62,bridge_name='br-int',has_traffic_filtering=True,id=e1340afc-6c56-42ae-947b-30329a3d37bb,network=Network(4222e136-df5f-4a10-8189-ca81337b231e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1340afc-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.197 2 DEBUG nova.objects.instance [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid d2339d12-50dd-4f1a-9ae0-62f630de86dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.212 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <uuid>d2339d12-50dd-4f1a-9ae0-62f630de86dd</uuid>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <name>instance-00000097</name>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1307223704</nova:name>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:46:06</nova:creationTime>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:user uuid="c33a752ef8234bba917ace1e73763490">tempest-TestSecurityGroupsBasicOps-2108116341-project-member</nova:user>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:project uuid="1ff42902541948f7a6df344fac87c2b7">tempest-TestSecurityGroupsBasicOps-2108116341</nova:project>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         <nova:port uuid="e1340afc-6c56-42ae-947b-30329a3d37bb">
Sep 30 21:46:06 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <system>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <entry name="serial">d2339d12-50dd-4f1a-9ae0-62f630de86dd</entry>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <entry name="uuid">d2339d12-50dd-4f1a-9ae0-62f630de86dd</entry>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </system>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <os>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   </os>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <features>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   </features>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk.config"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:ba:74:62"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <target dev="tape1340afc-6c"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/console.log" append="off"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <video>
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </video>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:46:06 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:46:06 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:46:06 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:46:06 compute-1 nova_compute[192795]: </domain>
Sep 30 21:46:06 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.213 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Preparing to wait for external event network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.214 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.214 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.215 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.215 2 DEBUG nova.virt.libvirt.vif [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:46:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1307223704',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1307223704',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=151,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH3ScEbiEXmcush+M0FQmOZ+6rNVimr/3Q2KBWRu3bFnwZQjf2et8DcC4gPSfxdcIivp4cpu6A4+jivTW6/B6vuApjM4LJ1qfMVZ9mXCxg5nOjRfTE7oMZoXxkel1KnKAw==',key_name='tempest-TestSecurityGroupsBasicOps-1978918738',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-0x3a985k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:46:02Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=d2339d12-50dd-4f1a-9ae0-62f630de86dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.216 2 DEBUG nova.network.os_vif_util [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.216 2 DEBUG nova.network.os_vif_util [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:74:62,bridge_name='br-int',has_traffic_filtering=True,id=e1340afc-6c56-42ae-947b-30329a3d37bb,network=Network(4222e136-df5f-4a10-8189-ca81337b231e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1340afc-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.217 2 DEBUG os_vif [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:74:62,bridge_name='br-int',has_traffic_filtering=True,id=e1340afc-6c56-42ae-947b-30329a3d37bb,network=Network(4222e136-df5f-4a10-8189-ca81337b231e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1340afc-6c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.218 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.218 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.221 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1340afc-6c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.221 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape1340afc-6c, col_values=(('external_ids', {'iface-id': 'e1340afc-6c56-42ae-947b-30329a3d37bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:74:62', 'vm-uuid': 'd2339d12-50dd-4f1a-9ae0-62f630de86dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:06 compute-1 NetworkManager[51724]: <info>  [1759268766.2244] manager: (tape1340afc-6c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.228 2 INFO os_vif [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:74:62,bridge_name='br-int',has_traffic_filtering=True,id=e1340afc-6c56-42ae-947b-30329a3d37bb,network=Network(4222e136-df5f-4a10-8189-ca81337b231e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1340afc-6c')
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.280 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.281 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.281 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No VIF found with MAC fa:16:3e:ba:74:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.281 2 INFO nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Using config drive
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.978 2 INFO nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Creating config drive at /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk.config
Sep 30 21:46:06 compute-1 nova_compute[192795]: 2025-09-30 21:46:06.984 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvjp_7hv9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.111 2 DEBUG oslo_concurrency.processutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvjp_7hv9" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:07 compute-1 kernel: tape1340afc-6c: entered promiscuous mode
Sep 30 21:46:07 compute-1 NetworkManager[51724]: <info>  [1759268767.2065] manager: (tape1340afc-6c): new Tun device (/org/freedesktop/NetworkManager/Devices/290)
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-1 ovn_controller[94902]: 2025-09-30T21:46:07Z|00589|binding|INFO|Claiming lport e1340afc-6c56-42ae-947b-30329a3d37bb for this chassis.
Sep 30 21:46:07 compute-1 ovn_controller[94902]: 2025-09-30T21:46:07Z|00590|binding|INFO|e1340afc-6c56-42ae-947b-30329a3d37bb: Claiming fa:16:3e:ba:74:62 10.100.0.4
Sep 30 21:46:07 compute-1 systemd-udevd[245390]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.244 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:74:62 10.100.0.4'], port_security=['fa:16:3e:ba:74:62 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd2339d12-50dd-4f1a-9ae0-62f630de86dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4222e136-df5f-4a10-8189-ca81337b231e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '489b8320-df45-412b-aa25-82c9f2ade333 b0d6122c-cf3e-4e14-83fa-d5e76f9ca4ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2a57ede-7e02-42c9-b7e1-016d9d117d53, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=e1340afc-6c56-42ae-947b-30329a3d37bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.245 103861 INFO neutron.agent.ovn.metadata.agent [-] Port e1340afc-6c56-42ae-947b-30329a3d37bb in datapath 4222e136-df5f-4a10-8189-ca81337b231e bound to our chassis
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.247 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4222e136-df5f-4a10-8189-ca81337b231e
Sep 30 21:46:07 compute-1 podman[245331]: 2025-09-30 21:46:07.250121766 +0000 UTC m=+0.066545249 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:46:07 compute-1 NetworkManager[51724]: <info>  [1759268767.2658] device (tape1340afc-6c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:46:07 compute-1 NetworkManager[51724]: <info>  [1759268767.2665] device (tape1340afc-6c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:46:07 compute-1 systemd-machined[152783]: New machine qemu-70-instance-00000097.
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.270 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3c901131-6f5d-4e45-ac76-f3bca32abd46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.271 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4222e136-d1 in ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:46:07 compute-1 ovn_controller[94902]: 2025-09-30T21:46:07Z|00591|binding|INFO|Setting lport e1340afc-6c56-42ae-947b-30329a3d37bb ovn-installed in OVS
Sep 30 21:46:07 compute-1 ovn_controller[94902]: 2025-09-30T21:46:07Z|00592|binding|INFO|Setting lport e1340afc-6c56-42ae-947b-30329a3d37bb up in Southbound
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.273 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4222e136-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.273 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4b824b4f-2625-4594-978f-d0aefffb12db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.275 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b86fbbc7-314c-4ea1-92e4-3134568edbc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 podman[245326]: 2025-09-30 21:46:07.276664199 +0000 UTC m=+0.105267600 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 21:46:07 compute-1 systemd[1]: Started Virtual Machine qemu-70-instance-00000097.
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.290 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[039603df-23aa-4259-bc56-1e9978c900e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 podman[245330]: 2025-09-30 21:46:07.301443235 +0000 UTC m=+0.123984773 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.313 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[96523566-b395-4806-af36-59f565444ebd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.342 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ff3eac-adaf-4000-97b8-e0816b43d62e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 NetworkManager[51724]: <info>  [1759268767.3496] manager: (tap4222e136-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/291)
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.348 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b466c25b-6ede-4b73-a742-8a7bb900f984]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.384 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6c092253-a8bd-472a-ba1d-267e96ddcaff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.387 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[70044af3-cc1f-4c31-9c91-f5515239c733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 NetworkManager[51724]: <info>  [1759268767.4087] device (tap4222e136-d0): carrier: link connected
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.414 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5f6ed8-0981-46a7-8cab-819c02b29ab7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.430 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[eb97d87f-e6cf-4970-ae81-83850675e190]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4222e136-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:c2:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546308, 'reachable_time': 34453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245437, 'error': None, 'target': 'ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.446 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[db178af7-3603-4e2c-b9ff-100b950929c4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:c2ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546308, 'tstamp': 546308}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245438, 'error': None, 'target': 'ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.465 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ef35a6-8e68-443e-8092-2da871bc3034]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4222e136-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:c2:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546308, 'reachable_time': 34453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245439, 'error': None, 'target': 'ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.503 2 DEBUG nova.compute.manager [req-607a1cf2-0985-4f9c-b784-2bcfd952774f req-3655aa2f-347e-4416-9548-afb9d6686c62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.504 2 DEBUG oslo_concurrency.lockutils [req-607a1cf2-0985-4f9c-b784-2bcfd952774f req-3655aa2f-347e-4416-9548-afb9d6686c62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.504 2 DEBUG oslo_concurrency.lockutils [req-607a1cf2-0985-4f9c-b784-2bcfd952774f req-3655aa2f-347e-4416-9548-afb9d6686c62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.505 2 DEBUG oslo_concurrency.lockutils [req-607a1cf2-0985-4f9c-b784-2bcfd952774f req-3655aa2f-347e-4416-9548-afb9d6686c62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.505 2 DEBUG nova.compute.manager [req-607a1cf2-0985-4f9c-b784-2bcfd952774f req-3655aa2f-347e-4416-9548-afb9d6686c62 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Processing event network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.506 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[503c70c4-27e8-4f6a-ab14-2466e3d752ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.588 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[35d4e095-7cb3-4436-b87f-f66da5fed3f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.590 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4222e136-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.590 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.590 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4222e136-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-1 kernel: tap4222e136-d0: entered promiscuous mode
Sep 30 21:46:07 compute-1 NetworkManager[51724]: <info>  [1759268767.5937] manager: (tap4222e136-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.596 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4222e136-d0, col_values=(('external_ids', {'iface-id': '74e1338e-b229-4f08-b752-39792fa7a6b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-1 ovn_controller[94902]: 2025-09-30T21:46:07Z|00593|binding|INFO|Releasing lport 74e1338e-b229-4f08-b752-39792fa7a6b3 from this chassis (sb_readonly=0)
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.600 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4222e136-df5f-4a10-8189-ca81337b231e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4222e136-df5f-4a10-8189-ca81337b231e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.601 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c6284e05-51fa-4780-a563-4cc2a6db78f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.602 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-4222e136-df5f-4a10-8189-ca81337b231e
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/4222e136-df5f-4a10-8189-ca81337b231e.pid.haproxy
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 4222e136-df5f-4a10-8189-ca81337b231e
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:46:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:07.603 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e', 'env', 'PROCESS_TAG=haproxy-4222e136-df5f-4a10-8189-ca81337b231e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4222e136-df5f-4a10-8189-ca81337b231e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:46:07 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-1 podman[245478]: 2025-09-30 21:46:07.990494785 +0000 UTC m=+0.053938761 container create e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:07.999 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.000 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268767.998532, d2339d12-50dd-4f1a-9ae0-62f630de86dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.000 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] VM Started (Lifecycle Event)
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.003 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.006 2 INFO nova.virt.libvirt.driver [-] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Instance spawned successfully.
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.006 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.024 2 DEBUG nova.network.neutron [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updated VIF entry in instance network info cache for port e1340afc-6c56-42ae-947b-30329a3d37bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.025 2 DEBUG nova.network.neutron [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updating instance_info_cache with network_info: [{"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:08 compute-1 systemd[1]: Started libpod-conmon-e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030.scope.
Sep 30 21:46:08 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:46:08 compute-1 podman[245478]: 2025-09-30 21:46:07.958353051 +0000 UTC m=+0.021797037 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:46:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a98575c6c6948a35d62e30c8539297ba432f88d5b66ce3205625a6cd2ae12534/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:46:08 compute-1 podman[245478]: 2025-09-30 21:46:08.069138648 +0000 UTC m=+0.132582644 container init e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:46:08 compute-1 podman[245478]: 2025-09-30 21:46:08.075375146 +0000 UTC m=+0.138819112 container start e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:46:08 compute-1 neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e[245492]: [NOTICE]   (245496) : New worker (245498) forked
Sep 30 21:46:08 compute-1 neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e[245492]: [NOTICE]   (245496) : Loading success.
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.271 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.277 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.311 2 DEBUG oslo_concurrency.lockutils [req-22484b48-38b5-49b1-bcdc-c208de9d469e req-68013057-d7ba-4261-9c6d-433cff5d041d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.316 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.316 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.317 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.317 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.318 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.318 2 DEBUG nova.virt.libvirt.driver [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.403 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.404 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268768.0001013, d2339d12-50dd-4f1a-9ae0-62f630de86dd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.404 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] VM Paused (Lifecycle Event)
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.447 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.451 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268768.0028305, d2339d12-50dd-4f1a-9ae0-62f630de86dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.452 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] VM Resumed (Lifecycle Event)
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.472 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.478 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.497 2 INFO nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Took 6.07 seconds to spawn the instance on the hypervisor.
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.499 2 DEBUG nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.509 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.580 2 INFO nova.compute.manager [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Took 6.53 seconds to build instance.
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.605 2 DEBUG oslo_concurrency.lockutils [None req-1b3a9116-81e7-4c37-a3d2-0058b2854f2e c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:08 compute-1 nova_compute[192795]: 2025-09-30 21:46:08.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:09 compute-1 nova_compute[192795]: 2025-09-30 21:46:09.599 2 DEBUG nova.compute.manager [req-6c5cbeda-ace0-4ad7-a598-23765ad2bc8d req-ede87035-e52d-46e5-87c3-32a5d151af82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:09 compute-1 nova_compute[192795]: 2025-09-30 21:46:09.600 2 DEBUG oslo_concurrency.lockutils [req-6c5cbeda-ace0-4ad7-a598-23765ad2bc8d req-ede87035-e52d-46e5-87c3-32a5d151af82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:09 compute-1 nova_compute[192795]: 2025-09-30 21:46:09.600 2 DEBUG oslo_concurrency.lockutils [req-6c5cbeda-ace0-4ad7-a598-23765ad2bc8d req-ede87035-e52d-46e5-87c3-32a5d151af82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:09 compute-1 nova_compute[192795]: 2025-09-30 21:46:09.600 2 DEBUG oslo_concurrency.lockutils [req-6c5cbeda-ace0-4ad7-a598-23765ad2bc8d req-ede87035-e52d-46e5-87c3-32a5d151af82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:09 compute-1 nova_compute[192795]: 2025-09-30 21:46:09.601 2 DEBUG nova.compute.manager [req-6c5cbeda-ace0-4ad7-a598-23765ad2bc8d req-ede87035-e52d-46e5-87c3-32a5d151af82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] No waiting events found dispatching network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:09 compute-1 nova_compute[192795]: 2025-09-30 21:46:09.601 2 WARNING nova.compute.manager [req-6c5cbeda-ace0-4ad7-a598-23765ad2bc8d req-ede87035-e52d-46e5-87c3-32a5d151af82 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received unexpected event network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb for instance with vm_state active and task_state None.
Sep 30 21:46:11 compute-1 nova_compute[192795]: 2025-09-30 21:46:11.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:11 compute-1 nova_compute[192795]: 2025-09-30 21:46:11.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:12 compute-1 podman[245507]: 2025-09-30 21:46:12.248936877 +0000 UTC m=+0.091676256 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:46:12 compute-1 NetworkManager[51724]: <info>  [1759268772.5909] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Sep 30 21:46:12 compute-1 NetworkManager[51724]: <info>  [1759268772.5919] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:12 compute-1 ovn_controller[94902]: 2025-09-30T21:46:12Z|00594|binding|INFO|Releasing lport 74e1338e-b229-4f08-b752-39792fa7a6b3 from this chassis (sb_readonly=0)
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.763 2 DEBUG nova.compute.manager [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-changed-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.764 2 DEBUG nova.compute.manager [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Refreshing instance network info cache due to event network-changed-e1340afc-6c56-42ae-947b-30329a3d37bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.764 2 DEBUG oslo_concurrency.lockutils [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.764 2 DEBUG oslo_concurrency.lockutils [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:12 compute-1 nova_compute[192795]: 2025-09-30 21:46:12.765 2 DEBUG nova.network.neutron [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Refreshing network info cache for port e1340afc-6c56-42ae-947b-30329a3d37bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:13 compute-1 nova_compute[192795]: 2025-09-30 21:46:13.194 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:13 compute-1 nova_compute[192795]: 2025-09-30 21:46:13.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:13 compute-1 nova_compute[192795]: 2025-09-30 21:46:13.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:14 compute-1 nova_compute[192795]: 2025-09-30 21:46:14.655 2 DEBUG nova.network.neutron [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updated VIF entry in instance network info cache for port e1340afc-6c56-42ae-947b-30329a3d37bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:46:14 compute-1 nova_compute[192795]: 2025-09-30 21:46:14.656 2 DEBUG nova.network.neutron [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updating instance_info_cache with network_info: [{"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:14 compute-1 nova_compute[192795]: 2025-09-30 21:46:14.676 2 DEBUG oslo_concurrency.lockutils [req-f5339caa-3d7c-4409-97ff-dede17b4bdf6 req-08d0f148-33c2-4904-88b5-0e15a0411f4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:14 compute-1 nova_compute[192795]: 2025-09-30 21:46:14.677 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:14 compute-1 nova_compute[192795]: 2025-09-30 21:46:14.677 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:46:14 compute-1 nova_compute[192795]: 2025-09-30 21:46:14.678 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d2339d12-50dd-4f1a-9ae0-62f630de86dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:15 compute-1 nova_compute[192795]: 2025-09-30 21:46:15.972 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updating instance_info_cache with network_info: [{"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:15 compute-1 nova_compute[192795]: 2025-09-30 21:46:15.988 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:15 compute-1 nova_compute[192795]: 2025-09-30 21:46:15.988 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:46:16 compute-1 nova_compute[192795]: 2025-09-30 21:46:16.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:16 compute-1 nova_compute[192795]: 2025-09-30 21:46:16.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:16 compute-1 nova_compute[192795]: 2025-09-30 21:46:16.983 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:18 compute-1 nova_compute[192795]: 2025-09-30 21:46:18.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:19 compute-1 podman[245531]: 2025-09-30 21:46:19.212850451 +0000 UTC m=+0.051690650 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:46:19 compute-1 podman[245532]: 2025-09-30 21:46:19.213230502 +0000 UTC m=+0.050314133 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:46:19 compute-1 podman[245530]: 2025-09-30 21:46:19.244135883 +0000 UTC m=+0.086335372 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Sep 30 21:46:19 compute-1 ovn_controller[94902]: 2025-09-30T21:46:19Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:74:62 10.100.0.4
Sep 30 21:46:19 compute-1 ovn_controller[94902]: 2025-09-30T21:46:19Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:74:62 10.100.0.4
Sep 30 21:46:19 compute-1 nova_compute[192795]: 2025-09-30 21:46:19.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:21 compute-1 nova_compute[192795]: 2025-09-30 21:46:21.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:21.787 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:b1:8a 2001:db8:0:1:f816:3eff:feb2:b18a 2001:db8::f816:3eff:feb2:b18a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feb2:b18a/64 2001:db8::f816:3eff:feb2:b18a/64', 'neutron:device_id': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9942446e-c20f-4a7d-bedc-ac08b4f4b886, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4bd0bdb1-40a0-42f8-98d1-c84ba21808c1) old=Port_Binding(mac=['fa:16:3e:b2:b1:8a 2001:db8::f816:3eff:feb2:b18a'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb2:b18a/64', 'neutron:device_id': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:21.788 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4bd0bdb1-40a0-42f8-98d1-c84ba21808c1 in datapath d22f103a-1a95-4031-ae6e-c474eae9834e updated
Sep 30 21:46:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:21.790 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d22f103a-1a95-4031-ae6e-c474eae9834e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:46:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:21.793 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[87fe6b33-a04b-448e-a6e4-e3046f84529f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:23 compute-1 nova_compute[192795]: 2025-09-30 21:46:23.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:26 compute-1 nova_compute[192795]: 2025-09-30 21:46:26.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:28 compute-1 podman[245604]: 2025-09-30 21:46:28.218333917 +0000 UTC m=+0.058443395 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:28 compute-1 nova_compute[192795]: 2025-09-30 21:46:28.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:31 compute-1 nova_compute[192795]: 2025-09-30 21:46:31.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.032 2 DEBUG nova.compute.manager [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-changed-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.032 2 DEBUG nova.compute.manager [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Refreshing instance network info cache due to event network-changed-e1340afc-6c56-42ae-947b-30329a3d37bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.033 2 DEBUG oslo_concurrency.lockutils [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.033 2 DEBUG oslo_concurrency.lockutils [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.034 2 DEBUG nova.network.neutron [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Refreshing network info cache for port e1340afc-6c56-42ae-947b-30329a3d37bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.182 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.183 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.183 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.184 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.184 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.200 2 INFO nova.compute.manager [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Terminating instance
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.216 2 DEBUG nova.compute.manager [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:46:32 compute-1 kernel: tape1340afc-6c (unregistering): left promiscuous mode
Sep 30 21:46:32 compute-1 NetworkManager[51724]: <info>  [1759268792.2538] device (tape1340afc-6c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:46:32 compute-1 ovn_controller[94902]: 2025-09-30T21:46:32Z|00595|binding|INFO|Releasing lport e1340afc-6c56-42ae-947b-30329a3d37bb from this chassis (sb_readonly=0)
Sep 30 21:46:32 compute-1 ovn_controller[94902]: 2025-09-30T21:46:32Z|00596|binding|INFO|Setting lport e1340afc-6c56-42ae-947b-30329a3d37bb down in Southbound
Sep 30 21:46:32 compute-1 ovn_controller[94902]: 2025-09-30T21:46:32Z|00597|binding|INFO|Removing iface tape1340afc-6c ovn-installed in OVS
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.275 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:74:62 10.100.0.4'], port_security=['fa:16:3e:ba:74:62 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd2339d12-50dd-4f1a-9ae0-62f630de86dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4222e136-df5f-4a10-8189-ca81337b231e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '489b8320-df45-412b-aa25-82c9f2ade333 b0d6122c-cf3e-4e14-83fa-d5e76f9ca4ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2a57ede-7e02-42c9-b7e1-016d9d117d53, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=e1340afc-6c56-42ae-947b-30329a3d37bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.277 103861 INFO neutron.agent.ovn.metadata.agent [-] Port e1340afc-6c56-42ae-947b-30329a3d37bb in datapath 4222e136-df5f-4a10-8189-ca81337b231e unbound from our chassis
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.279 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4222e136-df5f-4a10-8189-ca81337b231e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.280 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bef99790-6dae-4a65-a203-73e2f0048e48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.280 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e namespace which is not needed anymore
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000097.scope: Deactivated successfully.
Sep 30 21:46:32 compute-1 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000097.scope: Consumed 13.548s CPU time.
Sep 30 21:46:32 compute-1 systemd-machined[152783]: Machine qemu-70-instance-00000097 terminated.
Sep 30 21:46:32 compute-1 neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e[245492]: [NOTICE]   (245496) : haproxy version is 2.8.14-c23fe91
Sep 30 21:46:32 compute-1 neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e[245492]: [NOTICE]   (245496) : path to executable is /usr/sbin/haproxy
Sep 30 21:46:32 compute-1 neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e[245492]: [WARNING]  (245496) : Exiting Master process...
Sep 30 21:46:32 compute-1 neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e[245492]: [ALERT]    (245496) : Current worker (245498) exited with code 143 (Terminated)
Sep 30 21:46:32 compute-1 neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e[245492]: [WARNING]  (245496) : All workers exited. Exiting... (0)
Sep 30 21:46:32 compute-1 systemd[1]: libpod-e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030.scope: Deactivated successfully.
Sep 30 21:46:32 compute-1 podman[245646]: 2025-09-30 21:46:32.418679553 +0000 UTC m=+0.043233350 container died e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030-userdata-shm.mount: Deactivated successfully.
Sep 30 21:46:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-a98575c6c6948a35d62e30c8539297ba432f88d5b66ce3205625a6cd2ae12534-merged.mount: Deactivated successfully.
Sep 30 21:46:32 compute-1 podman[245646]: 2025-09-30 21:46:32.451308919 +0000 UTC m=+0.075862716 container cleanup e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:46:32 compute-1 systemd[1]: libpod-conmon-e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030.scope: Deactivated successfully.
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.480 2 INFO nova.virt.libvirt.driver [-] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Instance destroyed successfully.
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.481 2 DEBUG nova.objects.instance [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'resources' on Instance uuid d2339d12-50dd-4f1a-9ae0-62f630de86dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.495 2 DEBUG nova.virt.libvirt.vif [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:46:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1307223704',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1307223704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=151,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH3ScEbiEXmcush+M0FQmOZ+6rNVimr/3Q2KBWRu3bFnwZQjf2et8DcC4gPSfxdcIivp4cpu6A4+jivTW6/B6vuApjM4LJ1qfMVZ9mXCxg5nOjRfTE7oMZoXxkel1KnKAw==',key_name='tempest-TestSecurityGroupsBasicOps-1978918738',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:46:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-0x3a985k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:46:08Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=d2339d12-50dd-4f1a-9ae0-62f630de86dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.496 2 DEBUG nova.network.os_vif_util [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.497 2 DEBUG nova.network.os_vif_util [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:74:62,bridge_name='br-int',has_traffic_filtering=True,id=e1340afc-6c56-42ae-947b-30329a3d37bb,network=Network(4222e136-df5f-4a10-8189-ca81337b231e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1340afc-6c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.497 2 DEBUG os_vif [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:74:62,bridge_name='br-int',has_traffic_filtering=True,id=e1340afc-6c56-42ae-947b-30329a3d37bb,network=Network(4222e136-df5f-4a10-8189-ca81337b231e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1340afc-6c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.501 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1340afc-6c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.509 2 INFO os_vif [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:74:62,bridge_name='br-int',has_traffic_filtering=True,id=e1340afc-6c56-42ae-947b-30329a3d37bb,network=Network(4222e136-df5f-4a10-8189-ca81337b231e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1340afc-6c')
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.509 2 INFO nova.virt.libvirt.driver [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Deleting instance files /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd_del
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.510 2 INFO nova.virt.libvirt.driver [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Deletion of /var/lib/nova/instances/d2339d12-50dd-4f1a-9ae0-62f630de86dd_del complete
Sep 30 21:46:32 compute-1 podman[245688]: 2025-09-30 21:46:32.51688524 +0000 UTC m=+0.042646185 container remove e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.522 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0238258b-0a6d-4566-b5d0-291eb3624caa]: (4, ('Tue Sep 30 09:46:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e (e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030)\ne74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030\nTue Sep 30 09:46:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e (e74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030)\ne74a64d25a2e58520eacc00f8bcf5fed8baed25dcf2bc998fe7c58d6b39f0030\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.524 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[744731dd-7a04-476d-b2a0-99f94fe677bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.525 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4222e136-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 kernel: tap4222e136-d0: left promiscuous mode
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.541 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[32d28a84-be05-40ef-9c82-13bce5695fc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.582 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0939c5f9-5e78-421d-922f-a89b012ad2b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.584 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[19189740-080a-4876-884f-0a35a2dbcd60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.591 2 INFO nova.compute.manager [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.591 2 DEBUG oslo.service.loopingcall [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.592 2 DEBUG nova.compute.manager [-] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:46:32 compute-1 nova_compute[192795]: 2025-09-30 21:46:32.592 2 DEBUG nova.network.neutron [-] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.607 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c133c748-7ed2-4e86-a71d-fe14e3356b03]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546301, 'reachable_time': 31360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245705, 'error': None, 'target': 'ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.609 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4222e136-df5f-4a10-8189-ca81337b231e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:46:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:32.610 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[d420b3ea-4954-4d04-8222-bbff233077c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:32 compute-1 systemd[1]: run-netns-ovnmeta\x2d4222e136\x2ddf5f\x2d4a10\x2d8189\x2dca81337b231e.mount: Deactivated successfully.
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.390 2 DEBUG nova.compute.manager [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-vif-unplugged-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.390 2 DEBUG oslo_concurrency.lockutils [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.390 2 DEBUG oslo_concurrency.lockutils [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.390 2 DEBUG oslo_concurrency.lockutils [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.391 2 DEBUG nova.compute.manager [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] No waiting events found dispatching network-vif-unplugged-e1340afc-6c56-42ae-947b-30329a3d37bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.391 2 DEBUG nova.compute.manager [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-vif-unplugged-e1340afc-6c56-42ae-947b-30329a3d37bb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.391 2 DEBUG nova.compute.manager [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.391 2 DEBUG oslo_concurrency.lockutils [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.391 2 DEBUG oslo_concurrency.lockutils [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.391 2 DEBUG oslo_concurrency.lockutils [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.392 2 DEBUG nova.compute.manager [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] No waiting events found dispatching network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.392 2 WARNING nova.compute.manager [req-9dc963e8-c323-48f6-a249-eb4d16136839 req-1d482a11-e419-471e-8dd6-acf19ad1b364 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received unexpected event network-vif-plugged-e1340afc-6c56-42ae-947b-30329a3d37bb for instance with vm_state active and task_state deleting.
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.731 2 DEBUG nova.network.neutron [-] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.750 2 INFO nova.compute.manager [-] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Took 1.16 seconds to deallocate network for instance.
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.853 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.853 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.865 2 DEBUG nova.compute.manager [req-2282a88a-2bf5-497c-a91d-b6a5d17b7a37 req-b0cafb86-7282-46a3-b4bc-ccac703f6c63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Received event network-vif-deleted-e1340afc-6c56-42ae-947b-30329a3d37bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.933 2 DEBUG nova.compute.provider_tree [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.947 2 DEBUG nova.scheduler.client.report [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:46:33 compute-1 nova_compute[192795]: 2025-09-30 21:46:33.976 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:34 compute-1 nova_compute[192795]: 2025-09-30 21:46:34.003 2 INFO nova.scheduler.client.report [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Deleted allocations for instance d2339d12-50dd-4f1a-9ae0-62f630de86dd
Sep 30 21:46:34 compute-1 nova_compute[192795]: 2025-09-30 21:46:34.071 2 DEBUG oslo_concurrency.lockutils [None req-ce413f56-d217-443a-91ed-17f32f56ecef c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "d2339d12-50dd-4f1a-9ae0-62f630de86dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:34 compute-1 nova_compute[192795]: 2025-09-30 21:46:34.148 2 DEBUG nova.network.neutron [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updated VIF entry in instance network info cache for port e1340afc-6c56-42ae-947b-30329a3d37bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:46:34 compute-1 nova_compute[192795]: 2025-09-30 21:46:34.148 2 DEBUG nova.network.neutron [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Updating instance_info_cache with network_info: [{"id": "e1340afc-6c56-42ae-947b-30329a3d37bb", "address": "fa:16:3e:ba:74:62", "network": {"id": "4222e136-df5f-4a10-8189-ca81337b231e", "bridge": "br-int", "label": "tempest-network-smoke--1432542759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1340afc-6c", "ovs_interfaceid": "e1340afc-6c56-42ae-947b-30329a3d37bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:34 compute-1 nova_compute[192795]: 2025-09-30 21:46:34.179 2 DEBUG oslo_concurrency.lockutils [req-c32a9147-f5f9-4e3c-945b-68bd94225495 req-aa3699fa-037d-4dbc-9782-9d28cbcb655d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-d2339d12-50dd-4f1a-9ae0-62f630de86dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:34 compute-1 nova_compute[192795]: 2025-09-30 21:46:34.381 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Creating tmpfile /var/lib/nova/instances/tmp83k0dk2g to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Sep 30 21:46:34 compute-1 nova_compute[192795]: 2025-09-30 21:46:34.497 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp83k0dk2g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Sep 30 21:46:35 compute-1 nova_compute[192795]: 2025-09-30 21:46:35.538 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp83k0dk2g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1ffcb24-6b20-4ed8-8287-65fb4e98d88c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Sep 30 21:46:35 compute-1 nova_compute[192795]: 2025-09-30 21:46:35.567 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:35 compute-1 nova_compute[192795]: 2025-09-30 21:46:35.568 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquired lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:35 compute-1 nova_compute[192795]: 2025-09-30 21:46:35.568 2 DEBUG nova.network.neutron [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:46:37 compute-1 nova_compute[192795]: 2025-09-30 21:46:37.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:38 compute-1 podman[245706]: 2025-09-30 21:46:38.227635912 +0000 UTC m=+0.065363907 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:38 compute-1 podman[245708]: 2025-09-30 21:46:38.235002474 +0000 UTC m=+0.066999567 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:46:38 compute-1 podman[245707]: 2025-09-30 21:46:38.270608384 +0000 UTC m=+0.105839387 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:46:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:38.704 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:38.705 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:38.705 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:38 compute-1 nova_compute[192795]: 2025-09-30 21:46:38.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.168 2 DEBUG nova.network.neutron [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.192 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Releasing lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.207 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp83k0dk2g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1ffcb24-6b20-4ed8-8287-65fb4e98d88c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.207 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Creating instance directory: /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.208 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Creating disk.info with the contents: {'/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk': 'qcow2', '/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.208 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.209 2 DEBUG nova.objects.instance [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c1ffcb24-6b20-4ed8-8287-65fb4e98d88c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.252 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.313 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.314 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.315 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.326 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.380 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.381 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.416 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.418 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.418 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.484 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.485 2 DEBUG nova.virt.disk.api [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Checking if we can resize image /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.486 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.547 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.549 2 DEBUG nova.virt.disk.api [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Cannot resize image /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.549 2 DEBUG nova.objects.instance [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'migration_context' on Instance uuid c1ffcb24-6b20-4ed8-8287-65fb4e98d88c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.564 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.596 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.598 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config to /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:46:39 compute-1 nova_compute[192795]: 2025-09-30 21:46:39.599 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.115 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.117 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.118 2 DEBUG nova.virt.libvirt.vif [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-179264408',display_name='tempest-TestNetworkAdvancedServerOps-server-179264408',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-179264408',id=152,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOjZFWrOkGLtM22QkbWpGIMaBp+BaIFIIqX3JnKMPx79D7bs+sVkeOWPP9lZzJjeCe2zeve1vj7ZAmUVopE269mNakJeQ6oTVT41jDINH3f5N8CxQSQQns3MXA4tC9JjTQ==',key_name='tempest-TestNetworkAdvancedServerOps-144739839',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:46:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-xa5oqg6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:46:11Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c1ffcb24-6b20-4ed8-8287-65fb4e98d88c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.118 2 DEBUG nova.network.os_vif_util [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converting VIF {"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.119 2 DEBUG nova.network.os_vif_util [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.119 2 DEBUG os_vif [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.121 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.121 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.124 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d27cdd-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.124 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7d27cdd-9f, col_values=(('external_ids', {'iface-id': 'd7d27cdd-9f0b-467f-901a-08b4834d1496', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:ba:7b', 'vm-uuid': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:40 compute-1 NetworkManager[51724]: <info>  [1759268800.1270] manager: (tapd7d27cdd-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.133 2 INFO os_vif [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f')
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.134 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Sep 30 21:46:40 compute-1 nova_compute[192795]: 2025-09-30 21:46:40.134 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp83k0dk2g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1ffcb24-6b20-4ed8-8287-65fb4e98d88c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Sep 30 21:46:42 compute-1 nova_compute[192795]: 2025-09-30 21:46:42.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-1 podman[245797]: 2025-09-30 21:46:43.212467496 +0000 UTC m=+0.057723207 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.230 2 DEBUG nova.network.neutron [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Port d7d27cdd-9f0b-467f-901a-08b4834d1496 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.245 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp83k0dk2g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1ffcb24-6b20-4ed8-8287-65fb4e98d88c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Sep 30 21:46:43 compute-1 systemd[1]: Starting libvirt proxy daemon...
Sep 30 21:46:43 compute-1 systemd[1]: Started libvirt proxy daemon.
Sep 30 21:46:43 compute-1 kernel: tapd7d27cdd-9f: entered promiscuous mode
Sep 30 21:46:43 compute-1 NetworkManager[51724]: <info>  [1759268803.5642] manager: (tapd7d27cdd-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Sep 30 21:46:43 compute-1 ovn_controller[94902]: 2025-09-30T21:46:43Z|00598|binding|INFO|Claiming lport d7d27cdd-9f0b-467f-901a-08b4834d1496 for this additional chassis.
Sep 30 21:46:43 compute-1 ovn_controller[94902]: 2025-09-30T21:46:43Z|00599|binding|INFO|d7d27cdd-9f0b-467f-901a-08b4834d1496: Claiming fa:16:3e:0d:ba:7b 10.100.0.4
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-1 NetworkManager[51724]: <info>  [1759268803.5839] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Sep 30 21:46:43 compute-1 NetworkManager[51724]: <info>  [1759268803.5848] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-1 systemd-udevd[245847]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:46:43 compute-1 NetworkManager[51724]: <info>  [1759268803.6113] device (tapd7d27cdd-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:46:43 compute-1 NetworkManager[51724]: <info>  [1759268803.6130] device (tapd7d27cdd-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:46:43 compute-1 systemd-machined[152783]: New machine qemu-71-instance-00000098.
Sep 30 21:46:43 compute-1 systemd[1]: Started Virtual Machine qemu-71-instance-00000098.
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-1 ovn_controller[94902]: 2025-09-30T21:46:43Z|00600|binding|INFO|Setting lport d7d27cdd-9f0b-467f-901a-08b4834d1496 ovn-installed in OVS
Sep 30 21:46:43 compute-1 nova_compute[192795]: 2025-09-30 21:46:43.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:44.025 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000098', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'hostId': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:46:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:44.027 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:46:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:44.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:44.027 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:44.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:46:44 compute-1 nova_compute[192795]: 2025-09-30 21:46:44.973 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268804.973167, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:44 compute-1 nova_compute[192795]: 2025-09-30 21:46:44.974 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Started (Lifecycle Event)
Sep 30 21:46:44 compute-1 nova_compute[192795]: 2025-09-30 21:46:44.992 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:45 compute-1 nova_compute[192795]: 2025-09-30 21:46:45.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:45 compute-1 nova_compute[192795]: 2025-09-30 21:46:45.698 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268805.6978526, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:45 compute-1 nova_compute[192795]: 2025-09-30 21:46:45.699 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Resumed (Lifecycle Event)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.727 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.728 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 nova_compute[192795]: 2025-09-30 21:46:45.729 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a53a10e-e020-4581-9816-3b455ccaf420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:44.027871', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5cdb79e-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': '4aeb92638202630604bf4de06904bfc2ccd9d59b9e62f3f15e485b3b85853070'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:44.027871', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5cdc54a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': 'ae30bf3c20fd945b2946a1d6c9c08c180cb4824ced05b8675662d281f02788c2'}]}, 'timestamp': '2025-09-30 21:46:45.728920', '_unique_id': '8e0e38d67aa74d598c587f93bce5b44b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.730 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:46:45 compute-1 nova_compute[192795]: 2025-09-30 21:46:45.733 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.733 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c1ffcb24-6b20-4ed8-8287-65fb4e98d88c / tapd7d27cdd-9f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.733 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39464a43-2a01-4a68-b550-b812b8c22762', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.731389', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5ce9024-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': '07a8779f17fd67945d99e2da2e3fc8722f3feb24c3edefcebaaae910853817ea'}]}, 'timestamp': '2025-09-30 21:46:45.734112', '_unique_id': 'c44743521ab24c63bfbbf47ca8ca8f43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.734 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.735 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.735 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.735 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.751 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.752 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee9685f3-a677-4d2e-bbbe-70237bbe6d3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.736129', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d15070-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.468878293, 'message_signature': '6495d9c431220f7a65cae67fee72d76da5f2b78c10d6621c890b9256262baae3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.736129', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d15de0-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.468878293, 'message_signature': 'e2593dd5f94495f09c09c7e58396a9670d3ceddaae3ea7951411fb98838ae82b'}]}, 'timestamp': '2025-09-30 21:46:45.752498', '_unique_id': '51bf5369b34448299b41e4c91c75df25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.753 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.754 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.755 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.755 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a13110f-c24d-46b7-b4c0-995f02a74755', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.755085', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d1ce74-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': '8ef1f41999683f7a63e3b6f322b2014831be50c08158d0739bc9e4b6f1369eec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.755085', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d1d914-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': '406145dc7f24db44f8e3af49d845fb48da68b018495c973bc17c0bc88902ed20'}]}, 'timestamp': '2025-09-30 21:46:45.755621', '_unique_id': '1c024aa84ee041668839114351bebd97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.756 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.757 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.757 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.bytes volume: 338 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb7515d1-8a66-4460-b5e5-ff2f88451d61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 338, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.757232', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d224dc-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': 'dd48ecd20583949c72dff9b74becd4f6e69d7d74f49f99a45af6ff1e71a9e322'}]}, 'timestamp': '2025-09-30 21:46:45.757575', '_unique_id': '43b1a1b7428d4bcab6b7eb0f76285da4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.758 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.759 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.759 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.759 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.packets volume: 5 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7eb759c6-ec7e-4ba4-a4a6-b200a9e08e29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 5, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.759590', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d27f54-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': '1df294a9ff9bdc1c06303c312547611efd1ca6a165247b3b32e66b5ea39bcf25'}]}, 'timestamp': '2025-09-30 21:46:45.759889', '_unique_id': '905ee0a112d8461e91a18ba7f19ca1d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.760 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.761 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.761 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.bytes volume: 742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2439646a-6ca8-4abc-a84b-6bf5dbcc4d6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 742, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.761400', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d2c54a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': '19afc575692290fc199373d058919aa40c8a57dfda47e83d6ce2dd7ff4b6334a'}]}, 'timestamp': '2025-09-30 21:46:45.761680', '_unique_id': '92cba60b874848d4b47de1bec79b47f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.762 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.763 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2654c014-0e66-4465-b9a1-cc33332b865b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.763714', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d32044-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': '80efeca2ed4606a06fb74c453dfb4c28a446feaad1d17342a06b5eb8dbfe9788'}]}, 'timestamp': '2025-09-30 21:46:45.764046', '_unique_id': 'c0e985dac6cc487292ed1cf16f12a78f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.765 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 nova_compute[192795]: 2025-09-30 21:46:45.767 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9654ddec-4453-452b-8c5b-ca0367e397ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.765701', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d36e32-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': 'e1605576e5dea6f25316a9dde064225558534060c46083456c2333ac4de02afe'}]}, 'timestamp': '2025-09-30 21:46:45.766028', '_unique_id': '0179c5bf34044fdfa7ad662a9d6a8019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.767 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.768 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.768 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45d76c47-f4b8-4e13-b5c3-595b13058cfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.768548', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d3dd4a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.468878293, 'message_signature': 'd3c0a79a6e960c56c166d260e2118bc6fa1feccc1a95a4d5845a6c04b603ae18'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.768548', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d3e83a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.468878293, 'message_signature': '36bf8a591203b1686efb79fe2abfc51f3c80b42338d6397ae6c60757a4130cff'}]}, 'timestamp': '2025-09-30 21:46:45.769125', '_unique_id': '9eafb5d7435f4448a14201afc7f87601'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.770 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.771 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.771 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fdfbbae-6035-4ca7-8696-80abf32314bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.771114', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d43f7e-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': 'e94cd719e00b195a21092db26ff62da5b0ea5b933a4c4e3fd7f002517e063b6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.771114', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d44aa0-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': 'a2cabac0444ca6a1a610fd3a4e89b283f9b31e2148f2bad2365dcd2a73ea0025'}]}, 'timestamp': '2025-09-30 21:46:45.771652', '_unique_id': '754b2374ae7b4b1eab9ff59dcbb4b29d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.772 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.773 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.773 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '938e0f1b-95b7-4429-bfbe-3d3b8f122061', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.773177', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d490e6-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.468878293, 'message_signature': '1903165c91fcc37705372d09e0cce7131c45b1417166ca0fba1a493b80d08744'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 
'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.773177', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d498b6-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.468878293, 'message_signature': 'b2c63151c516c309dab6a62014482c79eb343208816d08d074d5141c12a3df83'}]}, 'timestamp': '2025-09-30 21:46:45.773640', '_unique_id': '5a238c9624a5402e9c7b9ce9b013ac99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.774 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2361b2c2-1a65-4d1d-bde6-34ef5508e826', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.774790', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d4d010-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': '8ead9324f4154e7e6ca3840a2c3f04c62624707eebfa093e415d1bbbc3048d46'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': 
None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.774790', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d4d9d4-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': 'f7551f9eba5c84ea5a4fccf7c28873edc537e5b3a89d0ec30c8e8b6ee1cb3d4c'}]}, 'timestamp': '2025-09-30 21:46:45.775325', '_unique_id': 'df01475e63724ab999f8f06b1ea55068'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.775 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.776 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.776 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52d354a4-09b7-468c-92f1-b200973090df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.776659', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d5182c-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': '7c3f04e3e50b56de3b7da828b8491ba95219944a6372f6523d390aa0e1a87352'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.776659', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d52128-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': '25be18326a503fe7eedb6d22bf54a71ced11798b86c9563cea26be8a25b94a32'}]}, 'timestamp': '2025-09-30 21:46:45.777126', '_unique_id': '3a6f633f65984e0cb50fb8ff41da2348'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.777 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.778 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73efa897-e900-487e-9a71-8ac7f99a3faf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.778429', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d55e4a-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': '536624a6e3c5a7d26787b5bd61962614dd5cb3a228a8f51dff0f189233bd7b30'}]}, 'timestamp': '2025-09-30 21:46:45.778741', '_unique_id': '8ebbb87ed9c14c5bbf6b7931e2c2924a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.779 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.780 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.795 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/memory.usage volume: 42.78515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cdf4faf-4106-4e49-ac2c-5f92b3aa31af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.78515625, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'timestamp': '2025-09-30T21:46:45.780144', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f5d80898-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.528263991, 'message_signature': 'fc8b2031922868aa35a785bcd70fde22900333f97fe96ce7452d884e8b69c893'}]}, 'timestamp': '2025-09-30 21:46:45.796248', '_unique_id': '8ac0b4183fd04ca591d35513130dd6c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1d3aa48-8d18-4ab0-bff0-1348c2d70fcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.798060', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d85c80-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': '6a5defbb546100ea422b9730475d03dd94248597d8b8b6b84624dd52449e1939'}]}, 'timestamp': '2025-09-30 21:46:45.798343', '_unique_id': '2632743fb7a945189aeb7f932e35e9a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.798 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.799 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.799 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61a141ad-bd99-4bc5-a36f-4b346d7db6fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.799587', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d8975e-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': '55a057284a35cfc300764412ea1af2cf4a7148398aeac2a43a5d820b830e6b06'}]}, 'timestamp': '2025-09-30 21:46:45.799825', '_unique_id': '1b817c086c61448994ab26d83ca3b730'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.800 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.801 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.801 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.801 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8ae4a0b-db40-48ac-be35-3067c40fbe0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:45.801276', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f5d8dba6-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': '7c2935796e05b29f818209eb7af73c3abb73d0e2da6b4137b382171230ecb8c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:45.801276', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f5d8e5c4-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5499.760623625, 'message_signature': 'b929541f29700f8f8b25dab78a846a1ad3f658012d6b08a416a2927f3e300b1e'}]}, 'timestamp': '2025-09-30 21:46:45.801835', '_unique_id': 'de2ecbf9e64346ccbe23786255f1d5f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.802 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51198fac-658a-433a-be9a-c601d2c81246', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:45.803082', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f5d91f94-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.464180037, 'message_signature': 'e874a0703c73947b5c183d3fb6bee990ca8994b7879cc408f2757774060a7516'}]}, 'timestamp': '2025-09-30 21:46:45.803333', '_unique_id': '6c4ea7c4d70448b9b91b301c04a6c890'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.803 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.804 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.804 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9685d375-faee-4147-8230-184c522340d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'timestamp': '2025-09-30T21:46:45.804381', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f5d95220-9e46-11f0-984a-fa163e8033fc', 'monotonic_time': 5501.528263991, 'message_signature': 'e1bbf8af76895d6edeb3d5abb885854d1c01858476cb4b31eecd1e4fc316c874'}]}, 'timestamp': '2025-09-30 21:46:45.804598', '_unique_id': 'cffd91737ba04a7cbcd874e74847a6ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:45 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:46:45.805 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:47 compute-1 nova_compute[192795]: 2025-09-30 21:46:47.479 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268792.4781034, d2339d12-50dd-4f1a-9ae0-62f630de86dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:47 compute-1 nova_compute[192795]: 2025-09-30 21:46:47.479 2 INFO nova.compute.manager [-] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] VM Stopped (Lifecycle Event)
Sep 30 21:46:47 compute-1 nova_compute[192795]: 2025-09-30 21:46:47.511 2 DEBUG nova.compute.manager [None req-87f9ba6b-f53c-4285-a329-d4a617ea14b0 - - - - - -] [instance: d2339d12-50dd-4f1a-9ae0-62f630de86dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:47 compute-1 ovn_controller[94902]: 2025-09-30T21:46:47Z|00601|binding|INFO|Claiming lport d7d27cdd-9f0b-467f-901a-08b4834d1496 for this chassis.
Sep 30 21:46:47 compute-1 ovn_controller[94902]: 2025-09-30T21:46:47Z|00602|binding|INFO|d7d27cdd-9f0b-467f-901a-08b4834d1496: Claiming fa:16:3e:0d:ba:7b 10.100.0.4
Sep 30 21:46:47 compute-1 ovn_controller[94902]: 2025-09-30T21:46:47Z|00603|binding|INFO|Setting lport d7d27cdd-9f0b-467f-901a-08b4834d1496 up in Southbound
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.716 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:ba:7b 10.100.0.4'], port_security=['fa:16:3e:0d:ba:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'cf7d6f50-5bb5-4e27-9140-45f441e5642f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.221'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5da0e449-d17e-4f7d-8049-f4559cb24f52, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d7d27cdd-9f0b-467f-901a-08b4834d1496) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.717 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d7d27cdd-9f0b-467f-901a-08b4834d1496 in datapath 5bda010c-f47e-4b74-9f9e-0682da4daba2 bound to our chassis
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.719 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bda010c-f47e-4b74-9f9e-0682da4daba2
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.732 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d56dc94d-6864-4566-8239-72a739cd4dcb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.733 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5bda010c-f1 in ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.736 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5bda010c-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.736 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b729e7ea-cd4d-4ceb-a75c-47779f4e1ec9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.737 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e5b957-088d-47f2-90ee-c2bbed0b4091]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.741 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:47 compute-1 nova_compute[192795]: 2025-09-30 21:46:47.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.750 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[708df85e-a491-4fc9-af50-eefb0a15c18e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.775 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce7ee00-798d-43ab-b959-b9776cd566a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.805 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9410e736-0c2c-47da-97f1-99a0355576fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 NetworkManager[51724]: <info>  [1759268807.8116] manager: (tap5bda010c-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/299)
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.810 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6774e2ba-f06c-4db7-8d0e-e7432e342c28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 systemd-udevd[245885]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.838 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b1acacc1-79a0-476a-a5a7-0c3f0c2ee66b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.841 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4dadbd-d2ba-4f50-ad2a-5661abdbd04a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 NetworkManager[51724]: <info>  [1759268807.8919] device (tap5bda010c-f0): carrier: link connected
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.900 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e27dadaa-03f2-4238-99ba-d463b75e2690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.918 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[00fc1aa3-82ca-495a-9f83-abc6e85fd162]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda010c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:01:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550356, 'reachable_time': 40982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245904, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.933 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f1df81c5-2fec-4526-871d-4f0f9acf0c88]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:15e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550356, 'tstamp': 550356}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245905, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.952 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff54281-0640-46cc-bca3-d5f6aa4d5873]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda010c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:01:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550356, 'reachable_time': 40982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245906, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:47 compute-1 nova_compute[192795]: 2025-09-30 21:46:47.966 2 INFO nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Post operation of migration started
Sep 30 21:46:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:47.980 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f0711009-1185-4a36-a6a6-0803470b2fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.036 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[eed092b9-64a2-48d6-8ae3-a2f33720499a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.037 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda010c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.038 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.038 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bda010c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:48 compute-1 kernel: tap5bda010c-f0: entered promiscuous mode
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:48 compute-1 NetworkManager[51724]: <info>  [1759268808.0427] manager: (tap5bda010c-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.043 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bda010c-f0, col_values=(('external_ids', {'iface-id': '1e783af7-70e3-416e-b2fa-da6490df0917'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:48 compute-1 ovn_controller[94902]: 2025-09-30T21:46:48Z|00604|binding|INFO|Releasing lport 1e783af7-70e3-416e-b2fa-da6490df0917 from this chassis (sb_readonly=0)
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.047 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bda010c-f47e-4b74-9f9e-0682da4daba2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bda010c-f47e-4b74-9f9e-0682da4daba2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.048 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[880d4646-a67f-4c3b-895d-1168ed5b9db1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.049 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-5bda010c-f47e-4b74-9f9e-0682da4daba2
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/5bda010c-f47e-4b74-9f9e-0682da4daba2.pid.haproxy
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 5bda010c-f47e-4b74-9f9e-0682da4daba2
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.050 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'env', 'PROCESS_TAG=haproxy-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5bda010c-f47e-4b74-9f9e-0682da4daba2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.131 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.131 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.180 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.383 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.384 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.390 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.390 2 INFO nova.compute.claims [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:46:48 compute-1 podman[245939]: 2025-09-30 21:46:48.40891193 +0000 UTC m=+0.045078645 container create 4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:46:48 compute-1 systemd[1]: Started libpod-conmon-4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f.scope.
Sep 30 21:46:48 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:46:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d455adef5912111f9252200bb03b83588d7f58769012e534864a5db006711397/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:46:48 compute-1 podman[245939]: 2025-09-30 21:46:48.477876225 +0000 UTC m=+0.114042960 container init 4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:46:48 compute-1 podman[245939]: 2025-09-30 21:46:48.386011954 +0000 UTC m=+0.022178699 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.480 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.482 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquired lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.482 2 DEBUG nova.network.neutron [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:46:48 compute-1 podman[245939]: 2025-09-30 21:46:48.483308839 +0000 UTC m=+0.119475554 container start 4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:46:48 compute-1 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[245954]: [NOTICE]   (245958) : New worker (245960) forked
Sep 30 21:46:48 compute-1 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[245954]: [NOTICE]   (245958) : Loading success.
Sep 30 21:46:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:48.542 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.550 2 DEBUG nova.compute.provider_tree [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.575 2 DEBUG nova.scheduler.client.report [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.596 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.597 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.660 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.661 2 DEBUG nova.network.neutron [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.689 2 INFO nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.708 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.845 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.846 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.847 2 INFO nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Creating image(s)
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.847 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "/var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.848 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.848 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.866 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.946 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.947 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.947 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:48 compute-1 nova_compute[192795]: 2025-09-30 21:46:48.959 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.013 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.014 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.048 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.049 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.049 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.112 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.113 2 DEBUG nova.virt.disk.api [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Checking if we can resize image /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.114 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.165 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.166 2 DEBUG nova.virt.disk.api [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Cannot resize image /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.166 2 DEBUG nova.objects.instance [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'migration_context' on Instance uuid 89b1d81c-69be-444e-8ba0-7bd444fa4d41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.181 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.181 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Ensure instance console log exists: /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.182 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.182 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.182 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:49 compute-1 nova_compute[192795]: 2025-09-30 21:46:49.840 2 DEBUG nova.policy [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27859618cb1d493cb1531af26b200b92', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '043721d1d0a2480fa785367fa56c1fa4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:46:50 compute-1 nova_compute[192795]: 2025-09-30 21:46:50.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:50 compute-1 podman[245985]: 2025-09-30 21:46:50.22610021 +0000 UTC m=+0.057444391 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:46:50 compute-1 podman[245986]: 2025-09-30 21:46:50.22612235 +0000 UTC m=+0.053762649 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923)
Sep 30 21:46:50 compute-1 podman[245984]: 2025-09-30 21:46:50.234428286 +0000 UTC m=+0.067480909 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Sep 30 21:46:50 compute-1 nova_compute[192795]: 2025-09-30 21:46:50.717 2 DEBUG nova.network.neutron [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:50 compute-1 nova_compute[192795]: 2025-09-30 21:46:50.737 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Releasing lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:50 compute-1 nova_compute[192795]: 2025-09-30 21:46:50.768 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:50 compute-1 nova_compute[192795]: 2025-09-30 21:46:50.768 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:50 compute-1 nova_compute[192795]: 2025-09-30 21:46:50.769 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:50 compute-1 nova_compute[192795]: 2025-09-30 21:46:50.773 2 INFO nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 21:46:50 compute-1 virtqemud[192217]: Domain id=71 name='instance-00000098' uuid=c1ffcb24-6b20-4ed8-8287-65fb4e98d88c is tainted: custom-monitor
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.022 2 DEBUG nova.network.neutron [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Successfully updated port: 1c535e54-4f52-4e80-8bfb-3ee8946adf22 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.041 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "refresh_cache-89b1d81c-69be-444e-8ba0-7bd444fa4d41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.041 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquired lock "refresh_cache-89b1d81c-69be-444e-8ba0-7bd444fa4d41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.041 2 DEBUG nova.network.neutron [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.115 2 DEBUG nova.compute.manager [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received event network-changed-1c535e54-4f52-4e80-8bfb-3ee8946adf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.116 2 DEBUG nova.compute.manager [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Refreshing instance network info cache due to event network-changed-1c535e54-4f52-4e80-8bfb-3ee8946adf22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.116 2 DEBUG oslo_concurrency.lockutils [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-89b1d81c-69be-444e-8ba0-7bd444fa4d41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.302 2 DEBUG nova.network.neutron [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:46:51 compute-1 nova_compute[192795]: 2025-09-30 21:46:51.781 2 INFO nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.702 2 DEBUG nova.network.neutron [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Updating instance_info_cache with network_info: [{"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.787 2 INFO nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.791 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.838 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Releasing lock "refresh_cache-89b1d81c-69be-444e-8ba0-7bd444fa4d41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.838 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Instance network_info: |[{"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.839 2 DEBUG oslo_concurrency.lockutils [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-89b1d81c-69be-444e-8ba0-7bd444fa4d41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.839 2 DEBUG nova.network.neutron [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Refreshing network info cache for port 1c535e54-4f52-4e80-8bfb-3ee8946adf22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.841 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Start _get_guest_xml network_info=[{"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.845 2 WARNING nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.853 2 DEBUG nova.virt.libvirt.host [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.854 2 DEBUG nova.virt.libvirt.host [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.861 2 DEBUG nova.virt.libvirt.host [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.862 2 DEBUG nova.virt.libvirt.host [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.863 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.863 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.863 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.864 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.864 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.864 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.864 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.864 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.865 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.865 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.866 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.866 2 DEBUG nova.virt.hardware [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.869 2 DEBUG nova.virt.libvirt.vif [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-496048242',display_name='tempest-TestNetworkBasicOps-server-496048242',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-496048242',id=155,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKDNjO5Jo8chzIkSZNaqeGTK+iihzCOepiRiJv/9rD8Rq/ByvlO+t5HPHK4rXiD0koDg+6gj5n6aypotfN5IfEHHv7vHuusBTBXY8Lz1oyWQe//C/2+Bp2ImRjH6jhcTww==',key_name='tempest-TestNetworkBasicOps-1175594242',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-u9s8yu4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:46:48Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=89b1d81c-69be-444e-8ba0-7bd444fa4d41,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.869 2 DEBUG nova.network.os_vif_util [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.870 2 DEBUG nova.network.os_vif_util [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:55:03,bridge_name='br-int',has_traffic_filtering=True,id=1c535e54-4f52-4e80-8bfb-3ee8946adf22,network=Network(8c645c67-da61-4c69-b798-016ded873d36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c535e54-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.871 2 DEBUG nova.objects.instance [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 89b1d81c-69be-444e-8ba0-7bd444fa4d41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:52 compute-1 nova_compute[192795]: 2025-09-30 21:46:52.882 2 DEBUG nova.objects.instance [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.005 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <uuid>89b1d81c-69be-444e-8ba0-7bd444fa4d41</uuid>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <name>instance-0000009b</name>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkBasicOps-server-496048242</nova:name>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:46:52</nova:creationTime>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:user uuid="27859618cb1d493cb1531af26b200b92">tempest-TestNetworkBasicOps-2126023928-project-member</nova:user>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:project uuid="043721d1d0a2480fa785367fa56c1fa4">tempest-TestNetworkBasicOps-2126023928</nova:project>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         <nova:port uuid="1c535e54-4f52-4e80-8bfb-3ee8946adf22">
Sep 30 21:46:53 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <system>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <entry name="serial">89b1d81c-69be-444e-8ba0-7bd444fa4d41</entry>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <entry name="uuid">89b1d81c-69be-444e-8ba0-7bd444fa4d41</entry>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </system>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <os>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   </os>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <features>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   </features>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk.config"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:57:55:03"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <target dev="tap1c535e54-4f"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/console.log" append="off"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <video>
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </video>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:46:53 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:46:53 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:46:53 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:46:53 compute-1 nova_compute[192795]: </domain>
Sep 30 21:46:53 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.007 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Preparing to wait for external event network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.009 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.009 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.010 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.012 2 DEBUG nova.virt.libvirt.vif [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-496048242',display_name='tempest-TestNetworkBasicOps-server-496048242',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-496048242',id=155,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKDNjO5Jo8chzIkSZNaqeGTK+iihzCOepiRiJv/9rD8Rq/ByvlO+t5HPHK4rXiD0koDg+6gj5n6aypotfN5IfEHHv7vHuusBTBXY8Lz1oyWQe//C/2+Bp2ImRjH6jhcTww==',key_name='tempest-TestNetworkBasicOps-1175594242',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-u9s8yu4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:46:48Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=89b1d81c-69be-444e-8ba0-7bd444fa4d41,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.013 2 DEBUG nova.network.os_vif_util [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.015 2 DEBUG nova.network.os_vif_util [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:55:03,bridge_name='br-int',has_traffic_filtering=True,id=1c535e54-4f52-4e80-8bfb-3ee8946adf22,network=Network(8c645c67-da61-4c69-b798-016ded873d36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c535e54-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.016 2 DEBUG os_vif [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:55:03,bridge_name='br-int',has_traffic_filtering=True,id=1c535e54-4f52-4e80-8bfb-3ee8946adf22,network=Network(8c645c67-da61-4c69-b798-016ded873d36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c535e54-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.017 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.018 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.024 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c535e54-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.025 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c535e54-4f, col_values=(('external_ids', {'iface-id': '1c535e54-4f52-4e80-8bfb-3ee8946adf22', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:55:03', 'vm-uuid': '89b1d81c-69be-444e-8ba0-7bd444fa4d41'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:53 compute-1 NetworkManager[51724]: <info>  [1759268813.0278] manager: (tap1c535e54-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.034 2 INFO os_vif [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:55:03,bridge_name='br-int',has_traffic_filtering=True,id=1c535e54-4f52-4e80-8bfb-3ee8946adf22,network=Network(8c645c67-da61-4c69-b798-016ded873d36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c535e54-4f')
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.152 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.153 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.153 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No VIF found with MAC fa:16:3e:57:55:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.153 2 INFO nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Using config drive
Sep 30 21:46:53 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:53.545 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.802 2 INFO nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Creating config drive at /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk.config
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.807 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwyq9aoim execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:53 compute-1 nova_compute[192795]: 2025-09-30 21:46:53.929 2 DEBUG oslo_concurrency.processutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwyq9aoim" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:53 compute-1 kernel: tap1c535e54-4f: entered promiscuous mode
Sep 30 21:46:53 compute-1 NetworkManager[51724]: <info>  [1759268813.9885] manager: (tap1c535e54-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Sep 30 21:46:54 compute-1 ovn_controller[94902]: 2025-09-30T21:46:54Z|00605|binding|INFO|Claiming lport 1c535e54-4f52-4e80-8bfb-3ee8946adf22 for this chassis.
Sep 30 21:46:54 compute-1 ovn_controller[94902]: 2025-09-30T21:46:54Z|00606|binding|INFO|1c535e54-4f52-4e80-8bfb-3ee8946adf22: Claiming fa:16:3e:57:55:03 10.100.0.13
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:54 compute-1 systemd-udevd[246063]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.019 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:55:03 10.100.0.13'], port_security=['fa:16:3e:57:55:03 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-431926733', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '89b1d81c-69be-444e-8ba0-7bd444fa4d41', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c645c67-da61-4c69-b798-016ded873d36', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-431926733', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '7', 'neutron:security_group_ids': '24f12cb1-976c-4b06-8e76-7f11c490f264', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43321272-a507-4fd9-b66d-912e773448bc, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=1c535e54-4f52-4e80-8bfb-3ee8946adf22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:54 compute-1 ovn_controller[94902]: 2025-09-30T21:46:54Z|00607|binding|INFO|Setting lport 1c535e54-4f52-4e80-8bfb-3ee8946adf22 ovn-installed in OVS
Sep 30 21:46:54 compute-1 ovn_controller[94902]: 2025-09-30T21:46:54Z|00608|binding|INFO|Setting lport 1c535e54-4f52-4e80-8bfb-3ee8946adf22 up in Southbound
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.021 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 1c535e54-4f52-4e80-8bfb-3ee8946adf22 in datapath 8c645c67-da61-4c69-b798-016ded873d36 bound to our chassis
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.025 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c645c67-da61-4c69-b798-016ded873d36
Sep 30 21:46:54 compute-1 NetworkManager[51724]: <info>  [1759268814.0281] device (tap1c535e54-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:46:54 compute-1 NetworkManager[51724]: <info>  [1759268814.0294] device (tap1c535e54-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:46:54 compute-1 systemd-machined[152783]: New machine qemu-72-instance-0000009b.
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.037 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7855f461-9c7b-4b6f-b8f8-74b43cbe4d83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.038 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c645c67-d1 in ovnmeta-8c645c67-da61-4c69-b798-016ded873d36 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.040 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c645c67-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.040 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[353267e5-479d-4673-b01a-c03ad283c792]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.041 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[91267d95-3568-4cda-95c6-273cfc27087f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 systemd[1]: Started Virtual Machine qemu-72-instance-0000009b.
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.052 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a42c85-1f15-47b4-a8c2-7bdd0ea34a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.076 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5759b022-9a83-4675-a2ec-12d07fa1b7fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.107 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[01885523-5ed8-4325-9d22-8558888a3861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.112 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7621aa9c-e1b2-493b-b84f-a7585c8b4091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 NetworkManager[51724]: <info>  [1759268814.1134] manager: (tap8c645c67-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.147 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a5bead-1940-49e4-87c1-5a88b13b3838]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.150 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4379e1b8-e866-4a63-b4cc-ae0aadbc59d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 NetworkManager[51724]: <info>  [1759268814.1765] device (tap8c645c67-d0): carrier: link connected
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.183 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[be9ebbc9-d1f9-490a-a290-8a2e1cbe7e4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.201 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae272ce-f503-4b2b-9d8f-85da2641bff1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c645c67-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:29:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550984, 'reachable_time': 16784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246099, 'error': None, 'target': 'ovnmeta-8c645c67-da61-4c69-b798-016ded873d36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.216 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fc61cb6e-4c8d-4cdf-9577-33e44212b30d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:29fc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550984, 'tstamp': 550984}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246101, 'error': None, 'target': 'ovnmeta-8c645c67-da61-4c69-b798-016ded873d36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.237 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[81fe6552-78b9-40f8-895e-55479bf88b32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c645c67-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:29:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550984, 'reachable_time': 16784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246106, 'error': None, 'target': 'ovnmeta-8c645c67-da61-4c69-b798-016ded873d36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.269 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b22241-f702-4042-8670-24380bb16444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.325 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e078b940-bb1d-4f75-8c16-ba18c71d8b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.326 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c645c67-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.326 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.327 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c645c67-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:54 compute-1 kernel: tap8c645c67-d0: entered promiscuous mode
Sep 30 21:46:54 compute-1 NetworkManager[51724]: <info>  [1759268814.3295] manager: (tap8c645c67-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.332 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c645c67-d0, col_values=(('external_ids', {'iface-id': '54b8edf3-0ac2-4347-9249-a6a993a815c4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:54 compute-1 ovn_controller[94902]: 2025-09-30T21:46:54Z|00609|binding|INFO|Releasing lport 54b8edf3-0ac2-4347-9249-a6a993a815c4 from this chassis (sb_readonly=0)
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.345 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c645c67-da61-4c69-b798-016ded873d36.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c645c67-da61-4c69-b798-016ded873d36.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.346 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[109edd9c-adfa-4450-9481-04e8d9aa641d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.346 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-8c645c67-da61-4c69-b798-016ded873d36
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/8c645c67-da61-4c69-b798-016ded873d36.pid.haproxy
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 8c645c67-da61-4c69-b798-016ded873d36
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:46:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:54.347 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c645c67-da61-4c69-b798-016ded873d36', 'env', 'PROCESS_TAG=haproxy-8c645c67-da61-4c69-b798-016ded873d36', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8c645c67-da61-4c69-b798-016ded873d36.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.653 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268814.652093, 89b1d81c-69be-444e-8ba0-7bd444fa4d41 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.653 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] VM Started (Lifecycle Event)
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.686 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.689 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268814.653592, 89b1d81c-69be-444e-8ba0-7bd444fa4d41 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.689 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] VM Paused (Lifecycle Event)
Sep 30 21:46:54 compute-1 podman[246140]: 2025-09-30 21:46:54.689713983 +0000 UTC m=+0.048940030 container create a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.698 2 DEBUG nova.compute.manager [req-e31cf59e-3d85-498c-980f-d781ca8d062a req-36ccb4b4-ad54-4ee6-9a78-45b991c9f7a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received event network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.699 2 DEBUG oslo_concurrency.lockutils [req-e31cf59e-3d85-498c-980f-d781ca8d062a req-36ccb4b4-ad54-4ee6-9a78-45b991c9f7a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.699 2 DEBUG oslo_concurrency.lockutils [req-e31cf59e-3d85-498c-980f-d781ca8d062a req-36ccb4b4-ad54-4ee6-9a78-45b991c9f7a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.699 2 DEBUG oslo_concurrency.lockutils [req-e31cf59e-3d85-498c-980f-d781ca8d062a req-36ccb4b4-ad54-4ee6-9a78-45b991c9f7a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.699 2 DEBUG nova.compute.manager [req-e31cf59e-3d85-498c-980f-d781ca8d062a req-36ccb4b4-ad54-4ee6-9a78-45b991c9f7a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Processing event network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.700 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.703 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.705 2 INFO nova.virt.libvirt.driver [-] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Instance spawned successfully.
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.706 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.728 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:54 compute-1 systemd[1]: Started libpod-conmon-a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123.scope.
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.733 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268814.7030652, 89b1d81c-69be-444e-8ba0-7bd444fa4d41 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.734 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] VM Resumed (Lifecycle Event)
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.737 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.737 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.738 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.738 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.739 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.739 2 DEBUG nova.virt.libvirt.driver [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:54 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:46:54 compute-1 podman[246140]: 2025-09-30 21:46:54.668700704 +0000 UTC m=+0.027926771 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:46:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17ebd2030d1f6b4acc4337f5821cc0d16bebcccc620078c545403df443032405/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:46:54 compute-1 podman[246140]: 2025-09-30 21:46:54.777607116 +0000 UTC m=+0.136833183 container init a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.777 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.780 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:46:54 compute-1 podman[246140]: 2025-09-30 21:46:54.78385558 +0000 UTC m=+0.143081637 container start a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:46:54 compute-1 neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36[246156]: [NOTICE]   (246160) : New worker (246162) forked
Sep 30 21:46:54 compute-1 neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36[246156]: [NOTICE]   (246160) : Loading success.
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.824 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.850 2 INFO nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Took 6.01 seconds to spawn the instance on the hypervisor.
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.851 2 DEBUG nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.979 2 INFO nova.compute.manager [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Took 6.67 seconds to build instance.
Sep 30 21:46:54 compute-1 nova_compute[192795]: 2025-09-30 21:46:54.995 2 DEBUG oslo_concurrency.lockutils [None req-a1ca6df4-2048-4b4f-956e-158455649e0c 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:55 compute-1 nova_compute[192795]: 2025-09-30 21:46:55.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:55 compute-1 nova_compute[192795]: 2025-09-30 21:46:55.850 2 DEBUG nova.network.neutron [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Updated VIF entry in instance network info cache for port 1c535e54-4f52-4e80-8bfb-3ee8946adf22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:46:55 compute-1 nova_compute[192795]: 2025-09-30 21:46:55.851 2 DEBUG nova.network.neutron [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Updating instance_info_cache with network_info: [{"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:55 compute-1 nova_compute[192795]: 2025-09-30 21:46:55.881 2 DEBUG oslo_concurrency.lockutils [req-414c7e71-12ea-46f6-8e3c-5cf9b7f348e9 req-866c6b20-7d98-4cad-bfeb-440220670667 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-89b1d81c-69be-444e-8ba0-7bd444fa4d41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:57 compute-1 nova_compute[192795]: 2025-09-30 21:46:57.030 2 DEBUG nova.compute.manager [req-e5afd8bd-1890-4c93-8958-b4adc41380e1 req-7a321d0b-ac83-48a0-8419-c36d532351f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received event network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:57 compute-1 nova_compute[192795]: 2025-09-30 21:46:57.031 2 DEBUG oslo_concurrency.lockutils [req-e5afd8bd-1890-4c93-8958-b4adc41380e1 req-7a321d0b-ac83-48a0-8419-c36d532351f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:57 compute-1 nova_compute[192795]: 2025-09-30 21:46:57.032 2 DEBUG oslo_concurrency.lockutils [req-e5afd8bd-1890-4c93-8958-b4adc41380e1 req-7a321d0b-ac83-48a0-8419-c36d532351f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:57 compute-1 nova_compute[192795]: 2025-09-30 21:46:57.032 2 DEBUG oslo_concurrency.lockutils [req-e5afd8bd-1890-4c93-8958-b4adc41380e1 req-7a321d0b-ac83-48a0-8419-c36d532351f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:57 compute-1 nova_compute[192795]: 2025-09-30 21:46:57.032 2 DEBUG nova.compute.manager [req-e5afd8bd-1890-4c93-8958-b4adc41380e1 req-7a321d0b-ac83-48a0-8419-c36d532351f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] No waiting events found dispatching network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:57 compute-1 nova_compute[192795]: 2025-09-30 21:46:57.032 2 WARNING nova.compute.manager [req-e5afd8bd-1890-4c93-8958-b4adc41380e1 req-7a321d0b-ac83-48a0-8419-c36d532351f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received unexpected event network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 for instance with vm_state active and task_state None.
Sep 30 21:46:57 compute-1 nova_compute[192795]: 2025-09-30 21:46:57.706 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:58 compute-1 nova_compute[192795]: 2025-09-30 21:46:58.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:58 compute-1 nova_compute[192795]: 2025-09-30 21:46:58.248 2 INFO nova.compute.manager [None req-6b19e3b0-aa0d-4adc-ac3c-d9e4bdb6b2a6 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Get console output
Sep 30 21:46:58 compute-1 nova_compute[192795]: 2025-09-30 21:46:58.253 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:46:58 compute-1 nova_compute[192795]: 2025-09-30 21:46:58.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.199 2 DEBUG nova.compute.manager [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.200 2 DEBUG nova.compute.manager [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing instance network info cache due to event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.200 2 DEBUG oslo_concurrency.lockutils [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.200 2 DEBUG oslo_concurrency.lockutils [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.201 2 DEBUG nova.network.neutron [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:59 compute-1 podman[246171]: 2025-09-30 21:46:59.212094578 +0000 UTC m=+0.055527223 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.468 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.469 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.470 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.470 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.471 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.490 2 INFO nova.compute.manager [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Terminating instance
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.511 2 DEBUG nova.compute.manager [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:46:59 compute-1 kernel: tapd7d27cdd-9f (unregistering): left promiscuous mode
Sep 30 21:46:59 compute-1 NetworkManager[51724]: <info>  [1759268819.5354] device (tapd7d27cdd-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.537 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.539 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.539 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.540 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.540 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 ovn_controller[94902]: 2025-09-30T21:46:59Z|00610|binding|INFO|Releasing lport d7d27cdd-9f0b-467f-901a-08b4834d1496 from this chassis (sb_readonly=0)
Sep 30 21:46:59 compute-1 ovn_controller[94902]: 2025-09-30T21:46:59Z|00611|binding|INFO|Setting lport d7d27cdd-9f0b-467f-901a-08b4834d1496 down in Southbound
Sep 30 21:46:59 compute-1 ovn_controller[94902]: 2025-09-30T21:46:59Z|00612|binding|INFO|Removing iface tapd7d27cdd-9f ovn-installed in OVS
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.553 2 INFO nova.compute.manager [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Terminating instance
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.557 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:ba:7b 10.100.0.4'], port_security=['fa:16:3e:0d:ba:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'cf7d6f50-5bb5-4e27-9140-45f441e5642f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5da0e449-d17e-4f7d-8049-f4559cb24f52, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=d7d27cdd-9f0b-467f-901a-08b4834d1496) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.559 103861 INFO neutron.agent.ovn.metadata.agent [-] Port d7d27cdd-9f0b-467f-901a-08b4834d1496 in datapath 5bda010c-f47e-4b74-9f9e-0682da4daba2 unbound from our chassis
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.561 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bda010c-f47e-4b74-9f9e-0682da4daba2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.562 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5f000789-ee16-4117-bd1f-b65b725af514]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.563 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 namespace which is not needed anymore
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.567 2 DEBUG nova.compute.manager [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 kernel: tap1c535e54-4f (unregistering): left promiscuous mode
Sep 30 21:46:59 compute-1 NetworkManager[51724]: <info>  [1759268819.5878] device (tap1c535e54-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:46:59 compute-1 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000098.scope: Deactivated successfully.
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000098.scope: Consumed 2.163s CPU time.
Sep 30 21:46:59 compute-1 systemd-machined[152783]: Machine qemu-71-instance-00000098 terminated.
Sep 30 21:46:59 compute-1 ovn_controller[94902]: 2025-09-30T21:46:59Z|00613|binding|INFO|Releasing lport 1c535e54-4f52-4e80-8bfb-3ee8946adf22 from this chassis (sb_readonly=0)
Sep 30 21:46:59 compute-1 ovn_controller[94902]: 2025-09-30T21:46:59Z|00614|binding|INFO|Setting lport 1c535e54-4f52-4e80-8bfb-3ee8946adf22 down in Southbound
Sep 30 21:46:59 compute-1 ovn_controller[94902]: 2025-09-30T21:46:59Z|00615|binding|INFO|Removing iface tap1c535e54-4f ovn-installed in OVS
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.612 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:55:03 10.100.0.13'], port_security=['fa:16:3e:57:55:03 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-431926733', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '89b1d81c-69be-444e-8ba0-7bd444fa4d41', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c645c67-da61-4c69-b798-016ded873d36', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-431926733', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '9', 'neutron:security_group_ids': '24f12cb1-976c-4b06-8e76-7f11c490f264', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.205', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43321272-a507-4fd9-b66d-912e773448bc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=1c535e54-4f52-4e80-8bfb-3ee8946adf22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Sep 30 21:46:59 compute-1 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009b.scope: Consumed 5.415s CPU time.
Sep 30 21:46:59 compute-1 systemd-machined[152783]: Machine qemu-72-instance-0000009b terminated.
Sep 30 21:46:59 compute-1 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[245954]: [NOTICE]   (245958) : haproxy version is 2.8.14-c23fe91
Sep 30 21:46:59 compute-1 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[245954]: [NOTICE]   (245958) : path to executable is /usr/sbin/haproxy
Sep 30 21:46:59 compute-1 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[245954]: [WARNING]  (245958) : Exiting Master process...
Sep 30 21:46:59 compute-1 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[245954]: [ALERT]    (245958) : Current worker (245960) exited with code 143 (Terminated)
Sep 30 21:46:59 compute-1 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[245954]: [WARNING]  (245958) : All workers exited. Exiting... (0)
Sep 30 21:46:59 compute-1 systemd[1]: libpod-4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f.scope: Deactivated successfully.
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:59 compute-1 podman[246222]: 2025-09-30 21:46:59.696494699 +0000 UTC m=+0.043755172 container died 4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:46:59 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f-userdata-shm.mount: Deactivated successfully.
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.730 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:46:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-d455adef5912111f9252200bb03b83588d7f58769012e534864a5db006711397-merged.mount: Deactivated successfully.
Sep 30 21:46:59 compute-1 podman[246222]: 2025-09-30 21:46:59.74464605 +0000 UTC m=+0.091906513 container cleanup 4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:46:59 compute-1 systemd[1]: libpod-conmon-4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f.scope: Deactivated successfully.
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.805 2 INFO nova.virt.libvirt.driver [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Instance destroyed successfully.
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.806 2 DEBUG nova.objects.instance [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid c1ffcb24-6b20-4ed8-8287-65fb4e98d88c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:59 compute-1 podman[246260]: 2025-09-30 21:46:59.810652641 +0000 UTC m=+0.044377978 container remove 4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.818 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f26a89-5d5c-487c-aefd-931f8c8af4d2]: (4, ('Tue Sep 30 09:46:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 (4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f)\n4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f\nTue Sep 30 09:46:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 (4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f)\n4b4175c6ffce4fdad5e56feaa9b37555e4b8b34cf826a0a9e6e10bda31d7673f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.821 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4c02d5-5ca2-4333-9349-264b56e59afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.822 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda010c-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.824 2 DEBUG nova.virt.libvirt.vif [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-179264408',display_name='tempest-TestNetworkAdvancedServerOps-server-179264408',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-179264408',id=152,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOjZFWrOkGLtM22QkbWpGIMaBp+BaIFIIqX3JnKMPx79D7bs+sVkeOWPP9lZzJjeCe2zeve1vj7ZAmUVopE269mNakJeQ6oTVT41jDINH3f5N8CxQSQQns3MXA4tC9JjTQ==',key_name='tempest-TestNetworkAdvancedServerOps-144739839',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:46:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-xa5oqg6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:46:53Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c1ffcb24-6b20-4ed8-8287-65fb4e98d88c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.824 2 DEBUG nova.network.os_vif_util [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.825 2 DEBUG nova.network.os_vif_util [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.825 2 DEBUG os_vif [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d27cdd-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:46:59 compute-1 kernel: tap5bda010c-f0: left promiscuous mode
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.850 2 INFO os_vif [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f')
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.851 2 INFO nova.virt.libvirt.driver [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Deleting instance files /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c_del
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.852 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[75f8de55-f297-4c8e-85a1-616df4cd0feb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.852 2 INFO nova.virt.libvirt.driver [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Deletion of /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c_del complete
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.858 2 INFO nova.virt.libvirt.driver [-] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Instance destroyed successfully.
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.859 2 DEBUG nova.objects.instance [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'resources' on Instance uuid 89b1d81c-69be-444e-8ba0-7bd444fa4d41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.887 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9872fb-ab3e-49e4-bfef-1b36a5d700ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.889 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[41aeb33e-fc96-4675-83e2-ac9c94582177]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.900 2 DEBUG nova.virt.libvirt.vif [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-496048242',display_name='tempest-TestNetworkBasicOps-server-496048242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-496048242',id=155,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKDNjO5Jo8chzIkSZNaqeGTK+iihzCOepiRiJv/9rD8Rq/ByvlO+t5HPHK4rXiD0koDg+6gj5n6aypotfN5IfEHHv7vHuusBTBXY8Lz1oyWQe//C/2+Bp2ImRjH6jhcTww==',key_name='tempest-TestNetworkBasicOps-1175594242',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:46:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-u9s8yu4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:46:54Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=89b1d81c-69be-444e-8ba0-7bd444fa4d41,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.901 2 DEBUG nova.network.os_vif_util [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "address": "fa:16:3e:57:55:03", "network": {"id": "8c645c67-da61-4c69-b798-016ded873d36", "bridge": "br-int", "label": "tempest-network-smoke--1216319429", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c535e54-4f", "ovs_interfaceid": "1c535e54-4f52-4e80-8bfb-3ee8946adf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.901 2 DEBUG nova.network.os_vif_util [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:55:03,bridge_name='br-int',has_traffic_filtering=True,id=1c535e54-4f52-4e80-8bfb-3ee8946adf22,network=Network(8c645c67-da61-4c69-b798-016ded873d36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c535e54-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.901 2 DEBUG os_vif [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:55:03,bridge_name='br-int',has_traffic_filtering=True,id=1c535e54-4f52-4e80-8bfb-3ee8946adf22,network=Network(8c645c67-da61-4c69-b798-016ded873d36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c535e54-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.903 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c535e54-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.903 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2b8f9c51-49fd-4054-8864-76053974d491]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550347, 'reachable_time': 28993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246311, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.907 2 INFO os_vif [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:55:03,bridge_name='br-int',has_traffic_filtering=True,id=1c535e54-4f52-4e80-8bfb-3ee8946adf22,network=Network(8c645c67-da61-4c69-b798-016ded873d36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c535e54-4f')
Sep 30 21:46:59 compute-1 systemd[1]: run-netns-ovnmeta\x2d5bda010c\x2df47e\x2d4b74\x2d9f9e\x2d0682da4daba2.mount: Deactivated successfully.
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.908 2 INFO nova.virt.libvirt.driver [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Deleting instance files /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41_del
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.906 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.907 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[df8d4e0f-fbf1-4702-9313-363740694e41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.908 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 1c535e54-4f52-4e80-8bfb-3ee8946adf22 in datapath 8c645c67-da61-4c69-b798-016ded873d36 unbound from our chassis
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.908 2 INFO nova.virt.libvirt.driver [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Deletion of /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41_del complete
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.909 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c645c67-da61-4c69-b798-016ded873d36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.910 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e07545-06d7-4afb-96f6-443ded784367]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:46:59.910 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c645c67-da61-4c69-b798-016ded873d36 namespace which is not needed anymore
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.986 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000009b, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/89b1d81c-69be-444e-8ba0-7bd444fa4d41/disk
Sep 30 21:46:59 compute-1 nova_compute[192795]: 2025-09-30 21:46:59.995 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000098, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk
Sep 30 21:47:00 compute-1 neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36[246156]: [NOTICE]   (246160) : haproxy version is 2.8.14-c23fe91
Sep 30 21:47:00 compute-1 neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36[246156]: [NOTICE]   (246160) : path to executable is /usr/sbin/haproxy
Sep 30 21:47:00 compute-1 neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36[246156]: [WARNING]  (246160) : Exiting Master process...
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.032 2 INFO nova.compute.manager [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Took 0.52 seconds to destroy the instance on the hypervisor.
Sep 30 21:47:00 compute-1 neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36[246156]: [ALERT]    (246160) : Current worker (246162) exited with code 143 (Terminated)
Sep 30 21:47:00 compute-1 neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36[246156]: [WARNING]  (246160) : All workers exited. Exiting... (0)
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.033 2 DEBUG oslo.service.loopingcall [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.033 2 DEBUG nova.compute.manager [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.033 2 DEBUG nova.network.neutron [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:47:00 compute-1 systemd[1]: libpod-a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123.scope: Deactivated successfully.
Sep 30 21:47:00 compute-1 podman[246329]: 2025-09-30 21:47:00.043347391 +0000 UTC m=+0.050253233 container died a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.055 2 INFO nova.compute.manager [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Took 0.49 seconds to destroy the instance on the hypervisor.
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.056 2 DEBUG oslo.service.loopingcall [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.056 2 DEBUG nova.compute.manager [-] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.056 2 DEBUG nova.network.neutron [-] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:47:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123-userdata-shm.mount: Deactivated successfully.
Sep 30 21:47:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-17ebd2030d1f6b4acc4337f5821cc0d16bebcccc620078c545403df443032405-merged.mount: Deactivated successfully.
Sep 30 21:47:00 compute-1 podman[246329]: 2025-09-30 21:47:00.080363906 +0000 UTC m=+0.087269738 container cleanup a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:47:00 compute-1 systemd[1]: libpod-conmon-a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123.scope: Deactivated successfully.
Sep 30 21:47:00 compute-1 podman[246358]: 2025-09-30 21:47:00.141091817 +0000 UTC m=+0.038876832 container remove a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.146 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[27bb3604-5352-4a0f-86fb-deb4b0bca517]: (4, ('Tue Sep 30 09:46:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36 (a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123)\na2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123\nTue Sep 30 09:47:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8c645c67-da61-4c69-b798-016ded873d36 (a2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123)\na2e9355465a5c50d3684c35534d5ea38b536aa92e29f414db0b5e920c12af123\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.147 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e60a4111-b89d-410c-9549-25e4402f75b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.148 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c645c67-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.149 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:47:00 compute-1 kernel: tap8c645c67-d0: left promiscuous mode
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.152 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5662MB free_disk=73.27126693725586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.152 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.152 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.163 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d85a5bee-7f48-444f-a494-ed4e4ac2bc9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.196 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e6954559-aa59-49c3-8fd3-fcd9e1382301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.197 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2bac6018-9589-4049-b60d-5f9c13102c66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.213 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3225cc-bd61-454b-b812-8c7431f921fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550977, 'reachable_time': 15537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246374, 'error': None, 'target': 'ovnmeta-8c645c67-da61-4c69-b798-016ded873d36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.215 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c645c67-da61-4c69-b798-016ded873d36 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:47:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:00.215 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a828feb2-89e3-4d25-aad2-d7b5c80655fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.339 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance c1ffcb24-6b20-4ed8-8287-65fb4e98d88c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.339 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 89b1d81c-69be-444e-8ba0-7bd444fa4d41 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.340 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.340 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.360 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.381 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.381 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.423 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.446 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.507 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.523 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.546 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.546 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d8c645c67\x2dda61\x2d4c69\x2db798\x2d016ded873d36.mount: Deactivated successfully.
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.971 2 DEBUG nova.compute.manager [req-8ee9aa4c-d5fa-4ed6-b3db-01ddcfa3570e req-93690ed9-afaf-4609-ae43-7022abdc7f40 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.971 2 DEBUG oslo_concurrency.lockutils [req-8ee9aa4c-d5fa-4ed6-b3db-01ddcfa3570e req-93690ed9-afaf-4609-ae43-7022abdc7f40 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.971 2 DEBUG oslo_concurrency.lockutils [req-8ee9aa4c-d5fa-4ed6-b3db-01ddcfa3570e req-93690ed9-afaf-4609-ae43-7022abdc7f40 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.972 2 DEBUG oslo_concurrency.lockutils [req-8ee9aa4c-d5fa-4ed6-b3db-01ddcfa3570e req-93690ed9-afaf-4609-ae43-7022abdc7f40 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.972 2 DEBUG nova.compute.manager [req-8ee9aa4c-d5fa-4ed6-b3db-01ddcfa3570e req-93690ed9-afaf-4609-ae43-7022abdc7f40 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:00 compute-1 nova_compute[192795]: 2025-09-30 21:47:00.972 2 DEBUG nova.compute.manager [req-8ee9aa4c-d5fa-4ed6-b3db-01ddcfa3570e req-93690ed9-afaf-4609-ae43-7022abdc7f40 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.499 2 DEBUG nova.compute.manager [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received event network-vif-unplugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.499 2 DEBUG oslo_concurrency.lockutils [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.500 2 DEBUG oslo_concurrency.lockutils [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.500 2 DEBUG oslo_concurrency.lockutils [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.500 2 DEBUG nova.compute.manager [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] No waiting events found dispatching network-vif-unplugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.500 2 DEBUG nova.compute.manager [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received event network-vif-unplugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.501 2 DEBUG nova.compute.manager [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received event network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.501 2 DEBUG oslo_concurrency.lockutils [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.501 2 DEBUG oslo_concurrency.lockutils [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.502 2 DEBUG oslo_concurrency.lockutils [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.502 2 DEBUG nova.compute.manager [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] No waiting events found dispatching network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.502 2 WARNING nova.compute.manager [req-3442a853-6199-4393-bd9d-bb1ff1b8fdaa req-d69425fd-9f55-4b70-81bb-eed5178ef2b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Received unexpected event network-vif-plugged-1c535e54-4f52-4e80-8bfb-3ee8946adf22 for instance with vm_state active and task_state deleting.
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.546 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.546 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.570 2 DEBUG nova.network.neutron [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updated VIF entry in instance network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.570 2 DEBUG nova.network.neutron [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.587 2 DEBUG nova.network.neutron [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.611 2 DEBUG oslo_concurrency.lockutils [req-fa626f10-bca6-4366-bd81-35076c67b30c req-5ffcf315-11a0-4623-b293-2fd93b2f2ba8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:01 compute-1 nova_compute[192795]: 2025-09-30 21:47:01.632 2 INFO nova.compute.manager [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Took 1.60 seconds to deallocate network for instance.
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.288 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.289 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.374 2 DEBUG nova.compute.provider_tree [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.397 2 DEBUG nova.scheduler.client.report [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.431 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.467 2 INFO nova.scheduler.client.report [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocations for instance c1ffcb24-6b20-4ed8-8287-65fb4e98d88c
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.546 2 DEBUG nova.network.neutron [-] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.567 2 INFO nova.compute.manager [-] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Took 2.51 seconds to deallocate network for instance.
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.581 2 DEBUG oslo_concurrency.lockutils [None req-0606ddf5-4ae5-4fc8-ba8a-9b721e67def8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.685 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.686 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.726 2 DEBUG nova.compute.provider_tree [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.754 2 DEBUG nova.scheduler.client.report [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.786 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.821 2 INFO nova.scheduler.client.report [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Deleted allocations for instance 89b1d81c-69be-444e-8ba0-7bd444fa4d41
Sep 30 21:47:02 compute-1 nova_compute[192795]: 2025-09-30 21:47:02.908 2 DEBUG oslo_concurrency.lockutils [None req-6534900d-1e73-4137-a7e8-19612728ec5a 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "89b1d81c-69be-444e-8ba0-7bd444fa4d41" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.078 2 DEBUG nova.compute.manager [req-1a79dc32-b892-4246-b96a-6b23c8bccef7 req-ec6ece1d-3882-4f79-9c0a-1a165edfb3d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.078 2 DEBUG oslo_concurrency.lockutils [req-1a79dc32-b892-4246-b96a-6b23c8bccef7 req-ec6ece1d-3882-4f79-9c0a-1a165edfb3d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.078 2 DEBUG oslo_concurrency.lockutils [req-1a79dc32-b892-4246-b96a-6b23c8bccef7 req-ec6ece1d-3882-4f79-9c0a-1a165edfb3d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.079 2 DEBUG oslo_concurrency.lockutils [req-1a79dc32-b892-4246-b96a-6b23c8bccef7 req-ec6ece1d-3882-4f79-9c0a-1a165edfb3d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.079 2 DEBUG nova.compute.manager [req-1a79dc32-b892-4246-b96a-6b23c8bccef7 req-ec6ece1d-3882-4f79-9c0a-1a165edfb3d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.079 2 WARNING nova.compute.manager [req-1a79dc32-b892-4246-b96a-6b23c8bccef7 req-ec6ece1d-3882-4f79-9c0a-1a165edfb3d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received unexpected event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with vm_state deleted and task_state None.
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.079 2 DEBUG nova.compute.manager [req-1a79dc32-b892-4246-b96a-6b23c8bccef7 req-ec6ece1d-3882-4f79-9c0a-1a165edfb3d7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-deleted-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:03 compute-1 nova_compute[192795]: 2025-09-30 21:47:03.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:04 compute-1 nova_compute[192795]: 2025-09-30 21:47:04.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.107 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.108 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.125 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.290 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.290 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.297 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.298 2 INFO nova.compute.claims [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.438 2 DEBUG nova.compute.provider_tree [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.459 2 DEBUG nova.scheduler.client.report [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.485 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.486 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.535 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.536 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.554 2 INFO nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.573 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:05 compute-1 nova_compute[192795]: 2025-09-30 21:47:05.736 2 DEBUG nova.policy [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.911 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.913 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.914 2 INFO nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Creating image(s)
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.915 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.916 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.917 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:07 compute-1 nova_compute[192795]: 2025-09-30 21:47:07.945 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.005 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.007 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.008 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.034 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.091 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.092 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.126 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.127 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.128 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.186 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.187 2 DEBUG nova.virt.disk.api [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.188 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.254 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.255 2 DEBUG nova.virt.disk.api [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.255 2 DEBUG nova.objects.instance [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 92b445f9-0995-4201-aac6-9a8bd8c4a418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.295 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.296 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Ensure instance console log exists: /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.297 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.297 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.297 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:08 compute-1 nova_compute[192795]: 2025-09-30 21:47:08.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:09 compute-1 podman[246394]: 2025-09-30 21:47:09.217254203 +0000 UTC m=+0.047589247 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:47:09 compute-1 podman[246392]: 2025-09-30 21:47:09.225053056 +0000 UTC m=+0.065263474 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:47:09 compute-1 podman[246393]: 2025-09-30 21:47:09.31709558 +0000 UTC m=+0.141950549 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:47:09 compute-1 nova_compute[192795]: 2025-09-30 21:47:09.617 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Successfully created port: f604c930-42c7-4f2f-b4ba-f7f70ca08cfb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:47:09 compute-1 nova_compute[192795]: 2025-09-30 21:47:09.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:10 compute-1 nova_compute[192795]: 2025-09-30 21:47:10.635 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Successfully created port: 71b40126-7c96-4991-bdc4-716828a750fd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:47:11 compute-1 nova_compute[192795]: 2025-09-30 21:47:11.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.279 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Successfully updated port: f604c930-42c7-4f2f-b4ba-f7f70ca08cfb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.489 2 DEBUG nova.compute.manager [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-changed-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.490 2 DEBUG nova.compute.manager [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing instance network info cache due to event network-changed-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.490 2 DEBUG oslo_concurrency.lockutils [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.491 2 DEBUG oslo_concurrency.lockutils [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.491 2 DEBUG nova.network.neutron [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing network info cache for port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.797 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.798 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:47:12 compute-1 nova_compute[192795]: 2025-09-30 21:47:12.921 2 DEBUG nova.network.neutron [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:47:13 compute-1 nova_compute[192795]: 2025-09-30 21:47:13.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.027 2 DEBUG nova.network.neutron [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.047 2 DEBUG oslo_concurrency.lockutils [req-fcaf8d6b-b498-46ca-bf60-e55a14e8e140 req-46508530-e3b1-4432-b3c1-d34b74ce5821 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.115 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Successfully updated port: 71b40126-7c96-4991-bdc4-716828a750fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.133 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.133 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.134 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:47:14 compute-1 podman[246458]: 2025-09-30 21:47:14.223229707 +0000 UTC m=+0.065773046 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.348 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.798 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268819.7982497, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.799 2 INFO nova.compute.manager [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Stopped (Lifecycle Event)
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.825 2 DEBUG nova.compute.manager [None req-eddddefe-0117-4f7e-97b6-02032731336c - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.854 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268819.8517156, 89b1d81c-69be-444e-8ba0-7bd444fa4d41 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.855 2 INFO nova.compute.manager [-] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] VM Stopped (Lifecycle Event)
Sep 30 21:47:14 compute-1 nova_compute[192795]: 2025-09-30 21:47:14.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:15 compute-1 nova_compute[192795]: 2025-09-30 21:47:15.473 2 DEBUG nova.compute.manager [None req-1270e015-1666-46c4-80f5-a0eee43a069b - - - - - -] [instance: 89b1d81c-69be-444e-8ba0-7bd444fa4d41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:16 compute-1 nova_compute[192795]: 2025-09-30 21:47:16.052 2 DEBUG nova.compute.manager [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-changed-71b40126-7c96-4991-bdc4-716828a750fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:16 compute-1 nova_compute[192795]: 2025-09-30 21:47:16.053 2 DEBUG nova.compute.manager [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing instance network info cache due to event network-changed-71b40126-7c96-4991-bdc4-716828a750fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:16 compute-1 nova_compute[192795]: 2025-09-30 21:47:16.053 2 DEBUG oslo_concurrency.lockutils [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:18 compute-1 nova_compute[192795]: 2025-09-30 21:47:18.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:18 compute-1 nova_compute[192795]: 2025-09-30 21:47:18.967 2 DEBUG nova.network.neutron [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updating instance_info_cache with network_info: [{"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:18 compute-1 nova_compute[192795]: 2025-09-30 21:47:18.991 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:18 compute-1 nova_compute[192795]: 2025-09-30 21:47:18.992 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Instance network_info: |[{"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:47:18 compute-1 nova_compute[192795]: 2025-09-30 21:47:18.992 2 DEBUG oslo_concurrency.lockutils [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:18 compute-1 nova_compute[192795]: 2025-09-30 21:47:18.992 2 DEBUG nova.network.neutron [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing network info cache for port 71b40126-7c96-4991-bdc4-716828a750fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:18 compute-1 nova_compute[192795]: 2025-09-30 21:47:18.997 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Start _get_guest_xml network_info=[{"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.000 2 WARNING nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.004 2 DEBUG nova.virt.libvirt.host [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.004 2 DEBUG nova.virt.libvirt.host [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.008 2 DEBUG nova.virt.libvirt.host [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.008 2 DEBUG nova.virt.libvirt.host [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.009 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.010 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.010 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.010 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.011 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.011 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.011 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.011 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.011 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.012 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.012 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.012 2 DEBUG nova.virt.hardware [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.015 2 DEBUG nova.virt.libvirt.vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1286955007',display_name='tempest-TestGettingAddress-server-1286955007',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1286955007',id=157,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNizAru+rn1r5m8hhu4KyrTgI1anlmS25psFBq3L9HEhvkl0+I02Y99PbperiPr/frQcIa5uK1vOoCOrssf3A2v1wEzdcs7e2N8tU8ip/c+FMogBYdqYv4HPV/Z7m9ySQ==',key_name='tempest-TestGettingAddress-1032720980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-2h3yv1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=92b445f9-0995-4201-aac6-9a8bd8c4a418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.016 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.016 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:64:66,bridge_name='br-int',has_traffic_filtering=True,id=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb,network=Network(f2fdd4b4-9613-49d9-a773-931de49e44b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf604c930-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.017 2 DEBUG nova.virt.libvirt.vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1286955007',display_name='tempest-TestGettingAddress-server-1286955007',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1286955007',id=157,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNizAru+rn1r5m8hhu4KyrTgI1anlmS25psFBq3L9HEhvkl0+I02Y99PbperiPr/frQcIa5uK1vOoCOrssf3A2v1wEzdcs7e2N8tU8ip/c+FMogBYdqYv4HPV/Z7m9ySQ==',key_name='tempest-TestGettingAddress-1032720980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-2h3yv1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=92b445f9-0995-4201-aac6-9a8bd8c4a418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.017 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.018 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:2a:80,bridge_name='br-int',has_traffic_filtering=True,id=71b40126-7c96-4991-bdc4-716828a750fd,network=Network(d22f103a-1a95-4031-ae6e-c474eae9834e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b40126-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.019 2 DEBUG nova.objects.instance [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 92b445f9-0995-4201-aac6-9a8bd8c4a418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.039 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <uuid>92b445f9-0995-4201-aac6-9a8bd8c4a418</uuid>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <name>instance-0000009d</name>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <nova:name>tempest-TestGettingAddress-server-1286955007</nova:name>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:47:19</nova:creationTime>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:port uuid="f604c930-42c7-4f2f-b4ba-f7f70ca08cfb">
Sep 30 21:47:19 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         <nova:port uuid="71b40126-7c96-4991-bdc4-716828a750fd">
Sep 30 21:47:19 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe5b:2a80" ipVersion="6"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe5b:2a80" ipVersion="6"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <system>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <entry name="serial">92b445f9-0995-4201-aac6-9a8bd8c4a418</entry>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <entry name="uuid">92b445f9-0995-4201-aac6-9a8bd8c4a418</entry>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </system>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <os>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   </os>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <features>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   </features>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk.config"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:4e:64:66"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <target dev="tapf604c930-42"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:5b:2a:80"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <target dev="tap71b40126-7c"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/console.log" append="off"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <video>
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </video>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:47:19 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:47:19 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:47:19 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:47:19 compute-1 nova_compute[192795]: </domain>
Sep 30 21:47:19 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.040 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Preparing to wait for external event network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.041 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.041 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.041 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.041 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Preparing to wait for external event network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.042 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.042 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.042 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.043 2 DEBUG nova.virt.libvirt.vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1286955007',display_name='tempest-TestGettingAddress-server-1286955007',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1286955007',id=157,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNizAru+rn1r5m8hhu4KyrTgI1anlmS25psFBq3L9HEhvkl0+I02Y99PbperiPr/frQcIa5uK1vOoCOrssf3A2v1wEzdcs7e2N8tU8ip/c+FMogBYdqYv4HPV/Z7m9ySQ==',key_name='tempest-TestGettingAddress-1032720980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-2h3yv1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=92b445f9-0995-4201-aac6-9a8bd8c4a418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.043 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.043 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:64:66,bridge_name='br-int',has_traffic_filtering=True,id=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb,network=Network(f2fdd4b4-9613-49d9-a773-931de49e44b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf604c930-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.044 2 DEBUG os_vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:64:66,bridge_name='br-int',has_traffic_filtering=True,id=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb,network=Network(f2fdd4b4-9613-49d9-a773-931de49e44b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf604c930-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.045 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.045 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.048 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf604c930-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.048 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf604c930-42, col_values=(('external_ids', {'iface-id': 'f604c930-42c7-4f2f-b4ba-f7f70ca08cfb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:64:66', 'vm-uuid': '92b445f9-0995-4201-aac6-9a8bd8c4a418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 NetworkManager[51724]: <info>  [1759268839.0515] manager: (tapf604c930-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.056 2 INFO os_vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:64:66,bridge_name='br-int',has_traffic_filtering=True,id=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb,network=Network(f2fdd4b4-9613-49d9-a773-931de49e44b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf604c930-42')
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.057 2 DEBUG nova.virt.libvirt.vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1286955007',display_name='tempest-TestGettingAddress-server-1286955007',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1286955007',id=157,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNizAru+rn1r5m8hhu4KyrTgI1anlmS25psFBq3L9HEhvkl0+I02Y99PbperiPr/frQcIa5uK1vOoCOrssf3A2v1wEzdcs7e2N8tU8ip/c+FMogBYdqYv4HPV/Z7m9ySQ==',key_name='tempest-TestGettingAddress-1032720980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-2h3yv1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=92b445f9-0995-4201-aac6-9a8bd8c4a418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.057 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.059 2 DEBUG nova.network.os_vif_util [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:2a:80,bridge_name='br-int',has_traffic_filtering=True,id=71b40126-7c96-4991-bdc4-716828a750fd,network=Network(d22f103a-1a95-4031-ae6e-c474eae9834e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b40126-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.059 2 DEBUG os_vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:2a:80,bridge_name='br-int',has_traffic_filtering=True,id=71b40126-7c96-4991-bdc4-716828a750fd,network=Network(d22f103a-1a95-4031-ae6e-c474eae9834e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b40126-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71b40126-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71b40126-7c, col_values=(('external_ids', {'iface-id': '71b40126-7c96-4991-bdc4-716828a750fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:2a:80', 'vm-uuid': '92b445f9-0995-4201-aac6-9a8bd8c4a418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 NetworkManager[51724]: <info>  [1759268839.0655] manager: (tap71b40126-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.070 2 INFO os_vif [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:2a:80,bridge_name='br-int',has_traffic_filtering=True,id=71b40126-7c96-4991-bdc4-716828a750fd,network=Network(d22f103a-1a95-4031-ae6e-c474eae9834e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b40126-7c')
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.125 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.125 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.126 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:4e:64:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.126 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:5b:2a:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.126 2 INFO nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Using config drive
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.787 2 INFO nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Creating config drive at /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk.config
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.791 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3yqchzld execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.913 2 DEBUG oslo_concurrency.processutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3yqchzld" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:19 compute-1 NetworkManager[51724]: <info>  [1759268839.9695] manager: (tapf604c930-42): new Tun device (/org/freedesktop/NetworkManager/Devices/307)
Sep 30 21:47:19 compute-1 kernel: tapf604c930-42: entered promiscuous mode
Sep 30 21:47:19 compute-1 ovn_controller[94902]: 2025-09-30T21:47:19Z|00616|binding|INFO|Claiming lport f604c930-42c7-4f2f-b4ba-f7f70ca08cfb for this chassis.
Sep 30 21:47:19 compute-1 ovn_controller[94902]: 2025-09-30T21:47:19Z|00617|binding|INFO|f604c930-42c7-4f2f-b4ba-f7f70ca08cfb: Claiming fa:16:3e:4e:64:66 10.100.0.9
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 kernel: tap71b40126-7c: entered promiscuous mode
Sep 30 21:47:19 compute-1 NetworkManager[51724]: <info>  [1759268839.9869] manager: (tap71b40126-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/308)
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 nova_compute[192795]: 2025-09-30 21:47:19.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-1 ovn_controller[94902]: 2025-09-30T21:47:19Z|00618|if_status|INFO|Not updating pb chassis for 71b40126-7c96-4991-bdc4-716828a750fd now as sb is readonly
Sep 30 21:47:19 compute-1 NetworkManager[51724]: <info>  [1759268839.9930] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Sep 30 21:47:19 compute-1 NetworkManager[51724]: <info>  [1759268839.9938] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.000 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:64:66 10.100.0.9'], port_security=['fa:16:3e:4e:64:66 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '92b445f9-0995-4201-aac6-9a8bd8c4a418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1d7fb313-89ed-47f5-8144-9e0ec910522f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb9a937-bd5b-4678-a5dd-7e128653bbf0, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.001 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb in datapath f2fdd4b4-9613-49d9-a773-931de49e44b7 bound to our chassis
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.003 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2fdd4b4-9613-49d9-a773-931de49e44b7
Sep 30 21:47:20 compute-1 systemd-udevd[246502]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:47:20 compute-1 systemd-udevd[246501]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.015 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[831d8d67-e87a-4adc-a9ca-afcbf2cbeca2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.017 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2fdd4b4-91 in ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:47:20 compute-1 NetworkManager[51724]: <info>  [1759268840.0189] device (tapf604c930-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:47:20 compute-1 NetworkManager[51724]: <info>  [1759268840.0198] device (tap71b40126-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:47:20 compute-1 NetworkManager[51724]: <info>  [1759268840.0206] device (tapf604c930-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:47:20 compute-1 NetworkManager[51724]: <info>  [1759268840.0211] device (tap71b40126-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.019 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2fdd4b4-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.019 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[000f40bd-45b9-4716-ba77-46a97715a14e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.022 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[af4f091b-3d81-4b41-904e-02e29ea722c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.040 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[8602b7d4-725c-4fbc-b5cd-fa5ffcc3c814]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 systemd-machined[152783]: New machine qemu-73-instance-0000009d.
Sep 30 21:47:20 compute-1 systemd[1]: Started Virtual Machine qemu-73-instance-0000009d.
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.076 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[34da4ab1-59d1-434b-9131-676ba36d1b28]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.109 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[d69fa898-d0bc-4f15-bd31-03e8b2bbdefb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 NetworkManager[51724]: <info>  [1759268840.1340] manager: (tapf2fdd4b4-90): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.133 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[54457787-5d7e-43db-b984-c39b8a930438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.174 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[76e9dd31-b874-4b4c-b8ef-8f2a8c64d3f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_controller[94902]: 2025-09-30T21:47:20Z|00619|binding|INFO|Claiming lport 71b40126-7c96-4991-bdc4-716828a750fd for this chassis.
Sep 30 21:47:20 compute-1 ovn_controller[94902]: 2025-09-30T21:47:20Z|00620|binding|INFO|71b40126-7c96-4991-bdc4-716828a750fd: Claiming fa:16:3e:5b:2a:80 2001:db8:0:1:f816:3eff:fe5b:2a80 2001:db8::f816:3eff:fe5b:2a80
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.182 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[98468fbe-a1eb-45eb-9dbd-d45bef0aaa68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_controller[94902]: 2025-09-30T21:47:20Z|00621|binding|INFO|Setting lport f604c930-42c7-4f2f-b4ba-f7f70ca08cfb ovn-installed in OVS
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:20 compute-1 ovn_controller[94902]: 2025-09-30T21:47:20Z|00622|binding|INFO|Setting lport f604c930-42c7-4f2f-b4ba-f7f70ca08cfb up in Southbound
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.203 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:2a:80 2001:db8:0:1:f816:3eff:fe5b:2a80 2001:db8::f816:3eff:fe5b:2a80'], port_security=['fa:16:3e:5b:2a:80 2001:db8:0:1:f816:3eff:fe5b:2a80 2001:db8::f816:3eff:fe5b:2a80'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5b:2a80/64 2001:db8::f816:3eff:fe5b:2a80/64', 'neutron:device_id': '92b445f9-0995-4201-aac6-9a8bd8c4a418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1d7fb313-89ed-47f5-8144-9e0ec910522f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9942446e-c20f-4a7d-bedc-ac08b4f4b886, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=71b40126-7c96-4991-bdc4-716828a750fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:20 compute-1 NetworkManager[51724]: <info>  [1759268840.2131] device (tapf2fdd4b4-90): carrier: link connected
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.219 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ef894219-ee34-4ce5-bda5-1a172c62ce8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_controller[94902]: 2025-09-30T21:47:20Z|00623|binding|INFO|Setting lport 71b40126-7c96-4991-bdc4-716828a750fd ovn-installed in OVS
Sep 30 21:47:20 compute-1 ovn_controller[94902]: 2025-09-30T21:47:20Z|00624|binding|INFO|Setting lport 71b40126-7c96-4991-bdc4-716828a750fd up in Southbound
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.237 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9b23b90a-86c7-4756-80ee-157ff9d64246]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fdd4b4-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:57:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553588, 'reachable_time': 42264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246538, 'error': None, 'target': 'ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.253 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0d927cf9-cdd8-4844-83c5-9fb82d37f6dd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:5772'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 553588, 'tstamp': 553588}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246539, 'error': None, 'target': 'ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.270 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[00a6804d-bd7f-424b-ad21-3d43d4d8100d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fdd4b4-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:57:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553588, 'reachable_time': 42264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246540, 'error': None, 'target': 'ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.304 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f5155d13-666f-4c3c-8a31-dc342f8be3a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.365 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[73498e43-0972-44e2-ba9c-3f69e0802711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.367 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fdd4b4-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.367 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.367 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2fdd4b4-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:20 compute-1 NetworkManager[51724]: <info>  [1759268840.3697] manager: (tapf2fdd4b4-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Sep 30 21:47:20 compute-1 kernel: tapf2fdd4b4-90: entered promiscuous mode
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.372 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2fdd4b4-90, col_values=(('external_ids', {'iface-id': 'c7e1f19a-324f-4974-a4e5-ff6e6c250796'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:20 compute-1 ovn_controller[94902]: 2025-09-30T21:47:20Z|00625|binding|INFO|Releasing lport c7e1f19a-324f-4974-a4e5-ff6e6c250796 from this chassis (sb_readonly=0)
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.386 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2fdd4b4-9613-49d9-a773-931de49e44b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2fdd4b4-9613-49d9-a773-931de49e44b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.387 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cd924498-f40f-4c46-a398-d969ef9b760f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.387 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f2fdd4b4-9613-49d9-a773-931de49e44b7
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f2fdd4b4-9613-49d9-a773-931de49e44b7.pid.haproxy
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f2fdd4b4-9613-49d9-a773-931de49e44b7
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.388 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'env', 'PROCESS_TAG=haproxy-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2fdd4b4-9613-49d9-a773-931de49e44b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:47:20 compute-1 podman[246572]: 2025-09-30 21:47:20.75057873 +0000 UTC m=+0.056209170 container create 94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.761 2 DEBUG nova.compute.manager [req-20f0b45b-bcde-419e-a6e1-29acc383f8a8 req-c7d1731a-e047-4398-8fbf-d23694df2894 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.762 2 DEBUG oslo_concurrency.lockutils [req-20f0b45b-bcde-419e-a6e1-29acc383f8a8 req-c7d1731a-e047-4398-8fbf-d23694df2894 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.763 2 DEBUG oslo_concurrency.lockutils [req-20f0b45b-bcde-419e-a6e1-29acc383f8a8 req-c7d1731a-e047-4398-8fbf-d23694df2894 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.763 2 DEBUG oslo_concurrency.lockutils [req-20f0b45b-bcde-419e-a6e1-29acc383f8a8 req-c7d1731a-e047-4398-8fbf-d23694df2894 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:20 compute-1 nova_compute[192795]: 2025-09-30 21:47:20.763 2 DEBUG nova.compute.manager [req-20f0b45b-bcde-419e-a6e1-29acc383f8a8 req-c7d1731a-e047-4398-8fbf-d23694df2894 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Processing event network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:47:20 compute-1 systemd[1]: Started libpod-conmon-94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce.scope.
Sep 30 21:47:20 compute-1 podman[246572]: 2025-09-30 21:47:20.719842851 +0000 UTC m=+0.025473321 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:47:20 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:47:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/759b148ef3c501d2e4cb4a43b238373a5393319b49e6b723326927ec9c260345/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:47:20 compute-1 podman[246586]: 2025-09-30 21:47:20.853788121 +0000 UTC m=+0.066394581 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:47:20 compute-1 podman[246585]: 2025-09-30 21:47:20.853803122 +0000 UTC m=+0.070019882 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Sep 30 21:47:20 compute-1 podman[246572]: 2025-09-30 21:47:20.8545221 +0000 UTC m=+0.160152560 container init 94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:47:20 compute-1 podman[246572]: 2025-09-30 21:47:20.864400354 +0000 UTC m=+0.170030794 container start 94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:47:20 compute-1 podman[246587]: 2025-09-30 21:47:20.876128983 +0000 UTC m=+0.086433627 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 21:47:20 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [NOTICE]   (246652) : New worker (246654) forked
Sep 30 21:47:20 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [NOTICE]   (246652) : Loading success.
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.945 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 71b40126-7c96-4991-bdc4-716828a750fd in datapath d22f103a-1a95-4031-ae6e-c474eae9834e unbound from our chassis
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.947 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d22f103a-1a95-4031-ae6e-c474eae9834e
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.960 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[541afee4-12c5-4291-a6f8-14dce313df1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.961 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd22f103a-11 in ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.963 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd22f103a-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.963 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a284dd-6f17-460c-aef1-880303cc63e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.964 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[02816f10-134c-47b0-84bd-5e18e678eab9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:20.979 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[897e1de7-1b08-4a45-a787-a3f8c4cf7979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.000 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[32f06fda-b4c5-4bc6-bc25-514616668f45]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.035 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb22a35-0f94-4948-8d3b-c97640f604af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.042 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a8acc196-ae02-469e-9fa5-2466d43ae769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 NetworkManager[51724]: <info>  [1759268841.0436] manager: (tapd22f103a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/313)
Sep 30 21:47:21 compute-1 systemd-udevd[246529]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.062 2 DEBUG nova.network.neutron [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updated VIF entry in instance network info cache for port 71b40126-7c96-4991-bdc4-716828a750fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.063 2 DEBUG nova.network.neutron [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updating instance_info_cache with network_info: [{"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": 
{"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.076 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6e71e5c4-4d5f-4d58-836b-f547880d108b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.080 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[1e69d612-16a7-449a-a753-5559e4e1c282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.103 2 DEBUG oslo_concurrency.lockutils [req-116f7c25-80b3-4af3-810e-245996fa9808 req-b0810a92-bb55-4ba8-8d2f-5fef24475d4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:21 compute-1 NetworkManager[51724]: <info>  [1759268841.1078] device (tapd22f103a-10): carrier: link connected
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.114 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[136c7628-68a7-4b1e-90fa-c5d652554ccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.135 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4dd278-c985-4a1b-b56e-783e5651066f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd22f103a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:b1:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553678, 'reachable_time': 21137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246673, 'error': None, 'target': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.153 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dd68063a-4b73-4a5d-8228-a82a1cdce8e6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:b18a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 553678, 'tstamp': 553678}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246674, 'error': None, 'target': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.177 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e886989c-8534-490a-b5a4-0350b45ec3fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd22f103a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:b1:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553678, 'reachable_time': 21137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246675, 'error': None, 'target': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.210 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0aaf091b-12e7-4fd4-81bf-d4aa56ce1a1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.250 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2fcceb-2660-4544-bf7d-a6f33e0edefd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.252 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd22f103a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.252 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.253 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd22f103a-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:21 compute-1 NetworkManager[51724]: <info>  [1759268841.2560] manager: (tapd22f103a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Sep 30 21:47:21 compute-1 kernel: tapd22f103a-10: entered promiscuous mode
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.257 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd22f103a-10, col_values=(('external_ids', {'iface-id': '4bd0bdb1-40a0-42f8-98d1-c84ba21808c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:21 compute-1 ovn_controller[94902]: 2025-09-30T21:47:21Z|00626|binding|INFO|Releasing lport 4bd0bdb1-40a0-42f8-98d1-c84ba21808c1 from this chassis (sb_readonly=0)
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.260 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d22f103a-1a95-4031-ae6e-c474eae9834e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d22f103a-1a95-4031-ae6e-c474eae9834e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.261 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bde2ee62-6abb-4961-8055-b01aa2b722e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.262 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-d22f103a-1a95-4031-ae6e-c474eae9834e
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/d22f103a-1a95-4031-ae6e-c474eae9834e.pid.haproxy
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID d22f103a-1a95-4031-ae6e-c474eae9834e
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:47:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:21.263 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'env', 'PROCESS_TAG=haproxy-d22f103a-1a95-4031-ae6e-c474eae9834e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d22f103a-1a95-4031-ae6e-c474eae9834e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.556 2 DEBUG nova.compute.manager [req-29dc784c-dc3f-40cc-933e-941e788f7dbc req-9668f71d-1a32-49cd-8a11-7099840afb0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.556 2 DEBUG oslo_concurrency.lockutils [req-29dc784c-dc3f-40cc-933e-941e788f7dbc req-9668f71d-1a32-49cd-8a11-7099840afb0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.557 2 DEBUG oslo_concurrency.lockutils [req-29dc784c-dc3f-40cc-933e-941e788f7dbc req-9668f71d-1a32-49cd-8a11-7099840afb0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.557 2 DEBUG oslo_concurrency.lockutils [req-29dc784c-dc3f-40cc-933e-941e788f7dbc req-9668f71d-1a32-49cd-8a11-7099840afb0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.557 2 DEBUG nova.compute.manager [req-29dc784c-dc3f-40cc-933e-941e788f7dbc req-9668f71d-1a32-49cd-8a11-7099840afb0e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Processing event network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:47:21 compute-1 podman[246713]: 2025-09-30 21:47:21.655802212 +0000 UTC m=+0.065992952 container create 9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:47:21 compute-1 systemd[1]: Started libpod-conmon-9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a.scope.
Sep 30 21:47:21 compute-1 podman[246713]: 2025-09-30 21:47:21.617522157 +0000 UTC m=+0.027712977 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:47:21 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:47:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5eae7e0e386142a91b02ff524cae769ca688ebe86432b54f48ad957a19b5bab4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:47:21 compute-1 podman[246713]: 2025-09-30 21:47:21.738915786 +0000 UTC m=+0.149106526 container init 9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:47:21 compute-1 podman[246713]: 2025-09-30 21:47:21.744119235 +0000 UTC m=+0.154309955 container start 9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:47:21 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [NOTICE]   (246732) : New worker (246734) forked
Sep 30 21:47:21 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [NOTICE]   (246732) : Loading success.
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.867 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268841.8663883, 92b445f9-0995-4201-aac6-9a8bd8c4a418 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.867 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] VM Started (Lifecycle Event)
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.876 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.881 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.885 2 INFO nova.virt.libvirt.driver [-] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Instance spawned successfully.
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.885 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.893 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.897 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.906 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.907 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.907 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.907 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.908 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.908 2 DEBUG nova.virt.libvirt.driver [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.912 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.912 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268841.8720262, 92b445f9-0995-4201-aac6-9a8bd8c4a418 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.912 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] VM Paused (Lifecycle Event)
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.941 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.945 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268841.8802195, 92b445f9-0995-4201-aac6-9a8bd8c4a418 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.945 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] VM Resumed (Lifecycle Event)
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.973 2 INFO nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Took 14.06 seconds to spawn the instance on the hypervisor.
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.974 2 DEBUG nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.977 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:21 compute-1 nova_compute[192795]: 2025-09-30 21:47:21.983 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.018 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.087 2 INFO nova.compute.manager [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Took 16.85 seconds to build instance.
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.102 2 DEBUG oslo_concurrency.lockutils [None req-67d764e1-77a8-406a-bd18-32a33eeb7a4c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.847 2 DEBUG nova.compute.manager [req-2c90620f-d2a0-4b81-9184-96c7bd4c203f req-31cea374-2740-4619-96a5-6a107ec4fc23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.848 2 DEBUG oslo_concurrency.lockutils [req-2c90620f-d2a0-4b81-9184-96c7bd4c203f req-31cea374-2740-4619-96a5-6a107ec4fc23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.848 2 DEBUG oslo_concurrency.lockutils [req-2c90620f-d2a0-4b81-9184-96c7bd4c203f req-31cea374-2740-4619-96a5-6a107ec4fc23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.849 2 DEBUG oslo_concurrency.lockutils [req-2c90620f-d2a0-4b81-9184-96c7bd4c203f req-31cea374-2740-4619-96a5-6a107ec4fc23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.849 2 DEBUG nova.compute.manager [req-2c90620f-d2a0-4b81-9184-96c7bd4c203f req-31cea374-2740-4619-96a5-6a107ec4fc23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] No waiting events found dispatching network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:22 compute-1 nova_compute[192795]: 2025-09-30 21:47:22.849 2 WARNING nova.compute.manager [req-2c90620f-d2a0-4b81-9184-96c7bd4c203f req-31cea374-2740-4619-96a5-6a107ec4fc23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received unexpected event network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd for instance with vm_state active and task_state None.
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.935 2 DEBUG nova.compute.manager [req-890cec59-0c42-4b67-99bb-219682f62541 req-6a14f15c-1819-4ad6-850d-4226a7379c64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.935 2 DEBUG oslo_concurrency.lockutils [req-890cec59-0c42-4b67-99bb-219682f62541 req-6a14f15c-1819-4ad6-850d-4226a7379c64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.936 2 DEBUG oslo_concurrency.lockutils [req-890cec59-0c42-4b67-99bb-219682f62541 req-6a14f15c-1819-4ad6-850d-4226a7379c64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.936 2 DEBUG oslo_concurrency.lockutils [req-890cec59-0c42-4b67-99bb-219682f62541 req-6a14f15c-1819-4ad6-850d-4226a7379c64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.936 2 DEBUG nova.compute.manager [req-890cec59-0c42-4b67-99bb-219682f62541 req-6a14f15c-1819-4ad6-850d-4226a7379c64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] No waiting events found dispatching network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:23 compute-1 nova_compute[192795]: 2025-09-30 21:47:23.936 2 WARNING nova.compute.manager [req-890cec59-0c42-4b67-99bb-219682f62541 req-6a14f15c-1819-4ad6-850d-4226a7379c64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received unexpected event network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb for instance with vm_state active and task_state None.
Sep 30 21:47:24 compute-1 nova_compute[192795]: 2025-09-30 21:47:24.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:24 compute-1 nova_compute[192795]: 2025-09-30 21:47:24.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:26 compute-1 nova_compute[192795]: 2025-09-30 21:47:26.103 2 DEBUG nova.compute.manager [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-changed-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:26 compute-1 nova_compute[192795]: 2025-09-30 21:47:26.106 2 DEBUG nova.compute.manager [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing instance network info cache due to event network-changed-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:26 compute-1 nova_compute[192795]: 2025-09-30 21:47:26.107 2 DEBUG oslo_concurrency.lockutils [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:26 compute-1 nova_compute[192795]: 2025-09-30 21:47:26.107 2 DEBUG oslo_concurrency.lockutils [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:26 compute-1 nova_compute[192795]: 2025-09-30 21:47:26.108 2 DEBUG nova.network.neutron [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing network info cache for port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:27 compute-1 nova_compute[192795]: 2025-09-30 21:47:27.350 2 DEBUG nova.network.neutron [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updated VIF entry in instance network info cache for port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:27 compute-1 nova_compute[192795]: 2025-09-30 21:47:27.353 2 DEBUG nova.network.neutron [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updating instance_info_cache with network_info: [{"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:27 compute-1 nova_compute[192795]: 2025-09-30 21:47:27.391 2 DEBUG oslo_concurrency.lockutils [req-29841c45-98bc-4a80-b572-7abe251189ed req-80c02783-1586-4259-96f9-a52894e4d66c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:28 compute-1 nova_compute[192795]: 2025-09-30 21:47:28.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:29 compute-1 nova_compute[192795]: 2025-09-30 21:47:29.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:30 compute-1 podman[246743]: 2025-09-30 21:47:30.233023131 +0000 UTC m=+0.076148874 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.550 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.551 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.586 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.752 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.753 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.763 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.763 2 INFO nova.compute.claims [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.926 2 DEBUG nova.compute.provider_tree [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.946 2 DEBUG nova.scheduler.client.report [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.967 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:31 compute-1 nova_compute[192795]: 2025-09-30 21:47:31.982 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.061 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.062 2 DEBUG nova.network.neutron [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.085 2 INFO nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.108 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.257 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.260 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.261 2 INFO nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Creating image(s)
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.262 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "/var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.263 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.265 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.298 2 DEBUG nova.policy [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27859618cb1d493cb1531af26b200b92', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '043721d1d0a2480fa785367fa56c1fa4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.303 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.366 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.368 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.369 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.391 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.457 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.458 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.500 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.502 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.502 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.565 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.567 2 DEBUG nova.virt.disk.api [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Checking if we can resize image /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.567 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.627 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.628 2 DEBUG nova.virt.disk.api [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Cannot resize image /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.628 2 DEBUG nova.objects.instance [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'migration_context' on Instance uuid ed2454d0-acc5-4651-9e10-4bee8b32306c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.640 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.640 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Ensure instance console log exists: /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.640 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.641 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:32 compute-1 nova_compute[192795]: 2025-09-30 21:47:32.641 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:33 compute-1 nova_compute[192795]: 2025-09-30 21:47:33.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:34 compute-1 nova_compute[192795]: 2025-09-30 21:47:34.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:34 compute-1 nova_compute[192795]: 2025-09-30 21:47:34.658 2 DEBUG nova.network.neutron [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Successfully created port: 8a8507fe-c765-4719-be69-19b0ada7b9f6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:47:35 compute-1 ovn_controller[94902]: 2025-09-30T21:47:35Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:64:66 10.100.0.9
Sep 30 21:47:35 compute-1 ovn_controller[94902]: 2025-09-30T21:47:35Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:64:66 10.100.0.9
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.434 2 DEBUG nova.network.neutron [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Successfully updated port: 8a8507fe-c765-4719-be69-19b0ada7b9f6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.456 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.456 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquired lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.457 2 DEBUG nova.network.neutron [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.567 2 DEBUG nova.compute.manager [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-changed-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.567 2 DEBUG nova.compute.manager [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Refreshing instance network info cache due to event network-changed-8a8507fe-c765-4719-be69-19b0ada7b9f6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.567 2 DEBUG oslo_concurrency.lockutils [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:35 compute-1 nova_compute[192795]: 2025-09-30 21:47:35.603 2 DEBUG nova.network.neutron [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.898 2 DEBUG nova.network.neutron [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updating instance_info_cache with network_info: [{"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.919 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Releasing lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.919 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Instance network_info: |[{"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.920 2 DEBUG oslo_concurrency.lockutils [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.920 2 DEBUG nova.network.neutron [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Refreshing network info cache for port 8a8507fe-c765-4719-be69-19b0ada7b9f6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.925 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Start _get_guest_xml network_info=[{"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.933 2 WARNING nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.944 2 DEBUG nova.virt.libvirt.host [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.946 2 DEBUG nova.virt.libvirt.host [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.949 2 DEBUG nova.virt.libvirt.host [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.950 2 DEBUG nova.virt.libvirt.host [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.952 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.952 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.952 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.953 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.953 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.953 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.953 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.953 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.954 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.954 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.954 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.954 2 DEBUG nova.virt.hardware [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.958 2 DEBUG nova.virt.libvirt.vif [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1965800618',display_name='tempest-TestNetworkBasicOps-server-1965800618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1965800618',id=159,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvtH0sLx5OTTHFMQTYM3QYH7RQYmwVnsMHxiGCV3jIouhC4rsGuQe5mK3HRfYiZRnIza+8ZrTNpDOstmMsD9jeBCmFzzrSDdePalLYvmd55ETV5u8CeRLVHZ692BjKBWQ==',key_name='tempest-TestNetworkBasicOps-1714887880',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-f1kn9lfz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:32Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=ed2454d0-acc5-4651-9e10-4bee8b32306c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.959 2 DEBUG nova.network.os_vif_util [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.960 2 DEBUG nova.network.os_vif_util [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:5f:41,bridge_name='br-int',has_traffic_filtering=True,id=8a8507fe-c765-4719-be69-19b0ada7b9f6,network=Network(0e652426-74cc-49f4-8211-ee80c6ea6be6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8507fe-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.961 2 DEBUG nova.objects.instance [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed2454d0-acc5-4651-9e10-4bee8b32306c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.974 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <uuid>ed2454d0-acc5-4651-9e10-4bee8b32306c</uuid>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <name>instance-0000009f</name>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkBasicOps-server-1965800618</nova:name>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:47:36</nova:creationTime>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:user uuid="27859618cb1d493cb1531af26b200b92">tempest-TestNetworkBasicOps-2126023928-project-member</nova:user>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:project uuid="043721d1d0a2480fa785367fa56c1fa4">tempest-TestNetworkBasicOps-2126023928</nova:project>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         <nova:port uuid="8a8507fe-c765-4719-be69-19b0ada7b9f6">
Sep 30 21:47:36 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <system>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <entry name="serial">ed2454d0-acc5-4651-9e10-4bee8b32306c</entry>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <entry name="uuid">ed2454d0-acc5-4651-9e10-4bee8b32306c</entry>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </system>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <os>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   </os>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <features>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   </features>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk.config"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:04:5f:41"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <target dev="tap8a8507fe-c7"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/console.log" append="off"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <video>
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </video>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:47:36 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:47:36 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:47:36 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:47:36 compute-1 nova_compute[192795]: </domain>
Sep 30 21:47:36 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.976 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Preparing to wait for external event network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.977 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.977 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.978 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.980 2 DEBUG nova.virt.libvirt.vif [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1965800618',display_name='tempest-TestNetworkBasicOps-server-1965800618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1965800618',id=159,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvtH0sLx5OTTHFMQTYM3QYH7RQYmwVnsMHxiGCV3jIouhC4rsGuQe5mK3HRfYiZRnIza+8ZrTNpDOstmMsD9jeBCmFzzrSDdePalLYvmd55ETV5u8CeRLVHZ692BjKBWQ==',key_name='tempest-TestNetworkBasicOps-1714887880',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-f1kn9lfz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:32Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=ed2454d0-acc5-4651-9e10-4bee8b32306c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.980 2 DEBUG nova.network.os_vif_util [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.982 2 DEBUG nova.network.os_vif_util [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:5f:41,bridge_name='br-int',has_traffic_filtering=True,id=8a8507fe-c765-4719-be69-19b0ada7b9f6,network=Network(0e652426-74cc-49f4-8211-ee80c6ea6be6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8507fe-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.983 2 DEBUG os_vif [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:5f:41,bridge_name='br-int',has_traffic_filtering=True,id=8a8507fe-c765-4719-be69-19b0ada7b9f6,network=Network(0e652426-74cc-49f4-8211-ee80c6ea6be6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8507fe-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.984 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.985 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a8507fe-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.992 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8a8507fe-c7, col_values=(('external_ids', {'iface-id': '8a8507fe-c765-4719-be69-19b0ada7b9f6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:5f:41', 'vm-uuid': 'ed2454d0-acc5-4651-9e10-4bee8b32306c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:36 compute-1 NetworkManager[51724]: <info>  [1759268856.9957] manager: (tap8a8507fe-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Sep 30 21:47:36 compute-1 nova_compute[192795]: 2025-09-30 21:47:36.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.005 2 INFO os_vif [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:5f:41,bridge_name='br-int',has_traffic_filtering=True,id=8a8507fe-c765-4719-be69-19b0ada7b9f6,network=Network(0e652426-74cc-49f4-8211-ee80c6ea6be6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8507fe-c7')
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.058 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.058 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.058 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No VIF found with MAC fa:16:3e:04:5f:41, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.059 2 INFO nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Using config drive
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.803 2 INFO nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Creating config drive at /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk.config
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.808 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyn06gz5w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.934 2 DEBUG oslo_concurrency.processutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyn06gz5w" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:37 compute-1 kernel: tap8a8507fe-c7: entered promiscuous mode
Sep 30 21:47:37 compute-1 NetworkManager[51724]: <info>  [1759268857.9960] manager: (tap8a8507fe-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Sep 30 21:47:37 compute-1 nova_compute[192795]: 2025-09-30 21:47:37.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:37 compute-1 ovn_controller[94902]: 2025-09-30T21:47:37Z|00627|binding|INFO|Claiming lport 8a8507fe-c765-4719-be69-19b0ada7b9f6 for this chassis.
Sep 30 21:47:37 compute-1 ovn_controller[94902]: 2025-09-30T21:47:37Z|00628|binding|INFO|8a8507fe-c765-4719-be69-19b0ada7b9f6: Claiming fa:16:3e:04:5f:41 10.100.0.3
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.004 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:5f:41 10.100.0.3'], port_security=['fa:16:3e:04:5f:41 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ed2454d0-acc5-4651-9e10-4bee8b32306c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '506284a2-22bc-4ab7-ab23-2e02bd4112be', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15e3a51c-a544-4627-802c-b1b68004aa27, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8a8507fe-c765-4719-be69-19b0ada7b9f6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.005 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8a8507fe-c765-4719-be69-19b0ada7b9f6 in datapath 0e652426-74cc-49f4-8211-ee80c6ea6be6 bound to our chassis
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.007 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0e652426-74cc-49f4-8211-ee80c6ea6be6
Sep 30 21:47:38 compute-1 ovn_controller[94902]: 2025-09-30T21:47:38Z|00629|binding|INFO|Setting lport 8a8507fe-c765-4719-be69-19b0ada7b9f6 ovn-installed in OVS
Sep 30 21:47:38 compute-1 ovn_controller[94902]: 2025-09-30T21:47:38Z|00630|binding|INFO|Setting lport 8a8507fe-c765-4719-be69-19b0ada7b9f6 up in Southbound
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.021 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4712060e-55be-4eda-8eae-ec6bb2b184a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.023 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0e652426-71 in ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.027 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0e652426-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.028 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5acb9da9-73b8-4360-bc5f-663814158e30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.028 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0561ce66-fc73-4762-a817-46522f211f99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 systemd-udevd[246814]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:47:38 compute-1 systemd-machined[152783]: New machine qemu-74-instance-0000009f.
Sep 30 21:47:38 compute-1 NetworkManager[51724]: <info>  [1759268858.0434] device (tap8a8507fe-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.042 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[b1619306-2f21-4ade-86b5-308cfb4de960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 NetworkManager[51724]: <info>  [1759268858.0442] device (tap8a8507fe-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:47:38 compute-1 systemd[1]: Started Virtual Machine qemu-74-instance-0000009f.
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.069 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a615d858-42c2-4d85-8913-11ae711fa5ff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.099 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab2ad9e-b6ff-446e-a3ef-991c9bd9a76c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 NetworkManager[51724]: <info>  [1759268858.1177] manager: (tap0e652426-70): new Veth device (/org/freedesktop/NetworkManager/Devices/317)
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.118 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d25e94c5-dfb0-481b-9c3f-2bc002b38f37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.156 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[29655883-6aef-41c2-a85a-c5f7909f5c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.159 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2d2998-ad8d-4ee9-88d2-ddeba3520118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 NetworkManager[51724]: <info>  [1759268858.1838] device (tap0e652426-70): carrier: link connected
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.191 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc3e3f0-53ce-498c-b0ec-f84224de4b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.208 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8c10fc51-5062-4f44-bb86-a53e29f2153b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e652426-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:2c:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 555385, 'reachable_time': 27691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246846, 'error': None, 'target': 'ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.226 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbb0c0a-ea99-45de-b08d-7ed98ab9f557]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:2cdf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 555385, 'tstamp': 555385}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246847, 'error': None, 'target': 'ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.245 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[56390bca-3915-46fe-9e4b-1b51b3c58a3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e652426-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:2c:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 555385, 'reachable_time': 27691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246848, 'error': None, 'target': 'ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.276 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f3fed499-b252-48d8-9df5-4bf7c263bfaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.343 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6579c1-b092-4153-a449-b33113355e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.345 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e652426-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.345 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.345 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e652426-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:38 compute-1 NetworkManager[51724]: <info>  [1759268858.3488] manager: (tap0e652426-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Sep 30 21:47:38 compute-1 kernel: tap0e652426-70: entered promiscuous mode
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.351 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0e652426-70, col_values=(('external_ids', {'iface-id': '687c215d-a270-4626-a3ed-d8ed28c5a280'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:38 compute-1 ovn_controller[94902]: 2025-09-30T21:47:38Z|00631|binding|INFO|Releasing lport 687c215d-a270-4626-a3ed-d8ed28c5a280 from this chassis (sb_readonly=0)
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.365 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e652426-74cc-49f4-8211-ee80c6ea6be6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e652426-74cc-49f4-8211-ee80c6ea6be6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.366 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[97b7ea8c-9b54-4796-a78c-1492f3aeca50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.367 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-0e652426-74cc-49f4-8211-ee80c6ea6be6
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/0e652426-74cc-49f4-8211-ee80c6ea6be6.pid.haproxy
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 0e652426-74cc-49f4-8211-ee80c6ea6be6
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.369 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'env', 'PROCESS_TAG=haproxy-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0e652426-74cc-49f4-8211-ee80c6ea6be6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.706 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.707 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:38.707 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:38 compute-1 podman[246880]: 2025-09-30 21:47:38.779392062 +0000 UTC m=+0.067305864 container create b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:38 compute-1 systemd[1]: Started libpod-conmon-b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a.scope.
Sep 30 21:47:38 compute-1 podman[246880]: 2025-09-30 21:47:38.746447577 +0000 UTC m=+0.034361189 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:47:38 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:47:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edf29fdbfff5de1d2de1c714f0481370a02fe8e2721f7dce24982f4c1d82988f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:47:38 compute-1 podman[246880]: 2025-09-30 21:47:38.879621129 +0000 UTC m=+0.167534721 container init b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:47:38 compute-1 podman[246880]: 2025-09-30 21:47:38.889349809 +0000 UTC m=+0.177263371 container start b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:47:38 compute-1 neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6[246902]: [NOTICE]   (246906) : New worker (246908) forked
Sep 30 21:47:38 compute-1 neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6[246902]: [NOTICE]   (246906) : Loading success.
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.959 2 DEBUG nova.compute.manager [req-e7e92384-8f2b-445d-9083-a28c86fa8bdf req-eaac62ca-2fc3-4657-b69e-6bda6e9fdc11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.960 2 DEBUG oslo_concurrency.lockutils [req-e7e92384-8f2b-445d-9083-a28c86fa8bdf req-eaac62ca-2fc3-4657-b69e-6bda6e9fdc11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.961 2 DEBUG oslo_concurrency.lockutils [req-e7e92384-8f2b-445d-9083-a28c86fa8bdf req-eaac62ca-2fc3-4657-b69e-6bda6e9fdc11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.961 2 DEBUG oslo_concurrency.lockutils [req-e7e92384-8f2b-445d-9083-a28c86fa8bdf req-eaac62ca-2fc3-4657-b69e-6bda6e9fdc11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:38 compute-1 nova_compute[192795]: 2025-09-30 21:47:38.962 2 DEBUG nova.compute.manager [req-e7e92384-8f2b-445d-9083-a28c86fa8bdf req-eaac62ca-2fc3-4657-b69e-6bda6e9fdc11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Processing event network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.220 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.221 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268859.2197258, ed2454d0-acc5-4651-9e10-4bee8b32306c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.222 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] VM Started (Lifecycle Event)
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.225 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.228 2 INFO nova.virt.libvirt.driver [-] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Instance spawned successfully.
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.228 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.248 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.253 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.254 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.254 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.255 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.255 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.256 2 DEBUG nova.virt.libvirt.driver [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.261 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.289 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.289 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268859.2257066, ed2454d0-acc5-4651-9e10-4bee8b32306c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.289 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] VM Paused (Lifecycle Event)
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.309 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.313 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268859.2266514, ed2454d0-acc5-4651-9e10-4bee8b32306c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.313 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] VM Resumed (Lifecycle Event)
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.335 2 INFO nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Took 7.08 seconds to spawn the instance on the hypervisor.
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.336 2 DEBUG nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.337 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.344 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.375 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.435 2 INFO nova.compute.manager [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Took 7.77 seconds to build instance.
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.459 2 DEBUG oslo_concurrency.lockutils [None req-6ca34336-401c-4e21-934d-b542cdcc4b27 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.594 2 DEBUG nova.network.neutron [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updated VIF entry in instance network info cache for port 8a8507fe-c765-4719-be69-19b0ada7b9f6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.595 2 DEBUG nova.network.neutron [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updating instance_info_cache with network_info: [{"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:39 compute-1 nova_compute[192795]: 2025-09-30 21:47:39.620 2 DEBUG oslo_concurrency.lockutils [req-fe427470-f12b-45cf-a6ef-51b3a1a0248f req-6f2d97c5-90fe-49ad-a45e-87ab995da3f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:40 compute-1 podman[246917]: 2025-09-30 21:47:40.236277137 +0000 UTC m=+0.067488579 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:47:40 compute-1 podman[246919]: 2025-09-30 21:47:40.23962844 +0000 UTC m=+0.058234311 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:47:40 compute-1 podman[246918]: 2025-09-30 21:47:40.283978445 +0000 UTC m=+0.104347609 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:47:41 compute-1 nova_compute[192795]: 2025-09-30 21:47:41.036 2 DEBUG nova.compute.manager [req-0d87fb42-176f-436c-9e26-676a06b69685 req-9e735d2f-6f1f-4d00-8551-a7535d6f8b9d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:41 compute-1 nova_compute[192795]: 2025-09-30 21:47:41.036 2 DEBUG oslo_concurrency.lockutils [req-0d87fb42-176f-436c-9e26-676a06b69685 req-9e735d2f-6f1f-4d00-8551-a7535d6f8b9d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:41 compute-1 nova_compute[192795]: 2025-09-30 21:47:41.037 2 DEBUG oslo_concurrency.lockutils [req-0d87fb42-176f-436c-9e26-676a06b69685 req-9e735d2f-6f1f-4d00-8551-a7535d6f8b9d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:41 compute-1 nova_compute[192795]: 2025-09-30 21:47:41.037 2 DEBUG oslo_concurrency.lockutils [req-0d87fb42-176f-436c-9e26-676a06b69685 req-9e735d2f-6f1f-4d00-8551-a7535d6f8b9d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:41 compute-1 nova_compute[192795]: 2025-09-30 21:47:41.037 2 DEBUG nova.compute.manager [req-0d87fb42-176f-436c-9e26-676a06b69685 req-9e735d2f-6f1f-4d00-8551-a7535d6f8b9d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] No waiting events found dispatching network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:41 compute-1 nova_compute[192795]: 2025-09-30 21:47:41.038 2 WARNING nova.compute.manager [req-0d87fb42-176f-436c-9e26-676a06b69685 req-9e735d2f-6f1f-4d00-8551-a7535d6f8b9d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received unexpected event network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 for instance with vm_state active and task_state None.
Sep 30 21:47:42 compute-1 nova_compute[192795]: 2025-09-30 21:47:42.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:43 compute-1 nova_compute[192795]: 2025-09-30 21:47:43.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.004 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.007 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.007 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.008 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.008 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.026 2 INFO nova.compute.manager [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Terminating instance
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.043 2 DEBUG nova.compute.manager [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:47:45 compute-1 kernel: tapf604c930-42 (unregistering): left promiscuous mode
Sep 30 21:47:45 compute-1 NetworkManager[51724]: <info>  [1759268865.0786] device (tapf604c930-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00632|binding|INFO|Releasing lport f604c930-42c7-4f2f-b4ba-f7f70ca08cfb from this chassis (sb_readonly=0)
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00633|binding|INFO|Setting lport f604c930-42c7-4f2f-b4ba-f7f70ca08cfb down in Southbound
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00634|binding|INFO|Removing iface tapf604c930-42 ovn-installed in OVS
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.100 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:64:66 10.100.0.9'], port_security=['fa:16:3e:4e:64:66 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '92b445f9-0995-4201-aac6-9a8bd8c4a418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1d7fb313-89ed-47f5-8144-9e0ec910522f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb9a937-bd5b-4678-a5dd-7e128653bbf0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.102 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb in datapath f2fdd4b4-9613-49d9-a773-931de49e44b7 unbound from our chassis
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.105 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2fdd4b4-9613-49d9-a773-931de49e44b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:47:45 compute-1 kernel: tap71b40126-7c (unregistering): left promiscuous mode
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.115 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[825424b2-7c33-4e8f-93a5-1b3dac0f23a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.116 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7 namespace which is not needed anymore
Sep 30 21:47:45 compute-1 NetworkManager[51724]: <info>  [1759268865.1226] device (tap71b40126-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00635|binding|INFO|Releasing lport 71b40126-7c96-4991-bdc4-716828a750fd from this chassis (sb_readonly=0)
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00636|binding|INFO|Setting lport 71b40126-7c96-4991-bdc4-716828a750fd down in Southbound
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00637|binding|INFO|Removing iface tap71b40126-7c ovn-installed in OVS
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.151 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:2a:80 2001:db8:0:1:f816:3eff:fe5b:2a80 2001:db8::f816:3eff:fe5b:2a80'], port_security=['fa:16:3e:5b:2a:80 2001:db8:0:1:f816:3eff:fe5b:2a80 2001:db8::f816:3eff:fe5b:2a80'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5b:2a80/64 2001:db8::f816:3eff:fe5b:2a80/64', 'neutron:device_id': '92b445f9-0995-4201-aac6-9a8bd8c4a418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1d7fb313-89ed-47f5-8144-9e0ec910522f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9942446e-c20f-4a7d-bedc-ac08b4f4b886, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=71b40126-7c96-4991-bdc4-716828a750fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009d.scope: Deactivated successfully.
Sep 30 21:47:45 compute-1 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009d.scope: Consumed 15.549s CPU time.
Sep 30 21:47:45 compute-1 systemd-machined[152783]: Machine qemu-73-instance-0000009d terminated.
Sep 30 21:47:45 compute-1 podman[246985]: 2025-09-30 21:47:45.211545744 +0000 UTC m=+0.103471798 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:47:45 compute-1 kernel: tapf604c930-42: entered promiscuous mode
Sep 30 21:47:45 compute-1 kernel: tapf604c930-42 (unregistering): left promiscuous mode
Sep 30 21:47:45 compute-1 NetworkManager[51724]: <info>  [1759268865.2735] manager: (tapf604c930-42): new Tun device (/org/freedesktop/NetworkManager/Devices/319)
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00638|binding|INFO|Claiming lport f604c930-42c7-4f2f-b4ba-f7f70ca08cfb for this chassis.
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00639|binding|INFO|f604c930-42c7-4f2f-b4ba-f7f70ca08cfb: Claiming fa:16:3e:4e:64:66 10.100.0.9
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.296 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:64:66 10.100.0.9'], port_security=['fa:16:3e:4e:64:66 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '92b445f9-0995-4201-aac6-9a8bd8c4a418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1d7fb313-89ed-47f5-8144-9e0ec910522f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb9a937-bd5b-4678-a5dd-7e128653bbf0, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:45 compute-1 NetworkManager[51724]: <info>  [1759268865.3010] manager: (tap71b40126-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/320)
Sep 30 21:47:45 compute-1 ovn_controller[94902]: 2025-09-30T21:47:45Z|00640|binding|INFO|Releasing lport f604c930-42c7-4f2f-b4ba-f7f70ca08cfb from this chassis (sb_readonly=0)
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.316 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:64:66 10.100.0.9'], port_security=['fa:16:3e:4e:64:66 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '92b445f9-0995-4201-aac6-9a8bd8c4a418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1d7fb313-89ed-47f5-8144-9e0ec910522f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb9a937-bd5b-4678-a5dd-7e128653bbf0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [NOTICE]   (246652) : haproxy version is 2.8.14-c23fe91
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [NOTICE]   (246652) : path to executable is /usr/sbin/haproxy
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [WARNING]  (246652) : Exiting Master process...
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [WARNING]  (246652) : Exiting Master process...
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [ALERT]    (246652) : Current worker (246654) exited with code 143 (Terminated)
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7[246615]: [WARNING]  (246652) : All workers exited. Exiting... (0)
Sep 30 21:47:45 compute-1 systemd[1]: libpod-94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce.scope: Deactivated successfully.
Sep 30 21:47:45 compute-1 podman[247030]: 2025-09-30 21:47:45.333625431 +0000 UTC m=+0.076233975 container died 94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.357 2 INFO nova.virt.libvirt.driver [-] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Instance destroyed successfully.
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.358 2 DEBUG nova.objects.instance [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 92b445f9-0995-4201-aac6-9a8bd8c4a418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:45 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce-userdata-shm.mount: Deactivated successfully.
Sep 30 21:47:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-759b148ef3c501d2e4cb4a43b238373a5393319b49e6b723326927ec9c260345-merged.mount: Deactivated successfully.
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.376 2 DEBUG nova.virt.libvirt.vif [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1286955007',display_name='tempest-TestGettingAddress-server-1286955007',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1286955007',id=157,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNizAru+rn1r5m8hhu4KyrTgI1anlmS25psFBq3L9HEhvkl0+I02Y99PbperiPr/frQcIa5uK1vOoCOrssf3A2v1wEzdcs7e2N8tU8ip/c+FMogBYdqYv4HPV/Z7m9ySQ==',key_name='tempest-TestGettingAddress-1032720980',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:47:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-2h3yv1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:47:22Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=92b445f9-0995-4201-aac6-9a8bd8c4a418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.377 2 DEBUG nova.network.os_vif_util [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "address": "fa:16:3e:4e:64:66", "network": {"id": "f2fdd4b4-9613-49d9-a773-931de49e44b7", "bridge": "br-int", "label": "tempest-network-smoke--1116471088", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf604c930-42", "ovs_interfaceid": "f604c930-42c7-4f2f-b4ba-f7f70ca08cfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.378 2 DEBUG nova.network.os_vif_util [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:64:66,bridge_name='br-int',has_traffic_filtering=True,id=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb,network=Network(f2fdd4b4-9613-49d9-a773-931de49e44b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf604c930-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:45 compute-1 podman[247030]: 2025-09-30 21:47:45.378886979 +0000 UTC m=+0.121495523 container cleanup 94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.379 2 DEBUG os_vif [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:64:66,bridge_name='br-int',has_traffic_filtering=True,id=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb,network=Network(f2fdd4b4-9613-49d9-a773-931de49e44b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf604c930-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.381 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf604c930-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:47:45 compute-1 systemd[1]: libpod-conmon-94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce.scope: Deactivated successfully.
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.392 2 INFO os_vif [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:64:66,bridge_name='br-int',has_traffic_filtering=True,id=f604c930-42c7-4f2f-b4ba-f7f70ca08cfb,network=Network(f2fdd4b4-9613-49d9-a773-931de49e44b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf604c930-42')
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.393 2 DEBUG nova.virt.libvirt.vif [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1286955007',display_name='tempest-TestGettingAddress-server-1286955007',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1286955007',id=157,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFNizAru+rn1r5m8hhu4KyrTgI1anlmS25psFBq3L9HEhvkl0+I02Y99PbperiPr/frQcIa5uK1vOoCOrssf3A2v1wEzdcs7e2N8tU8ip/c+FMogBYdqYv4HPV/Z7m9ySQ==',key_name='tempest-TestGettingAddress-1032720980',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:47:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-2h3yv1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:47:22Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=92b445f9-0995-4201-aac6-9a8bd8c4a418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.393 2 DEBUG nova.network.os_vif_util [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.394 2 DEBUG nova.network.os_vif_util [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:2a:80,bridge_name='br-int',has_traffic_filtering=True,id=71b40126-7c96-4991-bdc4-716828a750fd,network=Network(d22f103a-1a95-4031-ae6e-c474eae9834e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b40126-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.394 2 DEBUG os_vif [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:2a:80,bridge_name='br-int',has_traffic_filtering=True,id=71b40126-7c96-4991-bdc4-716828a750fd,network=Network(d22f103a-1a95-4031-ae6e-c474eae9834e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b40126-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.396 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71b40126-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.400 2 INFO os_vif [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:2a:80,bridge_name='br-int',has_traffic_filtering=True,id=71b40126-7c96-4991-bdc4-716828a750fd,network=Network(d22f103a-1a95-4031-ae6e-c474eae9834e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b40126-7c')
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.400 2 INFO nova.virt.libvirt.driver [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Deleting instance files /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418_del
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.401 2 INFO nova.virt.libvirt.driver [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Deletion of /var/lib/nova/instances/92b445f9-0995-4201-aac6-9a8bd8c4a418_del complete
Sep 30 21:47:45 compute-1 podman[247081]: 2025-09-30 21:47:45.464102575 +0000 UTC m=+0.052441476 container remove 94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.474 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[84ea4526-18d5-4bfa-8340-29682503f183]: (4, ('Tue Sep 30 09:47:45 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7 (94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce)\n94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce\nTue Sep 30 09:47:45 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7 (94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce)\n94393fea67a83834aa5793146e7e02db761fae7afb4dba7f95104c7b9080acce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.477 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[599c2656-6bd5-485c-8a59-254d86af1f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.479 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fdd4b4-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 kernel: tapf2fdd4b4-90: left promiscuous mode
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.517 2 INFO nova.compute.manager [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Took 0.47 seconds to destroy the instance on the hypervisor.
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.517 2 DEBUG oslo.service.loopingcall [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.518 2 DEBUG nova.compute.manager [-] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.518 2 DEBUG nova.network.neutron [-] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.517 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c3f605-9c14-402b-a032-85b194a69f72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.556 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b25957d2-a149-4d24-af74-868eb9893710]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.559 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[68882901-f3d1-437b-8d00-e7d7025f0137]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.581 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4bfba3-38a9-418f-85e3-3dffe9a19c02]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553577, 'reachable_time': 33440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247096, 'error': None, 'target': 'ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 systemd[1]: run-netns-ovnmeta\x2df2fdd4b4\x2d9613\x2d49d9\x2da773\x2d931de49e44b7.mount: Deactivated successfully.
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.586 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2fdd4b4-9613-49d9-a773-931de49e44b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.586 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[29919a93-0647-4d9f-b0d6-142ab0c7d7e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.587 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 71b40126-7c96-4991-bdc4-716828a750fd in datapath d22f103a-1a95-4031-ae6e-c474eae9834e unbound from our chassis
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.588 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d22f103a-1a95-4031-ae6e-c474eae9834e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.589 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1e121ab0-f901-4734-ab4c-776709024942]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.590 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e namespace which is not needed anymore
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [NOTICE]   (246732) : haproxy version is 2.8.14-c23fe91
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [NOTICE]   (246732) : path to executable is /usr/sbin/haproxy
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [WARNING]  (246732) : Exiting Master process...
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [WARNING]  (246732) : Exiting Master process...
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [ALERT]    (246732) : Current worker (246734) exited with code 143 (Terminated)
Sep 30 21:47:45 compute-1 neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e[246728]: [WARNING]  (246732) : All workers exited. Exiting... (0)
Sep 30 21:47:45 compute-1 systemd[1]: libpod-9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a.scope: Deactivated successfully.
Sep 30 21:47:45 compute-1 podman[247114]: 2025-09-30 21:47:45.754209045 +0000 UTC m=+0.068192356 container died 9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 21:47:45 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a-userdata-shm.mount: Deactivated successfully.
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.781 2 DEBUG nova.compute.manager [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-changed-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.781 2 DEBUG nova.compute.manager [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Refreshing instance network info cache due to event network-changed-8a8507fe-c765-4719-be69-19b0ada7b9f6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.782 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.782 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.782 2 DEBUG nova.network.neutron [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Refreshing network info cache for port 8a8507fe-c765-4719-be69-19b0ada7b9f6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-5eae7e0e386142a91b02ff524cae769ca688ebe86432b54f48ad957a19b5bab4-merged.mount: Deactivated successfully.
Sep 30 21:47:45 compute-1 podman[247114]: 2025-09-30 21:47:45.796223063 +0000 UTC m=+0.110206354 container cleanup 9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:47:45 compute-1 systemd[1]: libpod-conmon-9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a.scope: Deactivated successfully.
Sep 30 21:47:45 compute-1 podman[247145]: 2025-09-30 21:47:45.874791155 +0000 UTC m=+0.052741484 container remove 9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.880 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[05f167f3-8e3d-4bca-8023-0a3bc34c0756]: (4, ('Tue Sep 30 09:47:45 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e (9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a)\n9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a\nTue Sep 30 09:47:45 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e (9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a)\n9c8232ea4c78bba2f23f8f6f86090d43c74a0026bad9029ea91ea2837b0b702a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.882 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[13b4371a-a7ab-45f8-b0ad-a4ae8b254340]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.883 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd22f103a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:45 compute-1 kernel: tapd22f103a-10: left promiscuous mode
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.906 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[024ed1d8-1383-427c-9e9c-15d3b2a5ff10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.940 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[362d72b6-fdf5-4c6c-970e-12b213a35c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.942 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[50d686c1-093e-4c5c-a01f-ae7ba2f82a87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.964 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e78101f-d3dc-4a7c-a0f9-8fba8494d05f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553670, 'reachable_time': 34140, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247160, 'error': None, 'target': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.967 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.967 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3b1370-36a4-4ea0-baae-a8fd5d749a54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.968 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb in datapath f2fdd4b4-9613-49d9-a773-931de49e44b7 unbound from our chassis
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.970 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2fdd4b4-9613-49d9-a773-931de49e44b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.970 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8eabb59e-8c3d-4812-9038-fe7b9c5df1d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.971 103861 INFO neutron.agent.ovn.metadata.agent [-] Port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb in datapath f2fdd4b4-9613-49d9-a773-931de49e44b7 unbound from our chassis
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.973 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2fdd4b4-9613-49d9-a773-931de49e44b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:47:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:45.973 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[edf48140-6a90-44e1-a6f4-6bb16a03c951]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.981 2 DEBUG nova.compute.manager [req-f9da664d-af45-4aa0-aeec-8a47752ac41d req-8c625d0d-9890-40b8-ae07-a165ec1e30f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-unplugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.981 2 DEBUG oslo_concurrency.lockutils [req-f9da664d-af45-4aa0-aeec-8a47752ac41d req-8c625d0d-9890-40b8-ae07-a165ec1e30f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.982 2 DEBUG oslo_concurrency.lockutils [req-f9da664d-af45-4aa0-aeec-8a47752ac41d req-8c625d0d-9890-40b8-ae07-a165ec1e30f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.982 2 DEBUG oslo_concurrency.lockutils [req-f9da664d-af45-4aa0-aeec-8a47752ac41d req-8c625d0d-9890-40b8-ae07-a165ec1e30f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.982 2 DEBUG nova.compute.manager [req-f9da664d-af45-4aa0-aeec-8a47752ac41d req-8c625d0d-9890-40b8-ae07-a165ec1e30f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] No waiting events found dispatching network-vif-unplugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:45 compute-1 nova_compute[192795]: 2025-09-30 21:47:45.983 2 DEBUG nova.compute.manager [req-f9da664d-af45-4aa0-aeec-8a47752ac41d req-8c625d0d-9890-40b8-ae07-a165ec1e30f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-unplugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:47:46 compute-1 systemd[1]: run-netns-ovnmeta\x2dd22f103a\x2d1a95\x2d4031\x2dae6e\x2dc474eae9834e.mount: Deactivated successfully.
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.612 2 DEBUG nova.network.neutron [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updated VIF entry in instance network info cache for port 8a8507fe-c765-4719-be69-19b0ada7b9f6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.612 2 DEBUG nova.network.neutron [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updating instance_info_cache with network_info: [{"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.645 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.645 2 DEBUG nova.compute.manager [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-changed-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.646 2 DEBUG nova.compute.manager [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing instance network info cache due to event network-changed-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.646 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.646 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.646 2 DEBUG nova.network.neutron [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Refreshing network info cache for port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:47.809 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:47 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:47.809 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.848 2 DEBUG nova.network.neutron [-] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.865 2 INFO nova.network.neutron [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Port f604c930-42c7-4f2f-b4ba-f7f70ca08cfb from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.866 2 DEBUG nova.network.neutron [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updating instance_info_cache with network_info: [{"id": "71b40126-7c96-4991-bdc4-716828a750fd", "address": "fa:16:3e:5b:2a:80", "network": {"id": "d22f103a-1a95-4031-ae6e-c474eae9834e", "bridge": "br-int", "label": "tempest-network-smoke--610575536", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe5b:2a80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b40126-7c", "ovs_interfaceid": "71b40126-7c96-4991-bdc4-716828a750fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.869 2 INFO nova.compute.manager [-] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Took 2.35 seconds to deallocate network for instance.
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.892 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-92b445f9-0995-4201-aac6-9a8bd8c4a418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.893 2 DEBUG nova.compute.manager [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-unplugged-71b40126-7c96-4991-bdc4-716828a750fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.893 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.893 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.894 2 DEBUG oslo_concurrency.lockutils [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.894 2 DEBUG nova.compute.manager [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] No waiting events found dispatching network-vif-unplugged-71b40126-7c96-4991-bdc4-716828a750fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.894 2 DEBUG nova.compute.manager [req-beb0b4f4-145f-4be6-aac5-8e4a17e33b6a req-59ad8252-c19e-472e-8aa6-16c3f1729127 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-unplugged-71b40126-7c96-4991-bdc4-716828a750fd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.917 2 DEBUG nova.compute.manager [req-e3eb5dc2-a9e8-49d2-9eb7-a1064316067d req-218b4d51-a971-4287-bff5-034721fa682a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.918 2 DEBUG oslo_concurrency.lockutils [req-e3eb5dc2-a9e8-49d2-9eb7-a1064316067d req-218b4d51-a971-4287-bff5-034721fa682a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.918 2 DEBUG oslo_concurrency.lockutils [req-e3eb5dc2-a9e8-49d2-9eb7-a1064316067d req-218b4d51-a971-4287-bff5-034721fa682a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.918 2 DEBUG oslo_concurrency.lockutils [req-e3eb5dc2-a9e8-49d2-9eb7-a1064316067d req-218b4d51-a971-4287-bff5-034721fa682a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.919 2 DEBUG nova.compute.manager [req-e3eb5dc2-a9e8-49d2-9eb7-a1064316067d req-218b4d51-a971-4287-bff5-034721fa682a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] No waiting events found dispatching network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.919 2 WARNING nova.compute.manager [req-e3eb5dc2-a9e8-49d2-9eb7-a1064316067d req-218b4d51-a971-4287-bff5-034721fa682a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received unexpected event network-vif-plugged-71b40126-7c96-4991-bdc4-716828a750fd for instance with vm_state active and task_state deleting.
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.987 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:47 compute-1 nova_compute[192795]: 2025-09-30 21:47:47.988 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.152 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.153 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.153 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.153 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.154 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] No waiting events found dispatching network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.154 2 WARNING nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received unexpected event network-vif-plugged-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb for instance with vm_state deleted and task_state None.
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.154 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-deleted-71b40126-7c96-4991-bdc4-716828a750fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.154 2 INFO nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Neutron deleted interface 71b40126-7c96-4991-bdc4-716828a750fd; detaching it from the instance and deleting it from the info cache
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.155 2 DEBUG nova.network.neutron [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.161 2 DEBUG nova.compute.provider_tree [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.208 2 DEBUG nova.scheduler.client.report [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.214 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Detach interface failed, port_id=71b40126-7c96-4991-bdc4-716828a750fd, reason: Instance 92b445f9-0995-4201-aac6-9a8bd8c4a418 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.215 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Received event network-vif-deleted-f604c930-42c7-4f2f-b4ba-f7f70ca08cfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.261 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.323 2 INFO nova.scheduler.client.report [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 92b445f9-0995-4201-aac6-9a8bd8c4a418
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.436 2 DEBUG oslo_concurrency.lockutils [None req-8c606065-bcd9-4480-a688-fa64dbf0285c 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "92b445f9-0995-4201-aac6-9a8bd8c4a418" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:48 compute-1 nova_compute[192795]: 2025-09-30 21:47:48.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:50 compute-1 nova_compute[192795]: 2025-09-30 21:47:50.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:47:50.812 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:51 compute-1 podman[247174]: 2025-09-30 21:47:51.240678946 +0000 UTC m=+0.077207629 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:47:51 compute-1 podman[247175]: 2025-09-30 21:47:51.25542683 +0000 UTC m=+0.081601918 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:47:51 compute-1 podman[247173]: 2025-09-30 21:47:51.275320282 +0000 UTC m=+0.103377766 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7)
Sep 30 21:47:52 compute-1 ovn_controller[94902]: 2025-09-30T21:47:52Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:5f:41 10.100.0.3
Sep 30 21:47:52 compute-1 ovn_controller[94902]: 2025-09-30T21:47:52Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:5f:41 10.100.0.3
Sep 30 21:47:53 compute-1 nova_compute[192795]: 2025-09-30 21:47:53.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:55 compute-1 nova_compute[192795]: 2025-09-30 21:47:55.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:58 compute-1 nova_compute[192795]: 2025-09-30 21:47:58.611 2 INFO nova.compute.manager [None req-a1c8b06d-c437-4621-9384-f1f18b365273 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Get console output
Sep 30 21:47:58 compute-1 nova_compute[192795]: 2025-09-30 21:47:58.621 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:47:58 compute-1 nova_compute[192795]: 2025-09-30 21:47:58.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:59 compute-1 ovn_controller[94902]: 2025-09-30T21:47:59Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:5f:41 10.100.0.3
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.726 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.727 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.727 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.727 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.847 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.912 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:59 compute-1 nova_compute[192795]: 2025-09-30 21:47:59.914 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.008 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.204 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.206 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5520MB free_disk=73.2719955444336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.206 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.206 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.324 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance ed2454d0-acc5-4651-9e10-4bee8b32306c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.324 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.324 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.355 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268865.3544753, 92b445f9-0995-4201-aac6-9a8bd8c4a418 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.356 2 INFO nova.compute.manager [-] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] VM Stopped (Lifecycle Event)
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.376 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.378 2 DEBUG nova.compute.manager [None req-8fe0d056-e02a-49aa-9078-b254c968da88 - - - - - -] [instance: 92b445f9-0995-4201-aac6-9a8bd8c4a418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.389 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.417 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:48:00 compute-1 nova_compute[192795]: 2025-09-30 21:48:00.417 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:00 compute-1 ovn_controller[94902]: 2025-09-30T21:48:00Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:5f:41 10.100.0.3
Sep 30 21:48:01 compute-1 podman[247246]: 2025-09-30 21:48:01.260603335 +0000 UTC m=+0.088051407 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:48:01 compute-1 nova_compute[192795]: 2025-09-30 21:48:01.417 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:01 compute-1 nova_compute[192795]: 2025-09-30 21:48:01.418 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:01 compute-1 nova_compute[192795]: 2025-09-30 21:48:01.418 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:48:01 compute-1 ovn_controller[94902]: 2025-09-30T21:48:01Z|00641|binding|INFO|Releasing lport 687c215d-a270-4626-a3ed-d8ed28c5a280 from this chassis (sb_readonly=0)
Sep 30 21:48:01 compute-1 nova_compute[192795]: 2025-09-30 21:48:01.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:03 compute-1 ovn_controller[94902]: 2025-09-30T21:48:03Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:5f:41 10.100.0.3
Sep 30 21:48:03 compute-1 nova_compute[192795]: 2025-09-30 21:48:03.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.112 2 DEBUG nova.compute.manager [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-changed-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.112 2 DEBUG nova.compute.manager [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Refreshing instance network info cache due to event network-changed-8a8507fe-c765-4719-be69-19b0ada7b9f6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.113 2 DEBUG oslo_concurrency.lockutils [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.113 2 DEBUG oslo_concurrency.lockutils [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.113 2 DEBUG nova.network.neutron [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Refreshing network info cache for port 8a8507fe-c765-4719-be69-19b0ada7b9f6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.231 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.231 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.232 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.232 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.232 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.245 2 INFO nova.compute.manager [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Terminating instance
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.258 2 DEBUG nova.compute.manager [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:48:05 compute-1 kernel: tap8a8507fe-c7 (unregistering): left promiscuous mode
Sep 30 21:48:05 compute-1 NetworkManager[51724]: <info>  [1759268885.2828] device (tap8a8507fe-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:48:05 compute-1 ovn_controller[94902]: 2025-09-30T21:48:05Z|00642|binding|INFO|Releasing lport 8a8507fe-c765-4719-be69-19b0ada7b9f6 from this chassis (sb_readonly=0)
Sep 30 21:48:05 compute-1 ovn_controller[94902]: 2025-09-30T21:48:05Z|00643|binding|INFO|Setting lport 8a8507fe-c765-4719-be69-19b0ada7b9f6 down in Southbound
Sep 30 21:48:05 compute-1 ovn_controller[94902]: 2025-09-30T21:48:05Z|00644|binding|INFO|Removing iface tap8a8507fe-c7 ovn-installed in OVS
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.309 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:5f:41 10.100.0.3'], port_security=['fa:16:3e:04:5f:41 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ed2454d0-acc5-4651-9e10-4bee8b32306c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '506284a2-22bc-4ab7-ab23-2e02bd4112be', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15e3a51c-a544-4627-802c-b1b68004aa27, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8a8507fe-c765-4719-be69-19b0ada7b9f6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.311 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8a8507fe-c765-4719-be69-19b0ada7b9f6 in datapath 0e652426-74cc-49f4-8211-ee80c6ea6be6 unbound from our chassis
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.313 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e652426-74cc-49f4-8211-ee80c6ea6be6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.314 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[555e08f2-ae1c-4010-b84a-0a3db00472d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.314 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6 namespace which is not needed anymore
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Sep 30 21:48:05 compute-1 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009f.scope: Consumed 14.606s CPU time.
Sep 30 21:48:05 compute-1 systemd-machined[152783]: Machine qemu-74-instance-0000009f terminated.
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6[246902]: [NOTICE]   (246906) : haproxy version is 2.8.14-c23fe91
Sep 30 21:48:05 compute-1 neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6[246902]: [NOTICE]   (246906) : path to executable is /usr/sbin/haproxy
Sep 30 21:48:05 compute-1 neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6[246902]: [WARNING]  (246906) : Exiting Master process...
Sep 30 21:48:05 compute-1 neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6[246902]: [ALERT]    (246906) : Current worker (246908) exited with code 143 (Terminated)
Sep 30 21:48:05 compute-1 neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6[246902]: [WARNING]  (246906) : All workers exited. Exiting... (0)
Sep 30 21:48:05 compute-1 systemd[1]: libpod-b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a.scope: Deactivated successfully.
Sep 30 21:48:05 compute-1 podman[247289]: 2025-09-30 21:48:05.446190586 +0000 UTC m=+0.048872069 container died b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:48:05 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a-userdata-shm.mount: Deactivated successfully.
Sep 30 21:48:05 compute-1 systemd[1]: var-lib-containers-storage-overlay-edf29fdbfff5de1d2de1c714f0481370a02fe8e2721f7dce24982f4c1d82988f-merged.mount: Deactivated successfully.
Sep 30 21:48:05 compute-1 podman[247289]: 2025-09-30 21:48:05.494813937 +0000 UTC m=+0.097495420 container cleanup b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:48:05 compute-1 systemd[1]: libpod-conmon-b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a.scope: Deactivated successfully.
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.536 2 INFO nova.virt.libvirt.driver [-] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Instance destroyed successfully.
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.536 2 DEBUG nova.objects.instance [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'resources' on Instance uuid ed2454d0-acc5-4651-9e10-4bee8b32306c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.551 2 DEBUG nova.virt.libvirt.vif [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1965800618',display_name='tempest-TestNetworkBasicOps-server-1965800618',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1965800618',id=159,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvtH0sLx5OTTHFMQTYM3QYH7RQYmwVnsMHxiGCV3jIouhC4rsGuQe5mK3HRfYiZRnIza+8ZrTNpDOstmMsD9jeBCmFzzrSDdePalLYvmd55ETV5u8CeRLVHZ692BjKBWQ==',key_name='tempest-TestNetworkBasicOps-1714887880',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:47:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-f1kn9lfz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:47:39Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=ed2454d0-acc5-4651-9e10-4bee8b32306c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.552 2 DEBUG nova.network.os_vif_util [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.553 2 DEBUG nova.network.os_vif_util [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:5f:41,bridge_name='br-int',has_traffic_filtering=True,id=8a8507fe-c765-4719-be69-19b0ada7b9f6,network=Network(0e652426-74cc-49f4-8211-ee80c6ea6be6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8507fe-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.553 2 DEBUG os_vif [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:5f:41,bridge_name='br-int',has_traffic_filtering=True,id=8a8507fe-c765-4719-be69-19b0ada7b9f6,network=Network(0e652426-74cc-49f4-8211-ee80c6ea6be6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8507fe-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.554 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a8507fe-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.560 2 INFO os_vif [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:5f:41,bridge_name='br-int',has_traffic_filtering=True,id=8a8507fe-c765-4719-be69-19b0ada7b9f6,network=Network(0e652426-74cc-49f4-8211-ee80c6ea6be6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8507fe-c7')
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.560 2 INFO nova.virt.libvirt.driver [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Deleting instance files /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c_del
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.561 2 INFO nova.virt.libvirt.driver [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Deletion of /var/lib/nova/instances/ed2454d0-acc5-4651-9e10-4bee8b32306c_del complete
Sep 30 21:48:05 compute-1 podman[247329]: 2025-09-30 21:48:05.564371666 +0000 UTC m=+0.045383592 container remove b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.571 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3c7af947-7b04-468f-a960-2e481e609f0c]: (4, ('Tue Sep 30 09:48:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6 (b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a)\nb3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a\nTue Sep 30 09:48:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6 (b3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a)\nb3dac47f6ad1ea8c6ac7a906bb422a9e79e7a5635bfa337b42be15cd9797552a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.573 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1b911aed-4d9f-49cb-9aaa-bdd9aa6e7af1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.574 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e652426-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 kernel: tap0e652426-70: left promiscuous mode
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.590 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0c2509-bf74-469c-a07d-791c1d8f81fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.617 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[487c466a-72b2-4e59-b920-7800631c8856]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.619 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8e79a99d-e6b4-4546-af2b-0695587864be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.635 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[40c253ae-e60b-4a43-81c7-3ca8958019ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 555376, 'reachable_time': 38764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247348, 'error': None, 'target': 'ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.637 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0e652426-74cc-49f4-8211-ee80c6ea6be6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:48:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:05.637 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[e3acfcfb-0d7a-4778-a0d5-ad1cd1e32fc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:05 compute-1 systemd[1]: run-netns-ovnmeta\x2d0e652426\x2d74cc\x2d49f4\x2d8211\x2dee80c6ea6be6.mount: Deactivated successfully.
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.641 2 DEBUG nova.compute.manager [req-fb40d065-480b-4c1e-b18c-fb9ef2025193 req-400f6aa2-4641-4877-accb-4f1715fed664 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-vif-unplugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.641 2 DEBUG oslo_concurrency.lockutils [req-fb40d065-480b-4c1e-b18c-fb9ef2025193 req-400f6aa2-4641-4877-accb-4f1715fed664 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.641 2 DEBUG oslo_concurrency.lockutils [req-fb40d065-480b-4c1e-b18c-fb9ef2025193 req-400f6aa2-4641-4877-accb-4f1715fed664 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.641 2 DEBUG oslo_concurrency.lockutils [req-fb40d065-480b-4c1e-b18c-fb9ef2025193 req-400f6aa2-4641-4877-accb-4f1715fed664 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.641 2 DEBUG nova.compute.manager [req-fb40d065-480b-4c1e-b18c-fb9ef2025193 req-400f6aa2-4641-4877-accb-4f1715fed664 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] No waiting events found dispatching network-vif-unplugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.642 2 DEBUG nova.compute.manager [req-fb40d065-480b-4c1e-b18c-fb9ef2025193 req-400f6aa2-4641-4877-accb-4f1715fed664 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-vif-unplugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.827 2 INFO nova.compute.manager [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Took 0.57 seconds to destroy the instance on the hypervisor.
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.828 2 DEBUG oslo.service.loopingcall [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.828 2 DEBUG nova.compute.manager [-] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:48:05 compute-1 nova_compute[192795]: 2025-09-30 21:48:05.828 2 DEBUG nova.network.neutron [-] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:48:06 compute-1 nova_compute[192795]: 2025-09-30 21:48:06.917 2 DEBUG nova.network.neutron [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updated VIF entry in instance network info cache for port 8a8507fe-c765-4719-be69-19b0ada7b9f6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:48:06 compute-1 nova_compute[192795]: 2025-09-30 21:48:06.917 2 DEBUG nova.network.neutron [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updating instance_info_cache with network_info: [{"id": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "address": "fa:16:3e:04:5f:41", "network": {"id": "0e652426-74cc-49f4-8211-ee80c6ea6be6", "bridge": "br-int", "label": "tempest-network-smoke--494751636", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8507fe-c7", "ovs_interfaceid": "8a8507fe-c765-4719-be69-19b0ada7b9f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:06 compute-1 nova_compute[192795]: 2025-09-30 21:48:06.958 2 DEBUG oslo_concurrency.lockutils [req-8913815f-e988-48d1-8e2e-fbf8a95ebe51 req-a634234a-16d7-42b6-a56e-7e17005866e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ed2454d0-acc5-4651-9e10-4bee8b32306c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.328 2 DEBUG nova.network.neutron [-] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.347 2 INFO nova.compute.manager [-] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Took 1.52 seconds to deallocate network for instance.
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.445 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.445 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.521 2 DEBUG nova.compute.provider_tree [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.534 2 DEBUG nova.scheduler.client.report [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.574 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.612 2 INFO nova.scheduler.client.report [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Deleted allocations for instance ed2454d0-acc5-4651-9e10-4bee8b32306c
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.674 2 DEBUG oslo_concurrency.lockutils [None req-545cbbe3-17f8-4bcc-8201-e5be7c323e2e 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.724 2 DEBUG nova.compute.manager [req-1d5d2330-2a21-49e4-a301-915445ff2b50 req-6ae8b4ef-8351-49e6-923b-097ba587beef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.725 2 DEBUG oslo_concurrency.lockutils [req-1d5d2330-2a21-49e4-a301-915445ff2b50 req-6ae8b4ef-8351-49e6-923b-097ba587beef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.725 2 DEBUG oslo_concurrency.lockutils [req-1d5d2330-2a21-49e4-a301-915445ff2b50 req-6ae8b4ef-8351-49e6-923b-097ba587beef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.725 2 DEBUG oslo_concurrency.lockutils [req-1d5d2330-2a21-49e4-a301-915445ff2b50 req-6ae8b4ef-8351-49e6-923b-097ba587beef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ed2454d0-acc5-4651-9e10-4bee8b32306c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.725 2 DEBUG nova.compute.manager [req-1d5d2330-2a21-49e4-a301-915445ff2b50 req-6ae8b4ef-8351-49e6-923b-097ba587beef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] No waiting events found dispatching network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.725 2 WARNING nova.compute.manager [req-1d5d2330-2a21-49e4-a301-915445ff2b50 req-6ae8b4ef-8351-49e6-923b-097ba587beef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received unexpected event network-vif-plugged-8a8507fe-c765-4719-be69-19b0ada7b9f6 for instance with vm_state deleted and task_state None.
Sep 30 21:48:07 compute-1 nova_compute[192795]: 2025-09-30 21:48:07.726 2 DEBUG nova.compute.manager [req-1d5d2330-2a21-49e4-a301-915445ff2b50 req-6ae8b4ef-8351-49e6-923b-097ba587beef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Received event network-vif-deleted-8a8507fe-c765-4719-be69-19b0ada7b9f6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:08 compute-1 nova_compute[192795]: 2025-09-30 21:48:08.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:10 compute-1 nova_compute[192795]: 2025-09-30 21:48:10.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:10 compute-1 podman[247352]: 2025-09-30 21:48:10.648120335 +0000 UTC m=+0.049619418 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:48:10 compute-1 podman[247350]: 2025-09-30 21:48:10.648118865 +0000 UTC m=+0.055224757 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:48:10 compute-1 podman[247351]: 2025-09-30 21:48:10.679159222 +0000 UTC m=+0.084297804 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Sep 30 21:48:12 compute-1 nova_compute[192795]: 2025-09-30 21:48:12.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:13 compute-1 nova_compute[192795]: 2025-09-30 21:48:13.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:13 compute-1 nova_compute[192795]: 2025-09-30 21:48:13.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:13 compute-1 nova_compute[192795]: 2025-09-30 21:48:13.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:14 compute-1 nova_compute[192795]: 2025-09-30 21:48:14.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:14 compute-1 nova_compute[192795]: 2025-09-30 21:48:14.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:48:14 compute-1 nova_compute[192795]: 2025-09-30 21:48:14.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:48:14 compute-1 nova_compute[192795]: 2025-09-30 21:48:14.714 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:48:15 compute-1 nova_compute[192795]: 2025-09-30 21:48:15.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:16 compute-1 podman[247420]: 2025-09-30 21:48:16.214167311 +0000 UTC m=+0.055528522 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:48:18 compute-1 nova_compute[192795]: 2025-09-30 21:48:18.709 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:18 compute-1 nova_compute[192795]: 2025-09-30 21:48:18.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:20 compute-1 nova_compute[192795]: 2025-09-30 21:48:20.534 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268885.533252, ed2454d0-acc5-4651-9e10-4bee8b32306c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:20 compute-1 nova_compute[192795]: 2025-09-30 21:48:20.535 2 INFO nova.compute.manager [-] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] VM Stopped (Lifecycle Event)
Sep 30 21:48:20 compute-1 nova_compute[192795]: 2025-09-30 21:48:20.552 2 DEBUG nova.compute.manager [None req-b8a17d75-20fd-4ae8-a03b-23320894f219 - - - - - -] [instance: ed2454d0-acc5-4651-9e10-4bee8b32306c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:20 compute-1 nova_compute[192795]: 2025-09-30 21:48:20.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:22 compute-1 podman[247442]: 2025-09-30 21:48:22.225715649 +0000 UTC m=+0.070942404 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck 
openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal)
Sep 30 21:48:22 compute-1 podman[247443]: 2025-09-30 21:48:22.228085267 +0000 UTC m=+0.066123935 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:48:22 compute-1 podman[247444]: 2025-09-30 21:48:22.23827656 +0000 UTC m=+0.068518715 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:48:23 compute-1 nova_compute[192795]: 2025-09-30 21:48:23.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:25 compute-1 nova_compute[192795]: 2025-09-30 21:48:25.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:28 compute-1 nova_compute[192795]: 2025-09-30 21:48:28.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:30.460 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:a9:51 2001:db8:0:1:f816:3eff:fe2c:a951 2001:db8::f816:3eff:fe2c:a951'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe2c:a951/64 2001:db8::f816:3eff:fe2c:a951/64', 'neutron:device_id': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d03d334-4f0d-47df-a94a-1c647c7026cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5) old=Port_Binding(mac=['fa:16:3e:2c:a9:51 2001:db8::f816:3eff:fe2c:a951'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2c:a951/64', 'neutron:device_id': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:30.462 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 in datapath d1ec18dd-20d4-4643-8e73-7d404d8b8493 updated
Sep 30 21:48:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:30.465 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1ec18dd-20d4-4643-8e73-7d404d8b8493, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:48:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:30.468 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4875c5b9-3bb9-408d-b170-20c722d08941]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:30 compute-1 nova_compute[192795]: 2025-09-30 21:48:30.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:32 compute-1 podman[247507]: 2025-09-30 21:48:32.247906073 +0000 UTC m=+0.086860858 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:48:33 compute-1 nova_compute[192795]: 2025-09-30 21:48:33.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:35 compute-1 nova_compute[192795]: 2025-09-30 21:48:35.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.385 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.386 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.416 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.531 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.532 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.538 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.539 2 INFO nova.compute.claims [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.683 2 DEBUG nova.compute.provider_tree [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.695 2 DEBUG nova.scheduler.client.report [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:38.706 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:38.707 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:38.707 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.720 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.721 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.836 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.837 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.861 2 INFO nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.875 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.981 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.982 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.982 2 INFO nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Creating image(s)
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.983 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.984 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.985 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:38 compute-1 nova_compute[192795]: 2025-09-30 21:48:38.998 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.049 2 DEBUG nova.policy [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.091 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.092 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.092 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.103 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.172 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.173 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.221 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.223 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.223 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.314 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.317 2 DEBUG nova.virt.disk.api [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.317 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.379 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.381 2 DEBUG nova.virt.disk.api [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.382 2 DEBUG nova.objects.instance [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 5303fce8-c159-4964-8a55-bc25f0e493e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.406 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.407 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Ensure instance console log exists: /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.408 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.408 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:39 compute-1 nova_compute[192795]: 2025-09-30 21:48:39.408 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:40 compute-1 nova_compute[192795]: 2025-09-30 21:48:40.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:40 compute-1 nova_compute[192795]: 2025-09-30 21:48:40.946 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Successfully created port: 77fac564-d2ac-47f9-b08d-18f0ed166918 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:48:41 compute-1 podman[247546]: 2025-09-30 21:48:41.255617146 +0000 UTC m=+0.068225147 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:48:41 compute-1 podman[247544]: 2025-09-30 21:48:41.269869028 +0000 UTC m=+0.100836043 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:48:41 compute-1 podman[247545]: 2025-09-30 21:48:41.270124545 +0000 UTC m=+0.099593983 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:48:42 compute-1 nova_compute[192795]: 2025-09-30 21:48:42.128 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Successfully created port: efc7a6d8-d415-4f18-b112-76cadddd0255 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:48:43 compute-1 nova_compute[192795]: 2025-09-30 21:48:43.651 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Successfully updated port: 77fac564-d2ac-47f9-b08d-18f0ed166918 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:48:43 compute-1 nova_compute[192795]: 2025-09-30 21:48:43.831 2 DEBUG nova.compute.manager [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-changed-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:43 compute-1 nova_compute[192795]: 2025-09-30 21:48:43.832 2 DEBUG nova.compute.manager [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing instance network info cache due to event network-changed-77fac564-d2ac-47f9-b08d-18f0ed166918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:43 compute-1 nova_compute[192795]: 2025-09-30 21:48:43.833 2 DEBUG oslo_concurrency.lockutils [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:43 compute-1 nova_compute[192795]: 2025-09-30 21:48:43.833 2 DEBUG oslo_concurrency.lockutils [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:43 compute-1 nova_compute[192795]: 2025-09-30 21:48:43.833 2 DEBUG nova.network.neutron [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing network info cache for port 77fac564-d2ac-47f9-b08d-18f0ed166918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:43 compute-1 nova_compute[192795]: 2025-09-30 21:48:43.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:48:44.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.086 2 DEBUG nova.network.neutron [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.576 2 DEBUG nova.network.neutron [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.582 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Successfully updated port: efc7a6d8-d415-4f18-b112-76cadddd0255 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.601 2 DEBUG oslo_concurrency.lockutils [req-d6a4cd64-1056-47c1-b88c-c50be0832b75 req-f32c11d8-7d3f-4e2f-9d40-429a06e7e886 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.606 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.606 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.607 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:48:44 compute-1 nova_compute[192795]: 2025-09-30 21:48:44.771 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:48:45 compute-1 nova_compute[192795]: 2025-09-30 21:48:45.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:46 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 21:48:46 compute-1 nova_compute[192795]: 2025-09-30 21:48:46.041 2 DEBUG nova.compute.manager [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-changed-efc7a6d8-d415-4f18-b112-76cadddd0255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:46 compute-1 nova_compute[192795]: 2025-09-30 21:48:46.041 2 DEBUG nova.compute.manager [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing instance network info cache due to event network-changed-efc7a6d8-d415-4f18-b112-76cadddd0255. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:46 compute-1 nova_compute[192795]: 2025-09-30 21:48:46.041 2 DEBUG oslo_concurrency.lockutils [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:47 compute-1 podman[247615]: 2025-09-30 21:48:47.222303504 +0000 UTC m=+0.065486020 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.314 2 DEBUG nova.network.neutron [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.334 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.335 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Instance network_info: |[{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.336 2 DEBUG oslo_concurrency.lockutils [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.336 2 DEBUG nova.network.neutron [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing network info cache for port efc7a6d8-d415-4f18-b112-76cadddd0255 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.340 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Start _get_guest_xml network_info=[{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.348 2 WARNING nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.353 2 DEBUG nova.virt.libvirt.host [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.353 2 DEBUG nova.virt.libvirt.host [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.363 2 DEBUG nova.virt.libvirt.host [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.365 2 DEBUG nova.virt.libvirt.host [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.366 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.366 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.367 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.367 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.367 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.367 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.368 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.368 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.368 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.369 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.369 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.369 2 DEBUG nova.virt.hardware [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.375 2 DEBUG nova.virt.libvirt.vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-414695147',display_name='tempest-TestGettingAddress-server-414695147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-414695147',id=163,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-yr4z6l47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:38Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=5303fce8-c159-4964-8a55-bc25f0e493e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.376 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.377 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:23:78,bridge_name='br-int',has_traffic_filtering=True,id=77fac564-d2ac-47f9-b08d-18f0ed166918,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fac564-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.379 2 DEBUG nova.virt.libvirt.vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-414695147',display_name='tempest-TestGettingAddress-server-414695147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-414695147',id=163,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-yr4z6l47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:38Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=5303fce8-c159-4964-8a55-bc25f0e493e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.379 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.380 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:bf:1e,bridge_name='br-int',has_traffic_filtering=True,id=efc7a6d8-d415-4f18-b112-76cadddd0255,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc7a6d8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.382 2 DEBUG nova.objects.instance [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 5303fce8-c159-4964-8a55-bc25f0e493e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.407 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <uuid>5303fce8-c159-4964-8a55-bc25f0e493e1</uuid>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <name>instance-000000a3</name>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <nova:name>tempest-TestGettingAddress-server-414695147</nova:name>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:48:47</nova:creationTime>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:port uuid="77fac564-d2ac-47f9-b08d-18f0ed166918">
Sep 30 21:48:47 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         <nova:port uuid="efc7a6d8-d415-4f18-b112-76cadddd0255">
Sep 30 21:48:47 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fede:bf1e" ipVersion="6"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fede:bf1e" ipVersion="6"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <system>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <entry name="serial">5303fce8-c159-4964-8a55-bc25f0e493e1</entry>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <entry name="uuid">5303fce8-c159-4964-8a55-bc25f0e493e1</entry>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </system>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <os>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   </os>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <features>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   </features>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk.config"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:c0:23:78"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <target dev="tap77fac564-d2"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:de:bf:1e"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <target dev="tapefc7a6d8-d4"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/console.log" append="off"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <video>
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </video>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:48:47 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:48:47 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:48:47 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:48:47 compute-1 nova_compute[192795]: </domain>
Sep 30 21:48:47 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.410 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Preparing to wait for external event network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.411 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.413 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.413 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.414 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Preparing to wait for external event network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.414 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.415 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.415 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.417 2 DEBUG nova.virt.libvirt.vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-414695147',display_name='tempest-TestGettingAddress-server-414695147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-414695147',id=163,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-yr4z6l47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:38Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=5303fce8-c159-4964-8a55-bc25f0e493e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.418 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.419 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:23:78,bridge_name='br-int',has_traffic_filtering=True,id=77fac564-d2ac-47f9-b08d-18f0ed166918,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fac564-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.420 2 DEBUG os_vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:23:78,bridge_name='br-int',has_traffic_filtering=True,id=77fac564-d2ac-47f9-b08d-18f0ed166918,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fac564-d2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.427 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77fac564-d2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.427 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77fac564-d2, col_values=(('external_ids', {'iface-id': '77fac564-d2ac-47f9-b08d-18f0ed166918', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:23:78', 'vm-uuid': '5303fce8-c159-4964-8a55-bc25f0e493e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 NetworkManager[51724]: <info>  [1759268927.4306] manager: (tap77fac564-d2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.438 2 INFO os_vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:23:78,bridge_name='br-int',has_traffic_filtering=True,id=77fac564-d2ac-47f9-b08d-18f0ed166918,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fac564-d2')
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.439 2 DEBUG nova.virt.libvirt.vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-414695147',display_name='tempest-TestGettingAddress-server-414695147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-414695147',id=163,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-yr4z6l47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:38Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=5303fce8-c159-4964-8a55-bc25f0e493e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.439 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.440 2 DEBUG nova.network.os_vif_util [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:bf:1e,bridge_name='br-int',has_traffic_filtering=True,id=efc7a6d8-d415-4f18-b112-76cadddd0255,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc7a6d8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.440 2 DEBUG os_vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:bf:1e,bridge_name='br-int',has_traffic_filtering=True,id=efc7a6d8-d415-4f18-b112-76cadddd0255,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc7a6d8-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.441 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.442 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.445 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefc7a6d8-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.445 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapefc7a6d8-d4, col_values=(('external_ids', {'iface-id': 'efc7a6d8-d415-4f18-b112-76cadddd0255', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:bf:1e', 'vm-uuid': '5303fce8-c159-4964-8a55-bc25f0e493e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 NetworkManager[51724]: <info>  [1759268927.4478] manager: (tapefc7a6d8-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.455 2 INFO os_vif [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:bf:1e,bridge_name='br-int',has_traffic_filtering=True,id=efc7a6d8-d415-4f18-b112-76cadddd0255,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc7a6d8-d4')
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.517 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.518 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.518 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:c0:23:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.518 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:de:bf:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:48:47 compute-1 nova_compute[192795]: 2025-09-30 21:48:47.519 2 INFO nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Using config drive
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.029 2 INFO nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Creating config drive at /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk.config
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.034 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt8gv9f2v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.160 2 DEBUG oslo_concurrency.processutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt8gv9f2v" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.2373] manager: (tap77fac564-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/323)
Sep 30 21:48:48 compute-1 kernel: tap77fac564-d2: entered promiscuous mode
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00645|binding|INFO|Claiming lport 77fac564-d2ac-47f9-b08d-18f0ed166918 for this chassis.
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00646|binding|INFO|77fac564-d2ac-47f9-b08d-18f0ed166918: Claiming fa:16:3e:c0:23:78 10.100.0.10
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.2555] manager: (tapefc7a6d8-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.262 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:23:78 10.100.0.10'], port_security=['fa:16:3e:c0:23:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5303fce8-c159-4964-8a55-bc25f0e493e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfb1184d-a559-4543-ae06-e2b48bcfb2c3, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=77fac564-d2ac-47f9-b08d-18f0ed166918) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.264 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 77fac564-d2ac-47f9-b08d-18f0ed166918 in datapath 40cfa99d-fae5-4f7e-b4bc-e90e389ced61 bound to our chassis
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.266 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 40cfa99d-fae5-4f7e-b4bc-e90e389ced61
Sep 30 21:48:48 compute-1 systemd-udevd[247661]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:48:48 compute-1 systemd-udevd[247662]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.281 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7963d9ad-2113-44e6-a103-72324cdb1eb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.282 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap40cfa99d-f1 in ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.285 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap40cfa99d-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.285 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[668514df-6f37-4c06-b9b3-bdbcd2b3aeb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.286 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6d96667c-9cab-404a-a253-d600d748c384]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.2971] device (tap77fac564-d2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.2986] device (tap77fac564-d2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.304 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[8f74c184-5c81-4529-8da3-38ae5efb3180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 systemd-machined[152783]: New machine qemu-75-instance-000000a3.
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.338 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[edd852e0-acdb-4225-8ba4-9313a8546040]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 kernel: tapefc7a6d8-d4: entered promiscuous mode
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.3551] device (tapefc7a6d8-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:48:48 compute-1 systemd[1]: Started Virtual Machine qemu-75-instance-000000a3.
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.3566] device (tapefc7a6d8-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00647|binding|INFO|Claiming lport efc7a6d8-d415-4f18-b112-76cadddd0255 for this chassis.
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00648|binding|INFO|efc7a6d8-d415-4f18-b112-76cadddd0255: Claiming fa:16:3e:de:bf:1e 2001:db8:0:1:f816:3eff:fede:bf1e 2001:db8::f816:3eff:fede:bf1e
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.365 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:bf:1e 2001:db8:0:1:f816:3eff:fede:bf1e 2001:db8::f816:3eff:fede:bf1e'], port_security=['fa:16:3e:de:bf:1e 2001:db8:0:1:f816:3eff:fede:bf1e 2001:db8::f816:3eff:fede:bf1e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fede:bf1e/64 2001:db8::f816:3eff:fede:bf1e/64', 'neutron:device_id': '5303fce8-c159-4964-8a55-bc25f0e493e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d03d334-4f0d-47df-a94a-1c647c7026cb, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=efc7a6d8-d415-4f18-b112-76cadddd0255) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00649|binding|INFO|Setting lport 77fac564-d2ac-47f9-b08d-18f0ed166918 ovn-installed in OVS
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00650|binding|INFO|Setting lport 77fac564-d2ac-47f9-b08d-18f0ed166918 up in Southbound
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.371 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[606d4fd8-51df-42e8-b045-bb60e239a759]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.380 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2389861f-b5e7-4d8f-a89b-89ac3a146e79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00651|binding|INFO|Setting lport efc7a6d8-d415-4f18-b112-76cadddd0255 ovn-installed in OVS
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00652|binding|INFO|Setting lport efc7a6d8-d415-4f18-b112-76cadddd0255 up in Southbound
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.3824] manager: (tap40cfa99d-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/325)
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.410 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2e66196a-4499-49ca-9f57-820a43eba772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.413 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[172c35ef-377f-4307-b716-52465a414e5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.4354] device (tap40cfa99d-f0): carrier: link connected
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.444 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[fc70501f-64c2-4778-9601-48d99aa6b583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.466 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[36741a4e-2201-4352-a492-72d86ce3e573]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40cfa99d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:29:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562410, 'reachable_time': 31137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247697, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.484 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d865350e-4fde-4363-be8d-4ad575ae78db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe50:29b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 562410, 'tstamp': 562410}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247698, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.502 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0974d8-3a78-45f6-80d5-53bda69605cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40cfa99d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:29:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562410, 'reachable_time': 31137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247699, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.553 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[826cf47b-fd86-40e2-9525-ed5b2eac6436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.623 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3befe1e0-c4a9-4dea-8679-4f63a5830d2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.625 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40cfa99d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.625 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.626 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40cfa99d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 NetworkManager[51724]: <info>  [1759268928.6290] manager: (tap40cfa99d-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Sep 30 21:48:48 compute-1 kernel: tap40cfa99d-f0: entered promiscuous mode
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.632 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap40cfa99d-f0, col_values=(('external_ids', {'iface-id': 'db554134-d733-46a3-ad79-d127cd6e8575'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 ovn_controller[94902]: 2025-09-30T21:48:48Z|00653|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.648 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.649 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff1628e-59a1-4d86-b7f1-868d74b8388b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.651 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-40cfa99d-fae5-4f7e-b4bc-e90e389ced61
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.pid.haproxy
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 40cfa99d-fae5-4f7e-b4bc-e90e389ced61
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:48:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:48.652 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'env', 'PROCESS_TAG=haproxy-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:48:48 compute-1 nova_compute[192795]: 2025-09-30 21:48:48.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.047 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:49 compute-1 podman[247739]: 2025-09-30 21:48:49.057262152 +0000 UTC m=+0.051564645 container create cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:48:49 compute-1 systemd[1]: Started libpod-conmon-cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1.scope.
Sep 30 21:48:49 compute-1 podman[247739]: 2025-09-30 21:48:49.030936741 +0000 UTC m=+0.025239244 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:48:49 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:48:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98f138f4772a76b8d072e6764cc6ab66d60930a9a56510141c4d82c2b7cebe0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:48:49 compute-1 podman[247739]: 2025-09-30 21:48:49.169905957 +0000 UTC m=+0.164208440 container init cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:48:49 compute-1 podman[247739]: 2025-09-30 21:48:49.179150274 +0000 UTC m=+0.173452767 container start cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:48:49 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [NOTICE]   (247758) : New worker (247760) forked
Sep 30 21:48:49 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [NOTICE]   (247758) : Loading success.
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.252 103861 INFO neutron.agent.ovn.metadata.agent [-] Port efc7a6d8-d415-4f18-b112-76cadddd0255 in datapath d1ec18dd-20d4-4643-8e73-7d404d8b8493 unbound from our chassis
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.256 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1ec18dd-20d4-4643-8e73-7d404d8b8493
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.271 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[582d181e-41a0-4383-a7f3-217cc68e1a2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.273 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1ec18dd-21 in ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.276 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1ec18dd-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.276 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e4743f-8dee-48eb-9e9b-aff0c6c35e23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.277 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0108eb-adb7-41c2-a533-fda81777454b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.293 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[37449571-b849-4fd9-9328-c9f89b1f8331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.307 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb8d538-c5a5-4c66-84f9-4170d01b8cd0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.315 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268929.3146489, 5303fce8-c159-4964-8a55-bc25f0e493e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.316 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] VM Started (Lifecycle Event)
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.341 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[529b4431-859b-4e88-b7e0-6a9e3bfdd160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.348 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:49 compute-1 NetworkManager[51724]: <info>  [1759268929.3503] manager: (tapd1ec18dd-20): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Sep 30 21:48:49 compute-1 systemd-udevd[247681]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.350 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c20ef087-ba29-4b50-bc8b-352dfdb62523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.353 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268929.3157954, 5303fce8-c159-4964-8a55-bc25f0e493e1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.354 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] VM Paused (Lifecycle Event)
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.376 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.381 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.393 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[16c4d0bd-0e52-409c-acfd-50bb992d0faa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.396 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0b92bb-fa1b-41ad-864b-31f2bc79a2e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.402 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:48:49 compute-1 NetworkManager[51724]: <info>  [1759268929.4300] device (tapd1ec18dd-20): carrier: link connected
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.435 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b771d0-ab17-4a1e-a5ea-eb2f50ecdbfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.461 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[99b66c37-2987-4adb-bb0e-f5f2a609b41d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1ec18dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:a9:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562510, 'reachable_time': 30837, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247779, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.487 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c93bab1c-76e7-4d83-a77c-9709f17abde9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:a951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 562510, 'tstamp': 562510}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247780, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.516 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a489f9d4-4133-4545-b9b4-31249e59bc33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1ec18dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:a9:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562510, 'reachable_time': 30837, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247781, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.561 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9791b267-b280-44d3-b5c4-3cbd403209fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.604 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb03c3e-2cbe-4ff3-81a4-07b960523f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.606 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1ec18dd-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.606 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.606 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1ec18dd-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:49 compute-1 NetworkManager[51724]: <info>  [1759268929.6099] manager: (tapd1ec18dd-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:49 compute-1 kernel: tapd1ec18dd-20: entered promiscuous mode
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.615 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1ec18dd-20, col_values=(('external_ids', {'iface-id': '4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:49 compute-1 ovn_controller[94902]: 2025-09-30T21:48:49Z|00654|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.619 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1ec18dd-20d4-4643-8e73-7d404d8b8493.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1ec18dd-20d4-4643-8e73-7d404d8b8493.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.620 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b1b9e8-92b8-409b-a9ff-f9fbfd1f4777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.621 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-d1ec18dd-20d4-4643-8e73-7d404d8b8493
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/d1ec18dd-20d4-4643-8e73-7d404d8b8493.pid.haproxy
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID d1ec18dd-20d4-4643-8e73-7d404d8b8493
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:48:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:49.622 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'env', 'PROCESS_TAG=haproxy-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1ec18dd-20d4-4643-8e73-7d404d8b8493.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.959 2 DEBUG nova.network.neutron [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updated VIF entry in instance network info cache for port efc7a6d8-d415-4f18-b112-76cadddd0255. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.960 2 DEBUG nova.network.neutron [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": 
{"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:49 compute-1 podman[247811]: 2025-09-30 21:48:49.965318774 +0000 UTC m=+0.046087920 container create 1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:48:49 compute-1 nova_compute[192795]: 2025-09-30 21:48:49.990 2 DEBUG oslo_concurrency.lockutils [req-d70dca04-d0b2-487d-8c96-3d198d0b0620 req-739a01be-6deb-4764-8474-1c28cab2d33c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:50 compute-1 systemd[1]: Started libpod-conmon-1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e.scope.
Sep 30 21:48:50 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:48:50 compute-1 podman[247811]: 2025-09-30 21:48:49.942587492 +0000 UTC m=+0.023356648 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:48:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b06eb3879d7d2a6fc107729da638fafea387e1f1624d516f0b50552a0d2b1b5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:48:50 compute-1 podman[247811]: 2025-09-30 21:48:50.054780375 +0000 UTC m=+0.135549551 container init 1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:48:50 compute-1 podman[247811]: 2025-09-30 21:48:50.062483145 +0000 UTC m=+0.143252291 container start 1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:48:50 compute-1 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[247826]: [NOTICE]   (247830) : New worker (247832) forked
Sep 30 21:48:50 compute-1 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[247826]: [NOTICE]   (247830) : Loading success.
Sep 30 21:48:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:50.149 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.373 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.374 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.375 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.375 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.376 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Processing event network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.376 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.377 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.377 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.378 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.378 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] No event matching network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 in dict_keys([('network-vif-plugged', 'efc7a6d8-d415-4f18-b112-76cadddd0255')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.378 2 WARNING nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received unexpected event network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 for instance with vm_state building and task_state spawning.
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.379 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.379 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.380 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.380 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.381 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Processing event network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.381 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.382 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.382 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.382 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.383 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] No waiting events found dispatching network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.383 2 WARNING nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received unexpected event network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 for instance with vm_state building and task_state spawning.
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.384 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.389 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268930.3891895, 5303fce8-c159-4964-8a55-bc25f0e493e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.390 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] VM Resumed (Lifecycle Event)
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.392 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.396 2 INFO nova.virt.libvirt.driver [-] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Instance spawned successfully.
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.397 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.441 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.446 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.500 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.500 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.501 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.501 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.501 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.502 2 DEBUG nova.virt.libvirt.driver [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.586 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.646 2 INFO nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Took 11.66 seconds to spawn the instance on the hypervisor.
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.647 2 DEBUG nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.769 2 INFO nova.compute.manager [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Took 12.29 seconds to build instance.
Sep 30 21:48:50 compute-1 nova_compute[192795]: 2025-09-30 21:48:50.795 2 DEBUG oslo_concurrency.lockutils [None req-eebb0acc-a02a-4be6-a962-a6e819394683 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:52 compute-1 nova_compute[192795]: 2025-09-30 21:48:52.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:53 compute-1 podman[247841]: 2025-09-30 21:48:53.215368575 +0000 UTC m=+0.058305032 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, release=1755695350, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:48:53 compute-1 podman[247843]: 2025-09-30 21:48:53.215445077 +0000 UTC m=+0.051291579 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:48:53 compute-1 podman[247842]: 2025-09-30 21:48:53.220930223 +0000 UTC m=+0.062294121 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:48:53 compute-1 NetworkManager[51724]: <info>  [1759268933.5490] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Sep 30 21:48:53 compute-1 NetworkManager[51724]: <info>  [1759268933.5509] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.704 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.704 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.720 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:48:53 compute-1 ovn_controller[94902]: 2025-09-30T21:48:53Z|00655|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:48:53 compute-1 ovn_controller[94902]: 2025-09-30T21:48:53Z|00656|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.813 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.814 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.821 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.822 2 INFO nova.compute.claims [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.959 2 DEBUG nova.compute.provider_tree [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.972 2 DEBUG nova.scheduler.client.report [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.993 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:53 compute-1 nova_compute[192795]: 2025-09-30 21:48:53.994 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.051 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.052 2 DEBUG nova.network.neutron [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.068 2 INFO nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.087 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.212 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.213 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.214 2 INFO nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Creating image(s)
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.214 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "/var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.214 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.215 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "/var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.226 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.295 2 DEBUG nova.policy [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27859618cb1d493cb1531af26b200b92', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '043721d1d0a2480fa785367fa56c1fa4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.306 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.307 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.307 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.325 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.406 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.407 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.446 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.448 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.448 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.497 2 DEBUG nova.compute.manager [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-changed-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.497 2 DEBUG nova.compute.manager [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing instance network info cache due to event network-changed-77fac564-d2ac-47f9-b08d-18f0ed166918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.497 2 DEBUG oslo_concurrency.lockutils [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.498 2 DEBUG oslo_concurrency.lockutils [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.498 2 DEBUG nova.network.neutron [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing network info cache for port 77fac564-d2ac-47f9-b08d-18f0ed166918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.505 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.505 2 DEBUG nova.virt.disk.api [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Checking if we can resize image /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.505 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.569 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.570 2 DEBUG nova.virt.disk.api [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Cannot resize image /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.570 2 DEBUG nova.objects.instance [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'migration_context' on Instance uuid 33176d38-ded7-4585-ab86-3b25756b50a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.588 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.589 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Ensure instance console log exists: /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.589 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.590 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:54 compute-1 nova_compute[192795]: 2025-09-30 21:48:54.590 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:57 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:48:57.152 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:57 compute-1 nova_compute[192795]: 2025-09-30 21:48:57.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:58 compute-1 nova_compute[192795]: 2025-09-30 21:48:58.241 2 DEBUG nova.network.neutron [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Successfully created port: 3522906d-96c0-460c-8864-7d0da2b92fc0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:48:58 compute-1 nova_compute[192795]: 2025-09-30 21:48:58.499 2 DEBUG nova.network.neutron [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updated VIF entry in instance network info cache for port 77fac564-d2ac-47f9-b08d-18f0ed166918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:48:58 compute-1 nova_compute[192795]: 2025-09-30 21:48:58.500 2 DEBUG nova.network.neutron [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:58 compute-1 nova_compute[192795]: 2025-09-30 21:48:58.527 2 DEBUG oslo_concurrency.lockutils [req-bfc84498-d610-4117-8467-4fb99debdd74 req-f3027469-1e31-4077-abdd-2db86a0124b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:58 compute-1 nova_compute[192795]: 2025-09-30 21:48:58.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:59 compute-1 nova_compute[192795]: 2025-09-30 21:48:59.713 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.738 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.738 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.738 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.739 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.819 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.879 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.881 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:00 compute-1 nova_compute[192795]: 2025-09-30 21:49:00.939 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.136 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.137 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5489MB free_disk=73.29999160766602GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.138 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.138 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.224 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 5303fce8-c159-4964-8a55-bc25f0e493e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.225 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 33176d38-ded7-4585-ab86-3b25756b50a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.225 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.225 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.279 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.297 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.336 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.337 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.778 2 DEBUG nova.network.neutron [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Successfully updated port: 3522906d-96c0-460c-8864-7d0da2b92fc0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.802 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.802 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquired lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.803 2 DEBUG nova.network.neutron [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.922 2 DEBUG nova.compute.manager [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-changed-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.922 2 DEBUG nova.compute.manager [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Refreshing instance network info cache due to event network-changed-3522906d-96c0-460c-8864-7d0da2b92fc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.923 2 DEBUG oslo_concurrency.lockutils [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:01 compute-1 nova_compute[192795]: 2025-09-30 21:49:01.966 2 DEBUG nova.network.neutron [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.338 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.724 2 DEBUG nova.network.neutron [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updating instance_info_cache with network_info: [{"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.755 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Releasing lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.755 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Instance network_info: |[{"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.756 2 DEBUG oslo_concurrency.lockutils [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.756 2 DEBUG nova.network.neutron [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Refreshing network info cache for port 3522906d-96c0-460c-8864-7d0da2b92fc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.760 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Start _get_guest_xml network_info=[{"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.765 2 WARNING nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.771 2 DEBUG nova.virt.libvirt.host [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.773 2 DEBUG nova.virt.libvirt.host [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.783 2 DEBUG nova.virt.libvirt.host [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.785 2 DEBUG nova.virt.libvirt.host [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.786 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.787 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.788 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.788 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.789 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.789 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.789 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.790 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.790 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.791 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.791 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.792 2 DEBUG nova.virt.hardware [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.799 2 DEBUG nova.virt.libvirt.vif [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1320041238',display_name='tempest-TestNetworkBasicOps-server-1320041238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1320041238',id=165,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4Quez6rH07hO11bnVBHapHQpuODPsUfvGojX99zjz36B9pPYT8MUw35fqsoQVp9rSz0sysMAXACiS9gfckq0iSiY6wy4fmUZVf3KExWVMbmFIshemcIOYsG9NrTVI27Q==',key_name='tempest-TestNetworkBasicOps-85814944',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-zvwd3t8i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:54Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=33176d38-ded7-4585-ab86-3b25756b50a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.800 2 DEBUG nova.network.os_vif_util [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.801 2 DEBUG nova.network.os_vif_util [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:e9:98,bridge_name='br-int',has_traffic_filtering=True,id=3522906d-96c0-460c-8864-7d0da2b92fc0,network=Network(8de3daa0-c8e4-4471-8a0c-c05c67411ae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3522906d-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.803 2 DEBUG nova.objects.instance [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 33176d38-ded7-4585-ab86-3b25756b50a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.822 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <uuid>33176d38-ded7-4585-ab86-3b25756b50a8</uuid>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <name>instance-000000a5</name>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkBasicOps-server-1320041238</nova:name>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:49:02</nova:creationTime>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:user uuid="27859618cb1d493cb1531af26b200b92">tempest-TestNetworkBasicOps-2126023928-project-member</nova:user>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:project uuid="043721d1d0a2480fa785367fa56c1fa4">tempest-TestNetworkBasicOps-2126023928</nova:project>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         <nova:port uuid="3522906d-96c0-460c-8864-7d0da2b92fc0">
Sep 30 21:49:02 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <system>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <entry name="serial">33176d38-ded7-4585-ab86-3b25756b50a8</entry>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <entry name="uuid">33176d38-ded7-4585-ab86-3b25756b50a8</entry>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </system>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <os>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   </os>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <features>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   </features>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk.config"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:83:e9:98"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <target dev="tap3522906d-96"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/console.log" append="off"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <video>
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </video>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:49:02 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:49:02 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:49:02 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:49:02 compute-1 nova_compute[192795]: </domain>
Sep 30 21:49:02 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.824 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Preparing to wait for external event network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.824 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.824 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.825 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.825 2 DEBUG nova.virt.libvirt.vif [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1320041238',display_name='tempest-TestNetworkBasicOps-server-1320041238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1320041238',id=165,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4Quez6rH07hO11bnVBHapHQpuODPsUfvGojX99zjz36B9pPYT8MUw35fqsoQVp9rSz0sysMAXACiS9gfckq0iSiY6wy4fmUZVf3KExWVMbmFIshemcIOYsG9NrTVI27Q==',key_name='tempest-TestNetworkBasicOps-85814944',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-zvwd3t8i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:54Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=33176d38-ded7-4585-ab86-3b25756b50a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.826 2 DEBUG nova.network.os_vif_util [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.826 2 DEBUG nova.network.os_vif_util [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:e9:98,bridge_name='br-int',has_traffic_filtering=True,id=3522906d-96c0-460c-8864-7d0da2b92fc0,network=Network(8de3daa0-c8e4-4471-8a0c-c05c67411ae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3522906d-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.827 2 DEBUG os_vif [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:e9:98,bridge_name='br-int',has_traffic_filtering=True,id=3522906d-96c0-460c-8864-7d0da2b92fc0,network=Network(8de3daa0-c8e4-4471-8a0c-c05c67411ae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3522906d-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.838 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3522906d-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.839 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3522906d-96, col_values=(('external_ids', {'iface-id': '3522906d-96c0-460c-8864-7d0da2b92fc0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:e9:98', 'vm-uuid': '33176d38-ded7-4585-ab86-3b25756b50a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:02 compute-1 NetworkManager[51724]: <info>  [1759268942.8421] manager: (tap3522906d-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.854 2 INFO os_vif [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:e9:98,bridge_name='br-int',has_traffic_filtering=True,id=3522906d-96c0-460c-8864-7d0da2b92fc0,network=Network(8de3daa0-c8e4-4471-8a0c-c05c67411ae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3522906d-96')
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.910 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.910 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.910 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] No VIF found with MAC fa:16:3e:83:e9:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:49:02 compute-1 nova_compute[192795]: 2025-09-30 21:49:02.911 2 INFO nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Using config drive
Sep 30 21:49:03 compute-1 podman[247942]: 2025-09-30 21:49:03.254973676 +0000 UTC m=+0.092333283 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid)
Sep 30 21:49:03 compute-1 nova_compute[192795]: 2025-09-30 21:49:03.834 2 INFO nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Creating config drive at /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk.config
Sep 30 21:49:03 compute-1 nova_compute[192795]: 2025-09-30 21:49:03.844 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp38muncf1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:03 compute-1 nova_compute[192795]: 2025-09-30 21:49:03.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:03 compute-1 nova_compute[192795]: 2025-09-30 21:49:03.993 2 DEBUG oslo_concurrency.processutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp38muncf1" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:04 compute-1 kernel: tap3522906d-96: entered promiscuous mode
Sep 30 21:49:04 compute-1 NetworkManager[51724]: <info>  [1759268944.0923] manager: (tap3522906d-96): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:04 compute-1 ovn_controller[94902]: 2025-09-30T21:49:04Z|00657|binding|INFO|Claiming lport 3522906d-96c0-460c-8864-7d0da2b92fc0 for this chassis.
Sep 30 21:49:04 compute-1 ovn_controller[94902]: 2025-09-30T21:49:04Z|00658|binding|INFO|3522906d-96c0-460c-8864-7d0da2b92fc0: Claiming fa:16:3e:83:e9:98 10.100.0.11
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.107 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:e9:98 10.100.0.11'], port_security=['fa:16:3e:83:e9:98 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '33176d38-ded7-4585-ab86-3b25756b50a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd5d81c6a-117f-4eeb-9b9a-cadc270ff078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58bbb783-b6ec-4366-bc6e-a18a122f97d1, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=3522906d-96c0-460c-8864-7d0da2b92fc0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.110 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 3522906d-96c0-460c-8864-7d0da2b92fc0 in datapath 8de3daa0-c8e4-4471-8a0c-c05c67411ae1 bound to our chassis
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.113 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8de3daa0-c8e4-4471-8a0c-c05c67411ae1
Sep 30 21:49:04 compute-1 ovn_controller[94902]: 2025-09-30T21:49:04Z|00659|binding|INFO|Setting lport 3522906d-96c0-460c-8864-7d0da2b92fc0 ovn-installed in OVS
Sep 30 21:49:04 compute-1 ovn_controller[94902]: 2025-09-30T21:49:04Z|00660|binding|INFO|Setting lport 3522906d-96c0-460c-8864-7d0da2b92fc0 up in Southbound
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.137 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ef1f39e7-1bec-4ee5-aa04-13a3ecec731e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.138 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8de3daa0-c1 in ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.141 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8de3daa0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.141 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8b235377-b39b-4c7f-ac0c-0bc029384872]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.145 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d0231008-efec-4ed2-a96f-5d8a7897b636]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 systemd-udevd[247981]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:49:04 compute-1 systemd-machined[152783]: New machine qemu-76-instance-000000a5.
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.164 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[81f17eb0-2a28-4e7d-b4b6-f72226575195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 NetworkManager[51724]: <info>  [1759268944.1710] device (tap3522906d-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:49:04 compute-1 NetworkManager[51724]: <info>  [1759268944.1721] device (tap3522906d-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:49:04 compute-1 systemd[1]: Started Virtual Machine qemu-76-instance-000000a5.
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.196 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d3094a10-9c47-4a9d-bbbd-614ba1f1012f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.249 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f79773-b95b-40f8-8dde-d3bca12d602a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 NetworkManager[51724]: <info>  [1759268944.2672] manager: (tap8de3daa0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Sep 30 21:49:04 compute-1 systemd-udevd[247986]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.268 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7e5dd1-1dc8-48e6-ac4d-b83818fcfccd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.320 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[23d272f2-0335-4f78-97bd-3476b7b6fe33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.324 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[de9000e6-6177-462e-9ba8-4245504c6b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_controller[94902]: 2025-09-30T21:49:04Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:23:78 10.100.0.10
Sep 30 21:49:04 compute-1 ovn_controller[94902]: 2025-09-30T21:49:04Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:23:78 10.100.0.10
Sep 30 21:49:04 compute-1 NetworkManager[51724]: <info>  [1759268944.3645] device (tap8de3daa0-c0): carrier: link connected
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.373 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2abb0c92-dc21-4066-b348-f9520a2aa1b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.393 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[34e65eb8-912f-4f69-86e5-93656d25a483]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8de3daa0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:57:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564003, 'reachable_time': 33223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248015, 'error': None, 'target': 'ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.417 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[29901262-76fb-43ba-a340-eb3c12af30a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe49:570e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564003, 'tstamp': 564003}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248016, 'error': None, 'target': 'ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.438 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5618c1-cc8c-4fa9-a2fa-26c374682614]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8de3daa0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:57:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564003, 'reachable_time': 33223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248017, 'error': None, 'target': 'ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.489 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[06c19240-6ae5-4b2e-be02-2e85dd51b860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.584 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1a54568f-7a20-4fd2-967b-c6497037c7b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.585 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8de3daa0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.586 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.586 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8de3daa0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:04 compute-1 kernel: tap8de3daa0-c0: entered promiscuous mode
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.591 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8de3daa0-c0, col_values=(('external_ids', {'iface-id': '6314482d-722f-4e50-9866-2fdbc5f34ac0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:04 compute-1 NetworkManager[51724]: <info>  [1759268944.5919] manager: (tap8de3daa0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Sep 30 21:49:04 compute-1 ovn_controller[94902]: 2025-09-30T21:49:04Z|00661|binding|INFO|Releasing lport 6314482d-722f-4e50-9866-2fdbc5f34ac0 from this chassis (sb_readonly=0)
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.606 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8de3daa0-c8e4-4471-8a0c-c05c67411ae1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8de3daa0-c8e4-4471-8a0c-c05c67411ae1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.607 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[88e5e0fc-cade-461e-bd84-0e1f87b70cb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.608 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-8de3daa0-c8e4-4471-8a0c-c05c67411ae1
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/8de3daa0-c8e4-4471-8a0c-c05c67411ae1.pid.haproxy
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 8de3daa0-c8e4-4471-8a0c-c05c67411ae1
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:49:04 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:04.608 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'env', 'PROCESS_TAG=haproxy-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8de3daa0-c8e4-4471-8a0c-c05c67411ae1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.845 2 DEBUG nova.compute.manager [req-8bc3c4fb-4dc4-4b9a-96e0-26386cb6c62f req-b65099e3-14e7-485f-b8e8-fe46d528d3a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.846 2 DEBUG oslo_concurrency.lockutils [req-8bc3c4fb-4dc4-4b9a-96e0-26386cb6c62f req-b65099e3-14e7-485f-b8e8-fe46d528d3a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.846 2 DEBUG oslo_concurrency.lockutils [req-8bc3c4fb-4dc4-4b9a-96e0-26386cb6c62f req-b65099e3-14e7-485f-b8e8-fe46d528d3a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.846 2 DEBUG oslo_concurrency.lockutils [req-8bc3c4fb-4dc4-4b9a-96e0-26386cb6c62f req-b65099e3-14e7-485f-b8e8-fe46d528d3a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:04 compute-1 nova_compute[192795]: 2025-09-30 21:49:04.847 2 DEBUG nova.compute.manager [req-8bc3c4fb-4dc4-4b9a-96e0-26386cb6c62f req-b65099e3-14e7-485f-b8e8-fe46d528d3a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Processing event network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:49:05 compute-1 podman[248049]: 2025-09-30 21:49:05.071393402 +0000 UTC m=+0.059361384 container create 13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:49:05 compute-1 systemd[1]: Started libpod-conmon-13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee.scope.
Sep 30 21:49:05 compute-1 podman[248049]: 2025-09-30 21:49:05.040963011 +0000 UTC m=+0.028931013 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:49:05 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:49:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c19e87ce3721f2bc3473e4ef99fbd8eb027c411b4b82a2c865966546ad7d60de/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:49:05 compute-1 podman[248049]: 2025-09-30 21:49:05.194623209 +0000 UTC m=+0.182591221 container init 13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:49:05 compute-1 podman[248049]: 2025-09-30 21:49:05.201570597 +0000 UTC m=+0.189538579 container start 13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 21:49:05 compute-1 neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1[248070]: [NOTICE]   (248074) : New worker (248076) forked
Sep 30 21:49:05 compute-1 neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1[248070]: [NOTICE]   (248074) : Loading success.
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.552 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.554 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268945.552699, 33176d38-ded7-4585-ab86-3b25756b50a8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.554 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] VM Started (Lifecycle Event)
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.558 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.563 2 INFO nova.virt.libvirt.driver [-] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Instance spawned successfully.
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.563 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.583 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.589 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.601 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.601 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.602 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.603 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.604 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.605 2 DEBUG nova.virt.libvirt.driver [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.626 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.627 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268945.5536778, 33176d38-ded7-4585-ab86-3b25756b50a8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.628 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] VM Paused (Lifecycle Event)
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.670 2 DEBUG nova.network.neutron [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updated VIF entry in instance network info cache for port 3522906d-96c0-460c-8864-7d0da2b92fc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.671 2 DEBUG nova.network.neutron [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updating instance_info_cache with network_info: [{"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.677 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.682 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759268945.5579715, 33176d38-ded7-4585-ab86-3b25756b50a8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.683 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] VM Resumed (Lifecycle Event)
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.720 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.721 2 DEBUG oslo_concurrency.lockutils [req-2abaa82c-2b9c-49f7-afbe-505eb01c6de2 req-edff6367-10da-4750-b925-2bbd68acfc4a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.725 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.730 2 INFO nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Took 11.52 seconds to spawn the instance on the hypervisor.
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.730 2 DEBUG nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.764 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.834 2 INFO nova.compute.manager [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Took 12.06 seconds to build instance.
Sep 30 21:49:05 compute-1 nova_compute[192795]: 2025-09-30 21:49:05.851 2 DEBUG oslo_concurrency.lockutils [None req-2a684527-df31-4de3-a332-af4facd6e999 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:06 compute-1 nova_compute[192795]: 2025-09-30 21:49:06.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:06 compute-1 nova_compute[192795]: 2025-09-30 21:49:06.935 2 DEBUG nova.compute.manager [req-ce0f4882-0b0a-4e1f-a463-e053ca8fc957 req-11b3e56f-b2f9-4c6d-9e46-5be61dcdd733 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:06 compute-1 nova_compute[192795]: 2025-09-30 21:49:06.935 2 DEBUG oslo_concurrency.lockutils [req-ce0f4882-0b0a-4e1f-a463-e053ca8fc957 req-11b3e56f-b2f9-4c6d-9e46-5be61dcdd733 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:06 compute-1 nova_compute[192795]: 2025-09-30 21:49:06.936 2 DEBUG oslo_concurrency.lockutils [req-ce0f4882-0b0a-4e1f-a463-e053ca8fc957 req-11b3e56f-b2f9-4c6d-9e46-5be61dcdd733 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:06 compute-1 nova_compute[192795]: 2025-09-30 21:49:06.936 2 DEBUG oslo_concurrency.lockutils [req-ce0f4882-0b0a-4e1f-a463-e053ca8fc957 req-11b3e56f-b2f9-4c6d-9e46-5be61dcdd733 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:06 compute-1 nova_compute[192795]: 2025-09-30 21:49:06.936 2 DEBUG nova.compute.manager [req-ce0f4882-0b0a-4e1f-a463-e053ca8fc957 req-11b3e56f-b2f9-4c6d-9e46-5be61dcdd733 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] No waiting events found dispatching network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:06 compute-1 nova_compute[192795]: 2025-09-30 21:49:06.936 2 WARNING nova.compute.manager [req-ce0f4882-0b0a-4e1f-a463-e053ca8fc957 req-11b3e56f-b2f9-4c6d-9e46-5be61dcdd733 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received unexpected event network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 for instance with vm_state active and task_state None.
Sep 30 21:49:07 compute-1 nova_compute[192795]: 2025-09-30 21:49:07.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:07 compute-1 nova_compute[192795]: 2025-09-30 21:49:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:07 compute-1 nova_compute[192795]: 2025-09-30 21:49:07.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:08 compute-1 nova_compute[192795]: 2025-09-30 21:49:08.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:10 compute-1 unix_chkpwd[248087]: password check failed for user (root)
Sep 30 21:49:10 compute-1 sshd-session[248085]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239  user=root
Sep 30 21:49:10 compute-1 nova_compute[192795]: 2025-09-30 21:49:10.476 2 DEBUG nova.compute.manager [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-changed-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:10 compute-1 nova_compute[192795]: 2025-09-30 21:49:10.476 2 DEBUG nova.compute.manager [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Refreshing instance network info cache due to event network-changed-3522906d-96c0-460c-8864-7d0da2b92fc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:10 compute-1 nova_compute[192795]: 2025-09-30 21:49:10.477 2 DEBUG oslo_concurrency.lockutils [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:10 compute-1 nova_compute[192795]: 2025-09-30 21:49:10.477 2 DEBUG oslo_concurrency.lockutils [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:10 compute-1 nova_compute[192795]: 2025-09-30 21:49:10.477 2 DEBUG nova.network.neutron [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Refreshing network info cache for port 3522906d-96c0-460c-8864-7d0da2b92fc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:11 compute-1 sshd-session[248085]: Failed password for root from 167.71.248.239 port 35318 ssh2
Sep 30 21:49:12 compute-1 podman[248088]: 2025-09-30 21:49:12.234398363 +0000 UTC m=+0.064752630 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:49:12 compute-1 podman[248090]: 2025-09-30 21:49:12.249457119 +0000 UTC m=+0.062748004 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:49:12 compute-1 podman[248089]: 2025-09-30 21:49:12.306439747 +0000 UTC m=+0.129658901 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 21:49:12 compute-1 nova_compute[192795]: 2025-09-30 21:49:12.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:13 compute-1 sshd-session[248085]: Connection closed by authenticating user root 167.71.248.239 port 35318 [preauth]
Sep 30 21:49:13 compute-1 nova_compute[192795]: 2025-09-30 21:49:13.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:13 compute-1 nova_compute[192795]: 2025-09-30 21:49:13.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:14 compute-1 nova_compute[192795]: 2025-09-30 21:49:14.702 2 DEBUG nova.network.neutron [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updated VIF entry in instance network info cache for port 3522906d-96c0-460c-8864-7d0da2b92fc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:14 compute-1 nova_compute[192795]: 2025-09-30 21:49:14.703 2 DEBUG nova.network.neutron [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updating instance_info_cache with network_info: [{"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:14 compute-1 nova_compute[192795]: 2025-09-30 21:49:14.726 2 DEBUG oslo_concurrency.lockutils [req-6cc79e4a-6eef-4e9d-b4d2-f973fd65c9c4 req-8c846722-ed97-4fbb-a9c8-2d7e01e19d26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:16 compute-1 nova_compute[192795]: 2025-09-30 21:49:16.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:16 compute-1 nova_compute[192795]: 2025-09-30 21:49:16.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:49:16 compute-1 nova_compute[192795]: 2025-09-30 21:49:16.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:49:17 compute-1 nova_compute[192795]: 2025-09-30 21:49:17.706 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:17 compute-1 nova_compute[192795]: 2025-09-30 21:49:17.707 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:17 compute-1 nova_compute[192795]: 2025-09-30 21:49:17.707 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:49:17 compute-1 nova_compute[192795]: 2025-09-30 21:49:17.707 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5303fce8-c159-4964-8a55-bc25f0e493e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:49:17 compute-1 nova_compute[192795]: 2025-09-30 21:49:17.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:18 compute-1 podman[248177]: 2025-09-30 21:49:18.25563192 +0000 UTC m=+0.078868220 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:49:18 compute-1 nova_compute[192795]: 2025-09-30 21:49:18.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:19 compute-1 ovn_controller[94902]: 2025-09-30T21:49:19Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:e9:98 10.100.0.11
Sep 30 21:49:19 compute-1 ovn_controller[94902]: 2025-09-30T21:49:19Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:e9:98 10.100.0.11
Sep 30 21:49:22 compute-1 nova_compute[192795]: 2025-09-30 21:49:22.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-1 nova_compute[192795]: 2025-09-30 21:49:23.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:24 compute-1 podman[248200]: 2025-09-30 21:49:24.242276745 +0000 UTC m=+0.065075868 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:49:24 compute-1 podman[248198]: 2025-09-30 21:49:24.252429249 +0000 UTC m=+0.078353856 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Sep 30 21:49:24 compute-1 podman[248199]: 2025-09-30 21:49:24.271149224 +0000 UTC m=+0.092662972 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:49:24 compute-1 nova_compute[192795]: 2025-09-30 21:49:24.710 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:24 compute-1 nova_compute[192795]: 2025-09-30 21:49:24.732 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:24 compute-1 nova_compute[192795]: 2025-09-30 21:49:24.733 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.693 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.694 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.695 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.695 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.695 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.696 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.727 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.728 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Image id 86b6907c-d747-4e98-8897-42105915831d yields fingerprint e0a114b373fedfcc318870f9bde30baf716d5a2a _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.728 2 INFO nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] image 86b6907c-d747-4e98-8897-42105915831d at (/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a): checking
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.728 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] image 86b6907c-d747-4e98-8897-42105915831d at (/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.730 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.731 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] 5303fce8-c159-4964-8a55-bc25f0e493e1 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.731 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] 5303fce8-c159-4964-8a55-bc25f0e493e1 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.731 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.795 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.796 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 5303fce8-c159-4964-8a55-bc25f0e493e1 is backed by e0a114b373fedfcc318870f9bde30baf716d5a2a _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.797 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] 33176d38-ded7-4585-ab86-3b25756b50a8 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.797 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] 33176d38-ded7-4585-ab86-3b25756b50a8 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.798 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.868 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.870 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 33176d38-ded7-4585-ab86-3b25756b50a8 is backed by e0a114b373fedfcc318870f9bde30baf716d5a2a _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.870 2 WARNING nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.871 2 WARNING nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.871 2 WARNING nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.871 2 INFO nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Active base files: /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.871 2 INFO nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Removable base files: /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60 /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.872 2 INFO nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2129e8818893e7ae30fcedd85140012349e40d60
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.872 2 INFO nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.873 2 INFO nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f24a4f4d3b6fa9761e8135d14a901b2ab183c59d
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.873 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.873 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.873 2 DEBUG nova.virt.libvirt.imagecache [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Sep 30 21:49:27 compute-1 nova_compute[192795]: 2025-09-30 21:49:27.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:28 compute-1 nova_compute[192795]: 2025-09-30 21:49:28.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.068 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.069 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.069 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.070 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.070 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.085 2 INFO nova.compute.manager [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Terminating instance
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.095 2 DEBUG nova.compute.manager [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:49:31 compute-1 kernel: tap3522906d-96 (unregistering): left promiscuous mode
Sep 30 21:49:31 compute-1 NetworkManager[51724]: <info>  [1759268971.1216] device (tap3522906d-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:49:31 compute-1 ovn_controller[94902]: 2025-09-30T21:49:31Z|00662|binding|INFO|Releasing lport 3522906d-96c0-460c-8864-7d0da2b92fc0 from this chassis (sb_readonly=0)
Sep 30 21:49:31 compute-1 ovn_controller[94902]: 2025-09-30T21:49:31Z|00663|binding|INFO|Setting lport 3522906d-96c0-460c-8864-7d0da2b92fc0 down in Southbound
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 ovn_controller[94902]: 2025-09-30T21:49:31Z|00664|binding|INFO|Removing iface tap3522906d-96 ovn-installed in OVS
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.145 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:e9:98 10.100.0.11'], port_security=['fa:16:3e:83:e9:98 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '33176d38-ded7-4585-ab86-3b25756b50a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '043721d1d0a2480fa785367fa56c1fa4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd5d81c6a-117f-4eeb-9b9a-cadc270ff078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58bbb783-b6ec-4366-bc6e-a18a122f97d1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=3522906d-96c0-460c-8864-7d0da2b92fc0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.147 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 3522906d-96c0-460c-8864-7d0da2b92fc0 in datapath 8de3daa0-c8e4-4471-8a0c-c05c67411ae1 unbound from our chassis
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.150 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8de3daa0-c8e4-4471-8a0c-c05c67411ae1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.151 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[296940a3-9ea6-44c5-acd0-f929edab20b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.152 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1 namespace which is not needed anymore
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Sep 30 21:49:31 compute-1 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a5.scope: Consumed 14.167s CPU time.
Sep 30 21:49:31 compute-1 systemd-machined[152783]: Machine qemu-76-instance-000000a5 terminated.
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1[248070]: [NOTICE]   (248074) : haproxy version is 2.8.14-c23fe91
Sep 30 21:49:31 compute-1 neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1[248070]: [NOTICE]   (248074) : path to executable is /usr/sbin/haproxy
Sep 30 21:49:31 compute-1 neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1[248070]: [WARNING]  (248074) : Exiting Master process...
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1[248070]: [ALERT]    (248074) : Current worker (248076) exited with code 143 (Terminated)
Sep 30 21:49:31 compute-1 neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1[248070]: [WARNING]  (248074) : All workers exited. Exiting... (0)
Sep 30 21:49:31 compute-1 systemd[1]: libpod-13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee.scope: Deactivated successfully.
Sep 30 21:49:31 compute-1 podman[248287]: 2025-09-30 21:49:31.347180337 +0000 UTC m=+0.069443896 container died 13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:49:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-c19e87ce3721f2bc3473e4ef99fbd8eb027c411b4b82a2c865966546ad7d60de-merged.mount: Deactivated successfully.
Sep 30 21:49:31 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee-userdata-shm.mount: Deactivated successfully.
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.389 2 INFO nova.virt.libvirt.driver [-] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Instance destroyed successfully.
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.392 2 DEBUG nova.objects.instance [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lazy-loading 'resources' on Instance uuid 33176d38-ded7-4585-ab86-3b25756b50a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:49:31 compute-1 podman[248287]: 2025-09-30 21:49:31.394815552 +0000 UTC m=+0.117079111 container cleanup 13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:49:31 compute-1 systemd[1]: libpod-conmon-13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee.scope: Deactivated successfully.
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.413 2 DEBUG nova.virt.libvirt.vif [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:48:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1320041238',display_name='tempest-TestNetworkBasicOps-server-1320041238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1320041238',id=165,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4Quez6rH07hO11bnVBHapHQpuODPsUfvGojX99zjz36B9pPYT8MUw35fqsoQVp9rSz0sysMAXACiS9gfckq0iSiY6wy4fmUZVf3KExWVMbmFIshemcIOYsG9NrTVI27Q==',key_name='tempest-TestNetworkBasicOps-85814944',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:49:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='043721d1d0a2480fa785367fa56c1fa4',ramdisk_id='',reservation_id='r-zvwd3t8i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-2126023928',owner_user_name='tempest-TestNetworkBasicOps-2126023928-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:49:05Z,user_data=None,user_id='27859618cb1d493cb1531af26b200b92',uuid=33176d38-ded7-4585-ab86-3b25756b50a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.414 2 DEBUG nova.network.os_vif_util [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converting VIF {"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.415 2 DEBUG nova.network.os_vif_util [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:e9:98,bridge_name='br-int',has_traffic_filtering=True,id=3522906d-96c0-460c-8864-7d0da2b92fc0,network=Network(8de3daa0-c8e4-4471-8a0c-c05c67411ae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3522906d-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.415 2 DEBUG os_vif [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:e9:98,bridge_name='br-int',has_traffic_filtering=True,id=3522906d-96c0-460c-8864-7d0da2b92fc0,network=Network(8de3daa0-c8e4-4471-8a0c-c05c67411ae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3522906d-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.418 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3522906d-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.425 2 INFO os_vif [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:e9:98,bridge_name='br-int',has_traffic_filtering=True,id=3522906d-96c0-460c-8864-7d0da2b92fc0,network=Network(8de3daa0-c8e4-4471-8a0c-c05c67411ae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3522906d-96')
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.425 2 INFO nova.virt.libvirt.driver [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Deleting instance files /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8_del
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.426 2 INFO nova.virt.libvirt.driver [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Deletion of /var/lib/nova/instances/33176d38-ded7-4585-ab86-3b25756b50a8_del complete
Sep 30 21:49:31 compute-1 podman[248328]: 2025-09-30 21:49:31.468072481 +0000 UTC m=+0.048388648 container remove 13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.475 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[13deec9d-a036-4e73-b3d2-e4659dbf79f8]: (4, ('Tue Sep 30 09:49:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1 (13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee)\n13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee\nTue Sep 30 09:49:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1 (13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee)\n13cc6d559a0dac65e3299cea7b31ab0a042b5d08a0605d49b02071ea0f78b2ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.477 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[79bf81aa-ec4d-44c3-aba8-d34ece97970e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.478 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8de3daa0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:31 compute-1 kernel: tap8de3daa0-c0: left promiscuous mode
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.491 2 INFO nova.compute.manager [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.492 2 DEBUG oslo.service.loopingcall [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.494 2 DEBUG nova.compute.manager [-] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.494 2 DEBUG nova.network.neutron [-] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.495 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[91cf4adf-3d9e-4c03-8e0a-df03c18851b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.519 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7c546be3-1e2c-4f60-9d5a-8dfbeee773c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.521 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[be885e57-05e8-4b59-9470-8792d0cf4538]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.546 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb7bea8-7c0d-46dd-a91c-838056afaf90]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563991, 'reachable_time': 35172, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248343, 'error': None, 'target': 'ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.550 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8de3daa0-c8e4-4471-8a0c-c05c67411ae1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:49:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:31.550 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[b32e5ab6-0225-425a-a7b6-36d61593e9f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:31 compute-1 systemd[1]: run-netns-ovnmeta\x2d8de3daa0\x2dc8e4\x2d4471\x2d8a0c\x2dc05c67411ae1.mount: Deactivated successfully.
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.985 2 DEBUG nova.compute.manager [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-changed-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.986 2 DEBUG nova.compute.manager [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Refreshing instance network info cache due to event network-changed-3522906d-96c0-460c-8864-7d0da2b92fc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.987 2 DEBUG oslo_concurrency.lockutils [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.987 2 DEBUG oslo_concurrency.lockutils [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:31 compute-1 nova_compute[192795]: 2025-09-30 21:49:31.988 2 DEBUG nova.network.neutron [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Refreshing network info cache for port 3522906d-96c0-460c-8864-7d0da2b92fc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.474 2 DEBUG nova.compute.manager [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-vif-unplugged-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.474 2 DEBUG oslo_concurrency.lockutils [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.474 2 DEBUG oslo_concurrency.lockutils [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.475 2 DEBUG oslo_concurrency.lockutils [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.475 2 DEBUG nova.compute.manager [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] No waiting events found dispatching network-vif-unplugged-3522906d-96c0-460c-8864-7d0da2b92fc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.475 2 DEBUG nova.compute.manager [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-vif-unplugged-3522906d-96c0-460c-8864-7d0da2b92fc0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.475 2 DEBUG nova.compute.manager [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.475 2 DEBUG oslo_concurrency.lockutils [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.476 2 DEBUG oslo_concurrency.lockutils [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.476 2 DEBUG oslo_concurrency.lockutils [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.476 2 DEBUG nova.compute.manager [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] No waiting events found dispatching network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.476 2 WARNING nova.compute.manager [req-f56d0f37-a45c-4a73-88b1-c7f2f1ae4eeb req-eb145b1f-7c23-45d4-aacc-a873be98b7c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received unexpected event network-vif-plugged-3522906d-96c0-460c-8864-7d0da2b92fc0 for instance with vm_state active and task_state deleting.
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.601 2 DEBUG nova.network.neutron [-] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.620 2 INFO nova.compute.manager [-] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Took 1.13 seconds to deallocate network for instance.
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.688 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.688 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.777 2 DEBUG nova.compute.provider_tree [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.798 2 DEBUG nova.scheduler.client.report [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.826 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.874 2 INFO nova.scheduler.client.report [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Deleted allocations for instance 33176d38-ded7-4585-ab86-3b25756b50a8
Sep 30 21:49:32 compute-1 nova_compute[192795]: 2025-09-30 21:49:32.973 2 DEBUG oslo_concurrency.lockutils [None req-b9d22004-a713-4cc1-ae9c-189d7c952ac6 27859618cb1d493cb1531af26b200b92 043721d1d0a2480fa785367fa56c1fa4 - - default default] Lock "33176d38-ded7-4585-ab86-3b25756b50a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:33 compute-1 nova_compute[192795]: 2025-09-30 21:49:33.053 2 DEBUG nova.network.neutron [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updated VIF entry in instance network info cache for port 3522906d-96c0-460c-8864-7d0da2b92fc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:33 compute-1 nova_compute[192795]: 2025-09-30 21:49:33.054 2 DEBUG nova.network.neutron [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Updating instance_info_cache with network_info: [{"id": "3522906d-96c0-460c-8864-7d0da2b92fc0", "address": "fa:16:3e:83:e9:98", "network": {"id": "8de3daa0-c8e4-4471-8a0c-c05c67411ae1", "bridge": "br-int", "label": "tempest-network-smoke--2012175570", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "043721d1d0a2480fa785367fa56c1fa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3522906d-96", "ovs_interfaceid": "3522906d-96c0-460c-8864-7d0da2b92fc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:33 compute-1 nova_compute[192795]: 2025-09-30 21:49:33.072 2 DEBUG oslo_concurrency.lockutils [req-7ea29c91-2ab0-4f5d-a779-5aa2668dc9f7 req-515098d9-0bee-429c-9dee-f8260912518f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-33176d38-ded7-4585-ab86-3b25756b50a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:33 compute-1 nova_compute[192795]: 2025-09-30 21:49:33.486 2 DEBUG nova.compute.manager [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Received event network-vif-deleted-3522906d-96c0-460c-8864-7d0da2b92fc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:33 compute-1 nova_compute[192795]: 2025-09-30 21:49:33.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:34 compute-1 podman[248344]: 2025-09-30 21:49:34.222642982 +0000 UTC m=+0.065127359 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:49:36 compute-1 nova_compute[192795]: 2025-09-30 21:49:36.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:38.707 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:38.709 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:38.711 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:38 compute-1 nova_compute[192795]: 2025-09-30 21:49:38.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-1 nova_compute[192795]: 2025-09-30 21:49:41.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:43 compute-1 podman[248365]: 2025-09-30 21:49:43.229341095 +0000 UTC m=+0.065140630 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:49:43 compute-1 podman[248367]: 2025-09-30 21:49:43.229953921 +0000 UTC m=+0.055771056 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:49:43 compute-1 podman[248366]: 2025-09-30 21:49:43.307174026 +0000 UTC m=+0.138708176 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:49:43 compute-1 nova_compute[192795]: 2025-09-30 21:49:43.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:46 compute-1 nova_compute[192795]: 2025-09-30 21:49:46.381 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268971.3789206, 33176d38-ded7-4585-ab86-3b25756b50a8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:46 compute-1 nova_compute[192795]: 2025-09-30 21:49:46.381 2 INFO nova.compute.manager [-] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] VM Stopped (Lifecycle Event)
Sep 30 21:49:46 compute-1 nova_compute[192795]: 2025-09-30 21:49:46.402 2 DEBUG nova.compute.manager [None req-7a3ed7e7-c938-47a9-ac43-6d8080cd9cd6 - - - - - -] [instance: 33176d38-ded7-4585-ab86-3b25756b50a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:46 compute-1 nova_compute[192795]: 2025-09-30 21:49:46.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:46 compute-1 ovn_controller[94902]: 2025-09-30T21:49:46Z|00665|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:49:46 compute-1 ovn_controller[94902]: 2025-09-30T21:49:46Z|00666|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:49:46 compute-1 nova_compute[192795]: 2025-09-30 21:49:46.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:48 compute-1 nova_compute[192795]: 2025-09-30 21:49:48.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:49 compute-1 podman[248430]: 2025-09-30 21:49:49.223129522 +0000 UTC m=+0.067442691 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:49:49 compute-1 nova_compute[192795]: 2025-09-30 21:49:49.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:50.349 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:50 compute-1 nova_compute[192795]: 2025-09-30 21:49:50.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:50.352 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:49:50 compute-1 ovn_controller[94902]: 2025-09-30T21:49:50Z|00667|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:49:50 compute-1 ovn_controller[94902]: 2025-09-30T21:49:50Z|00668|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:49:50 compute-1 nova_compute[192795]: 2025-09-30 21:49:50.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:51 compute-1 nova_compute[192795]: 2025-09-30 21:49:51.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:53 compute-1 nova_compute[192795]: 2025-09-30 21:49:53.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:54 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:49:54.354 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:54 compute-1 nova_compute[192795]: 2025-09-30 21:49:54.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:54 compute-1 nova_compute[192795]: 2025-09-30 21:49:54.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:49:54 compute-1 nova_compute[192795]: 2025-09-30 21:49:54.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:55 compute-1 podman[248464]: 2025-09-30 21:49:55.263523099 +0000 UTC m=+0.098600903 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:49:55 compute-1 podman[248465]: 2025-09-30 21:49:55.27023882 +0000 UTC m=+0.093672300 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:49:55 compute-1 podman[248466]: 2025-09-30 21:49:55.27839866 +0000 UTC m=+0.103730691 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:49:56 compute-1 nova_compute[192795]: 2025-09-30 21:49:56.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:58 compute-1 nova_compute[192795]: 2025-09-30 21:49:58.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:59 compute-1 nova_compute[192795]: 2025-09-30 21:49:59.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:59 compute-1 nova_compute[192795]: 2025-09-30 21:49:59.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:00 compute-1 ovn_controller[94902]: 2025-09-30T21:50:00Z|00669|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:50:00 compute-1 ovn_controller[94902]: 2025-09-30T21:50:00Z|00670|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:50:00 compute-1 nova_compute[192795]: 2025-09-30 21:50:00.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.696 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.767 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.768 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.768 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.768 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.847 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.931 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:01 compute-1 nova_compute[192795]: 2025-09-30 21:50:01.933 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.028 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.200 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.203 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5469MB free_disk=73.2723274230957GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.203 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.204 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.398 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 5303fce8-c159-4964-8a55-bc25f0e493e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.399 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.400 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.442 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.458 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.482 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:50:02 compute-1 nova_compute[192795]: 2025-09-30 21:50:02.482 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:03 compute-1 nova_compute[192795]: 2025-09-30 21:50:03.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:03 compute-1 nova_compute[192795]: 2025-09-30 21:50:03.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:04 compute-1 nova_compute[192795]: 2025-09-30 21:50:04.482 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:04 compute-1 unix_chkpwd[248532]: password check failed for user (root)
Sep 30 21:50:04 compute-1 sshd-session[248462]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:05 compute-1 podman[248533]: 2025-09-30 21:50:05.233606608 +0000 UTC m=+0.063628479 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, container_name=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:50:06 compute-1 nova_compute[192795]: 2025-09-30 21:50:06.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-1 sshd-session[248462]: Failed password for root from 8.210.178.40 port 43776 ssh2
Sep 30 21:50:06 compute-1 nova_compute[192795]: 2025-09-30 21:50:06.892 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:06 compute-1 nova_compute[192795]: 2025-09-30 21:50:06.893 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:06 compute-1 nova_compute[192795]: 2025-09-30 21:50:06.926 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.012 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.013 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.018 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.019 2 INFO nova.compute.claims [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.187 2 DEBUG nova.compute.provider_tree [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.204 2 DEBUG nova.scheduler.client.report [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.253 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.254 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.315 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.316 2 DEBUG nova.network.neutron [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.337 2 INFO nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.371 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.559 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.560 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.561 2 INFO nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Creating image(s)
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.561 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.562 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.562 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.577 2 DEBUG nova.policy [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.581 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.637 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.638 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.639 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.650 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.708 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.709 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.749 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.750 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.750 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.813 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.814 2 DEBUG nova.virt.disk.api [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.815 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.882 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.884 2 DEBUG nova.virt.disk.api [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.885 2 DEBUG nova.objects.instance [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.902 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.903 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Ensure instance console log exists: /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.903 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.904 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:07 compute-1 nova_compute[192795]: 2025-09-30 21:50:07.904 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:08 compute-1 nova_compute[192795]: 2025-09-30 21:50:08.262 2 DEBUG nova.network.neutron [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Successfully created port: afe949d7-0062-4e9f-8390-230f0f7d8f19 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:50:09 compute-1 nova_compute[192795]: 2025-09-30 21:50:09.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:09 compute-1 nova_compute[192795]: 2025-09-30 21:50:09.253 2 DEBUG nova.network.neutron [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Successfully updated port: afe949d7-0062-4e9f-8390-230f0f7d8f19 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:50:09 compute-1 nova_compute[192795]: 2025-09-30 21:50:09.269 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:09 compute-1 nova_compute[192795]: 2025-09-30 21:50:09.270 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:09 compute-1 nova_compute[192795]: 2025-09-30 21:50:09.270 2 DEBUG nova.network.neutron [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:50:09 compute-1 unix_chkpwd[248572]: password check failed for user (root)
Sep 30 21:50:09 compute-1 nova_compute[192795]: 2025-09-30 21:50:09.469 2 DEBUG nova.network.neutron [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:50:11 compute-1 nova_compute[192795]: 2025-09-30 21:50:11.177 2 DEBUG nova.compute.manager [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-changed-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:11 compute-1 nova_compute[192795]: 2025-09-30 21:50:11.178 2 DEBUG nova.compute.manager [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Refreshing instance network info cache due to event network-changed-afe949d7-0062-4e9f-8390-230f0f7d8f19. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:50:11 compute-1 nova_compute[192795]: 2025-09-30 21:50:11.178 2 DEBUG oslo_concurrency.lockutils [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:11 compute-1 sshd-session[248462]: Failed password for root from 8.210.178.40 port 43776 ssh2
Sep 30 21:50:11 compute-1 nova_compute[192795]: 2025-09-30 21:50:11.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.201 2 DEBUG nova.network.neutron [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updating instance_info_cache with network_info: [{"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.260 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.263 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance network_info: |[{"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.265 2 DEBUG oslo_concurrency.lockutils [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.266 2 DEBUG nova.network.neutron [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Refreshing network info cache for port afe949d7-0062-4e9f-8390-230f0f7d8f19 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.269 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Start _get_guest_xml network_info=[{"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.277 2 WARNING nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.283 2 DEBUG nova.virt.libvirt.host [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.283 2 DEBUG nova.virt.libvirt.host [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.287 2 DEBUG nova.virt.libvirt.host [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.287 2 DEBUG nova.virt.libvirt.host [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.289 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.290 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.290 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.290 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.291 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.291 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.291 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.291 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.291 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.292 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.292 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.292 2 DEBUG nova.virt.hardware [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.296 2 DEBUG nova.virt.libvirt.vif [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-565258952',display_name='tempest-TestNetworkAdvancedServerOps-server-565258952',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-565258952',id=169,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOp7qq5pR5n4XGNqI6kSLPTrrkD6Em1jYop0AGJR6ftdSM0oNQoVr+JgCc61Z23g5STfy7N8SKmBL8YamCFCSB1WwpAlW6PwTOkSNAiZLvHWVmc3Z3SkZqG4BYTe7NxzPA==',key_name='tempest-TestNetworkAdvancedServerOps-364751036',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-q1bnr3be',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:07Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c6f47087-6db5-4113-b9d1-f72e8a71f342,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.297 2 DEBUG nova.network.os_vif_util [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.297 2 DEBUG nova.network.os_vif_util [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.298 2 DEBUG nova.objects.instance [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.326 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <uuid>c6f47087-6db5-4113-b9d1-f72e8a71f342</uuid>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <name>instance-000000a9</name>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-565258952</nova:name>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:50:12</nova:creationTime>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         <nova:port uuid="afe949d7-0062-4e9f-8390-230f0f7d8f19">
Sep 30 21:50:12 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <system>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <entry name="serial">c6f47087-6db5-4113-b9d1-f72e8a71f342</entry>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <entry name="uuid">c6f47087-6db5-4113-b9d1-f72e8a71f342</entry>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </system>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <os>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   </os>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <features>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   </features>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:2b:c9:9c"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <target dev="tapafe949d7-00"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/console.log" append="off"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <video>
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </video>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:50:12 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:50:12 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:50:12 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:50:12 compute-1 nova_compute[192795]: </domain>
Sep 30 21:50:12 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.328 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Preparing to wait for external event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.328 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.328 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.329 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.329 2 DEBUG nova.virt.libvirt.vif [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-565258952',display_name='tempest-TestNetworkAdvancedServerOps-server-565258952',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-565258952',id=169,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOp7qq5pR5n4XGNqI6kSLPTrrkD6Em1jYop0AGJR6ftdSM0oNQoVr+JgCc61Z23g5STfy7N8SKmBL8YamCFCSB1WwpAlW6PwTOkSNAiZLvHWVmc3Z3SkZqG4BYTe7NxzPA==',key_name='tempest-TestNetworkAdvancedServerOps-364751036',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-q1bnr3be',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:07Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c6f47087-6db5-4113-b9d1-f72e8a71f342,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.330 2 DEBUG nova.network.os_vif_util [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.331 2 DEBUG nova.network.os_vif_util [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.331 2 DEBUG os_vif [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.332 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.332 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.337 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe949d7-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.338 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapafe949d7-00, col_values=(('external_ids', {'iface-id': 'afe949d7-0062-4e9f-8390-230f0f7d8f19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:c9:9c', 'vm-uuid': 'c6f47087-6db5-4113-b9d1-f72e8a71f342'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:12 compute-1 NetworkManager[51724]: <info>  [1759269012.3429] manager: (tapafe949d7-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.352 2 INFO os_vif [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00')
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.375 2 DEBUG nova.compute.manager [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-changed-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.376 2 DEBUG nova.compute.manager [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing instance network info cache due to event network-changed-77fac564-d2ac-47f9-b08d-18f0ed166918. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.377 2 DEBUG oslo_concurrency.lockutils [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.377 2 DEBUG oslo_concurrency.lockutils [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.377 2 DEBUG nova.network.neutron [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Refreshing network info cache for port 77fac564-d2ac-47f9-b08d-18f0ed166918 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.417 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.418 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.418 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:2b:c9:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.419 2 INFO nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Using config drive
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.493 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.493 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.494 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.494 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.494 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.507 2 INFO nova.compute.manager [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Terminating instance
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.517 2 DEBUG nova.compute.manager [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:50:12 compute-1 kernel: tap77fac564-d2 (unregistering): left promiscuous mode
Sep 30 21:50:12 compute-1 NetworkManager[51724]: <info>  [1759269012.5425] device (tap77fac564-d2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 ovn_controller[94902]: 2025-09-30T21:50:12Z|00671|binding|INFO|Releasing lport 77fac564-d2ac-47f9-b08d-18f0ed166918 from this chassis (sb_readonly=0)
Sep 30 21:50:12 compute-1 ovn_controller[94902]: 2025-09-30T21:50:12Z|00672|binding|INFO|Setting lport 77fac564-d2ac-47f9-b08d-18f0ed166918 down in Southbound
Sep 30 21:50:12 compute-1 ovn_controller[94902]: 2025-09-30T21:50:12Z|00673|binding|INFO|Removing iface tap77fac564-d2 ovn-installed in OVS
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.565 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:23:78 10.100.0.10'], port_security=['fa:16:3e:c0:23:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5303fce8-c159-4964-8a55-bc25f0e493e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfb1184d-a559-4543-ae06-e2b48bcfb2c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=77fac564-d2ac-47f9-b08d-18f0ed166918) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.568 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 77fac564-d2ac-47f9-b08d-18f0ed166918 in datapath 40cfa99d-fae5-4f7e-b4bc-e90e389ced61 unbound from our chassis
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.572 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 40cfa99d-fae5-4f7e-b4bc-e90e389ced61, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.575 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[68b60df1-114d-496d-9f1a-4357dc4357b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.576 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 namespace which is not needed anymore
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 kernel: tapefc7a6d8-d4 (unregistering): left promiscuous mode
Sep 30 21:50:12 compute-1 NetworkManager[51724]: <info>  [1759269012.6092] device (tapefc7a6d8-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 ovn_controller[94902]: 2025-09-30T21:50:12Z|00674|binding|INFO|Releasing lport efc7a6d8-d415-4f18-b112-76cadddd0255 from this chassis (sb_readonly=0)
Sep 30 21:50:12 compute-1 ovn_controller[94902]: 2025-09-30T21:50:12Z|00675|binding|INFO|Setting lport efc7a6d8-d415-4f18-b112-76cadddd0255 down in Southbound
Sep 30 21:50:12 compute-1 ovn_controller[94902]: 2025-09-30T21:50:12Z|00676|binding|INFO|Removing iface tapefc7a6d8-d4 ovn-installed in OVS
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.662 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:bf:1e 2001:db8:0:1:f816:3eff:fede:bf1e 2001:db8::f816:3eff:fede:bf1e'], port_security=['fa:16:3e:de:bf:1e 2001:db8:0:1:f816:3eff:fede:bf1e 2001:db8::f816:3eff:fede:bf1e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fede:bf1e/64 2001:db8::f816:3eff:fede:bf1e/64', 'neutron:device_id': '5303fce8-c159-4964-8a55-bc25f0e493e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d03d334-4f0d-47df-a94a-1c647c7026cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=efc7a6d8-d415-4f18-b112-76cadddd0255) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Sep 30 21:50:12 compute-1 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a3.scope: Consumed 16.944s CPU time.
Sep 30 21:50:12 compute-1 systemd-machined[152783]: Machine qemu-75-instance-000000a3 terminated.
Sep 30 21:50:12 compute-1 NetworkManager[51724]: <info>  [1759269012.7423] manager: (tap77fac564-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Sep 30 21:50:12 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [NOTICE]   (247758) : haproxy version is 2.8.14-c23fe91
Sep 30 21:50:12 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [NOTICE]   (247758) : path to executable is /usr/sbin/haproxy
Sep 30 21:50:12 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [WARNING]  (247758) : Exiting Master process...
Sep 30 21:50:12 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [WARNING]  (247758) : Exiting Master process...
Sep 30 21:50:12 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [ALERT]    (247758) : Current worker (247760) exited with code 143 (Terminated)
Sep 30 21:50:12 compute-1 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[247754]: [WARNING]  (247758) : All workers exited. Exiting... (0)
Sep 30 21:50:12 compute-1 systemd[1]: libpod-cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1.scope: Deactivated successfully.
Sep 30 21:50:12 compute-1 podman[248606]: 2025-09-30 21:50:12.754215134 +0000 UTC m=+0.054880434 container died cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:50:12 compute-1 NetworkManager[51724]: <info>  [1759269012.7554] manager: (tapefc7a6d8-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/337)
Sep 30 21:50:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1-userdata-shm.mount: Deactivated successfully.
Sep 30 21:50:12 compute-1 unix_chkpwd[248658]: password check failed for user (root)
Sep 30 21:50:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-98f138f4772a76b8d072e6764cc6ab66d60930a9a56510141c4d82c2b7cebe0a-merged.mount: Deactivated successfully.
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.815 2 INFO nova.virt.libvirt.driver [-] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Instance destroyed successfully.
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.815 2 DEBUG nova.objects.instance [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 5303fce8-c159-4964-8a55-bc25f0e493e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:12 compute-1 podman[248606]: 2025-09-30 21:50:12.821572282 +0000 UTC m=+0.122237582 container cleanup cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.826 2 DEBUG nova.virt.libvirt.vif [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:48:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-414695147',display_name='tempest-TestGettingAddress-server-414695147',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-414695147',id=163,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:48:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-yr4z6l47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:48:50Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=5303fce8-c159-4964-8a55-bc25f0e493e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.827 2 DEBUG nova.network.os_vif_util [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.828 2 DEBUG nova.network.os_vif_util [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:23:78,bridge_name='br-int',has_traffic_filtering=True,id=77fac564-d2ac-47f9-b08d-18f0ed166918,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fac564-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.828 2 DEBUG os_vif [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:23:78,bridge_name='br-int',has_traffic_filtering=True,id=77fac564-d2ac-47f9-b08d-18f0ed166918,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fac564-d2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.832 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77fac564-d2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 systemd[1]: libpod-conmon-cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1.scope: Deactivated successfully.
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.846 2 INFO os_vif [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:23:78,bridge_name='br-int',has_traffic_filtering=True,id=77fac564-d2ac-47f9-b08d-18f0ed166918,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fac564-d2')
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.847 2 DEBUG nova.virt.libvirt.vif [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:48:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-414695147',display_name='tempest-TestGettingAddress-server-414695147',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-414695147',id=163,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:48:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-yr4z6l47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:48:50Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=5303fce8-c159-4964-8a55-bc25f0e493e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.848 2 DEBUG nova.network.os_vif_util [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.849 2 DEBUG nova.network.os_vif_util [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:de:bf:1e,bridge_name='br-int',has_traffic_filtering=True,id=efc7a6d8-d415-4f18-b112-76cadddd0255,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc7a6d8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.850 2 DEBUG os_vif [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:bf:1e,bridge_name='br-int',has_traffic_filtering=True,id=efc7a6d8-d415-4f18-b112-76cadddd0255,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc7a6d8-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.853 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefc7a6d8-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.862 2 INFO os_vif [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:bf:1e,bridge_name='br-int',has_traffic_filtering=True,id=efc7a6d8-d415-4f18-b112-76cadddd0255,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc7a6d8-d4')
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.863 2 INFO nova.virt.libvirt.driver [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Deleting instance files /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1_del
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.864 2 INFO nova.virt.libvirt.driver [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Deletion of /var/lib/nova/instances/5303fce8-c159-4964-8a55-bc25f0e493e1_del complete
Sep 30 21:50:12 compute-1 podman[248663]: 2025-09-30 21:50:12.906698519 +0000 UTC m=+0.053592398 container remove cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.913 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d553bdd5-b726-4e43-84ca-36cec32c22b3]: (4, ('Tue Sep 30 09:50:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 (cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1)\ncacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1\nTue Sep 30 09:50:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 (cacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1)\ncacbb1e42bc0054ee17d6113bd35b1be4f301fac0b2c3ee37a19b9b4f6f6deb1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.915 2 INFO nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Creating config drive at /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.915 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbe469e-a329-4a24-8827-327459d02441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.916 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40cfa99d-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:12 compute-1 kernel: tap40cfa99d-f0: left promiscuous mode
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.920 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2yewh91v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.938 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fa9889f3-f10b-493d-97ec-d503c5990bfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.964 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ba39cc64-c017-4815-9a78-1ae98973bee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.966 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a162ad02-bb4e-4df0-9c4f-23c5b17add9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.976 2 INFO nova.compute.manager [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Took 0.46 seconds to destroy the instance on the hypervisor.
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.977 2 DEBUG oslo.service.loopingcall [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.977 2 DEBUG nova.compute.manager [-] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:50:12 compute-1 nova_compute[192795]: 2025-09-30 21:50:12.978 2 DEBUG nova.network.neutron [-] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.988 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a3bdc7de-2c33-4811-b083-d88862373981]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562403, 'reachable_time': 36110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248688, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.993 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:50:12 compute-1 systemd[1]: run-netns-ovnmeta\x2d40cfa99d\x2dfae5\x2d4f7e\x2db4bc\x2de90e389ced61.mount: Deactivated successfully.
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.993 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[705516b6-08f0-4e9c-916b-e58fbb242522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.996 103861 INFO neutron.agent.ovn.metadata.agent [-] Port efc7a6d8-d415-4f18-b112-76cadddd0255 in datapath d1ec18dd-20d4-4643-8e73-7d404d8b8493 unbound from our chassis
Sep 30 21:50:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:12.999 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1ec18dd-20d4-4643-8e73-7d404d8b8493, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.001 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[27fcc7a1-eb78-46fb-9a4b-9c98a22b02da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.001 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 namespace which is not needed anymore
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.073 2 DEBUG oslo_concurrency.processutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2yewh91v" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:13 compute-1 kernel: tapafe949d7-00: entered promiscuous mode
Sep 30 21:50:13 compute-1 NetworkManager[51724]: <info>  [1759269013.1886] manager: (tapafe949d7-00): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:13 compute-1 systemd-udevd[248580]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:50:13 compute-1 ovn_controller[94902]: 2025-09-30T21:50:13Z|00677|binding|INFO|Claiming lport afe949d7-0062-4e9f-8390-230f0f7d8f19 for this chassis.
Sep 30 21:50:13 compute-1 ovn_controller[94902]: 2025-09-30T21:50:13Z|00678|binding|INFO|afe949d7-0062-4e9f-8390-230f0f7d8f19: Claiming fa:16:3e:2b:c9:9c 10.100.0.10
Sep 30 21:50:13 compute-1 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[247826]: [NOTICE]   (247830) : haproxy version is 2.8.14-c23fe91
Sep 30 21:50:13 compute-1 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[247826]: [NOTICE]   (247830) : path to executable is /usr/sbin/haproxy
Sep 30 21:50:13 compute-1 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[247826]: [WARNING]  (247830) : Exiting Master process...
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.200 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:c9:9c 10.100.0.10'], port_security=['fa:16:3e:2b:c9:9c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c6f47087-6db5-4113-b9d1-f72e8a71f342', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'def4f612-8ce8-4228-a0e7-0d189e100661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b337c30-4f37-407b-8b6c-508e829086f6, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=afe949d7-0062-4e9f-8390-230f0f7d8f19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:13 compute-1 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[247826]: [ALERT]    (247830) : Current worker (247832) exited with code 143 (Terminated)
Sep 30 21:50:13 compute-1 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[247826]: [WARNING]  (247830) : All workers exited. Exiting... (0)
Sep 30 21:50:13 compute-1 ovn_controller[94902]: 2025-09-30T21:50:13Z|00679|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 ovn-installed in OVS
Sep 30 21:50:13 compute-1 systemd[1]: libpod-1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e.scope: Deactivated successfully.
Sep 30 21:50:13 compute-1 ovn_controller[94902]: 2025-09-30T21:50:13Z|00680|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 up in Southbound
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:13 compute-1 podman[248705]: 2025-09-30 21:50:13.217741427 +0000 UTC m=+0.091811940 container died 1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:50:13 compute-1 NetworkManager[51724]: <info>  [1759269013.2203] device (tapafe949d7-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:50:13 compute-1 NetworkManager[51724]: <info>  [1759269013.2234] device (tapafe949d7-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:50:13 compute-1 systemd-machined[152783]: New machine qemu-77-instance-000000a9.
Sep 30 21:50:13 compute-1 systemd[1]: Started Virtual Machine qemu-77-instance-000000a9.
Sep 30 21:50:13 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e-userdata-shm.mount: Deactivated successfully.
Sep 30 21:50:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-b06eb3879d7d2a6fc107729da638fafea387e1f1624d516f0b50552a0d2b1b5f-merged.mount: Deactivated successfully.
Sep 30 21:50:13 compute-1 podman[248705]: 2025-09-30 21:50:13.285123356 +0000 UTC m=+0.159193899 container cleanup 1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:50:13 compute-1 systemd[1]: libpod-conmon-1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e.scope: Deactivated successfully.
Sep 30 21:50:13 compute-1 podman[248748]: 2025-09-30 21:50:13.357122289 +0000 UTC m=+0.070852163 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:50:13 compute-1 podman[248762]: 2025-09-30 21:50:13.362642948 +0000 UTC m=+0.048548662 container remove 1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.369 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd65ba4-2005-45a2-9b22-8c25066761bb]: (4, ('Tue Sep 30 09:50:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 (1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e)\n1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e\nTue Sep 30 09:50:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 (1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e)\n1cb385f1a7d51d3d4090d5597e45db7af5a9cdb295fc24eeedf1d6c38a57162e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.371 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5f0f19-ed35-451f-9d2d-0e3da2d2a003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.372 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1ec18dd-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:13 compute-1 kernel: tapd1ec18dd-20: left promiscuous mode
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.381 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0c936e34-a217-44b7-b8dc-e256156de9a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 podman[248744]: 2025-09-30 21:50:13.386232515 +0000 UTC m=+0.111655736 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.410 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac97d30-5d05-4911-8d2e-fd0919a75ddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.412 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3c196d-43e9-46c1-9247-8ead7913675b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.436 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ce43ca63-20e6-4906-b176-f3518b0a6849]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562500, 'reachable_time': 27730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248819, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.439 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.439 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[7f236bf5-f666-42ac-b881-b924e5f365a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.440 103861 INFO neutron.agent.ovn.metadata.agent [-] Port afe949d7-0062-4e9f-8390-230f0f7d8f19 in datapath f39b5b05-2446-4ee5-b89a-a5b71519f1fb unbound from our chassis
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.441 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f39b5b05-2446-4ee5-b89a-a5b71519f1fb
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.458 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d4fd50ea-874a-4512-bf43-277f34f3e78c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.459 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf39b5b05-21 in ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.462 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf39b5b05-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.463 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1bea1d5a-0ee2-46df-8b85-445172cbad71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.464 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c0b51d-7b06-4a84-8ece-b920081982bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.479 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9b92c0-dca2-431d-ba10-b046768a3d1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 podman[248807]: 2025-09-30 21:50:13.49461548 +0000 UTC m=+0.100347599 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250923, tcib_managed=true, container_name=ovn_controller)
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.504 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e13339-0922-4718-ac94-63a011e03ab8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.547 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[283822bb-13d5-4d1a-bd36-4e98b7a89257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.554 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2175a485-538a-4c00-93a4-3c53694e9977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 NetworkManager[51724]: <info>  [1759269013.5566] manager: (tapf39b5b05-20): new Veth device (/org/freedesktop/NetworkManager/Devices/339)
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.595 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[82922128-ce54-4d30-8387-159afd1932ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.598 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5f8080d1-9e3f-45f5-b7be-02d6735e20aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 NetworkManager[51724]: <info>  [1759269013.6294] device (tapf39b5b05-20): carrier: link connected
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.639 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[1a1e0918-083e-4762-85b4-479420a58d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.666 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e30db26b-a2bb-4646-8ee6-a1d1d08f668b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf39b5b05-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:00:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570930, 'reachable_time': 40653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248860, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.687 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[140b719c-d155-4460-b06b-c1ad457d1843]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:d8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570930, 'tstamp': 570930}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248861, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.720 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a219bb-4b13-4334-a516-cd071f9fe41f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf39b5b05-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:00:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570930, 'reachable_time': 40653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248862, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.764 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2b47bbe9-3ddc-42cf-ad91-a8349a0dec7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 systemd[1]: run-netns-ovnmeta\x2dd1ec18dd\x2d20d4\x2d4643\x2d8e73\x2d7d404d8b8493.mount: Deactivated successfully.
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.847 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[65661324-8e0b-4f57-9361-890a9d62409f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.849 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf39b5b05-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.849 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.850 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf39b5b05-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:13 compute-1 kernel: tapf39b5b05-20: entered promiscuous mode
Sep 30 21:50:13 compute-1 NetworkManager[51724]: <info>  [1759269013.8528] manager: (tapf39b5b05-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.855 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf39b5b05-20, col_values=(('external_ids', {'iface-id': 'bf7b0eda-a578-4475-a0ec-15a8369b7d09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:13 compute-1 ovn_controller[94902]: 2025-09-30T21:50:13Z|00681|binding|INFO|Releasing lport bf7b0eda-a578-4475-a0ec-15a8369b7d09 from this chassis (sb_readonly=0)
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.857 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.861 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[90a01665-2f6a-4caf-aa85-429ae8048052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.862 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f39b5b05-2446-4ee5-b89a-a5b71519f1fb
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.pid.haproxy
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f39b5b05-2446-4ee5-b89a-a5b71519f1fb
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:50:13 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:13.863 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'env', 'PROCESS_TAG=haproxy-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:50:13 compute-1 nova_compute[192795]: 2025-09-30 21:50:13.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:14 compute-1 podman[248901]: 2025-09-30 21:50:14.25582355 +0000 UTC m=+0.049886348 container create a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 21:50:14 compute-1 systemd[1]: Started libpod-conmon-a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b.scope.
Sep 30 21:50:14 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:50:14 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/131f8393e0387d4750f368491a6a42f46b3d400af5ddba78187b2371c6e0718b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:50:14 compute-1 podman[248901]: 2025-09-30 21:50:14.228483102 +0000 UTC m=+0.022545920 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:50:14 compute-1 podman[248901]: 2025-09-30 21:50:14.33990193 +0000 UTC m=+0.133964728 container init a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.341 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269014.3411305, c6f47087-6db5-4113-b9d1-f72e8a71f342 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.343 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] VM Started (Lifecycle Event)
Sep 30 21:50:14 compute-1 podman[248901]: 2025-09-30 21:50:14.34656604 +0000 UTC m=+0.140628838 container start a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.367 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.372 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269014.341451, c6f47087-6db5-4113-b9d1-f72e8a71f342 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.373 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] VM Paused (Lifecycle Event)
Sep 30 21:50:14 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[248916]: [NOTICE]   (248920) : New worker (248922) forked
Sep 30 21:50:14 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[248916]: [NOTICE]   (248920) : Loading success.
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.396 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.401 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.423 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.466 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-unplugged-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.466 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.466 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.467 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.467 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] No waiting events found dispatching network-vif-unplugged-77fac564-d2ac-47f9-b08d-18f0ed166918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.467 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-unplugged-77fac564-d2ac-47f9-b08d-18f0ed166918 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.467 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.467 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.468 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.468 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.468 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] No waiting events found dispatching network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.468 2 WARNING nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received unexpected event network-vif-plugged-77fac564-d2ac-47f9-b08d-18f0ed166918 for instance with vm_state active and task_state deleting.
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.468 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-unplugged-efc7a6d8-d415-4f18-b112-76cadddd0255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.469 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.469 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.469 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.469 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] No waiting events found dispatching network-vif-unplugged-efc7a6d8-d415-4f18-b112-76cadddd0255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.470 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-unplugged-efc7a6d8-d415-4f18-b112-76cadddd0255 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.470 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.470 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.470 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.470 2 DEBUG oslo_concurrency.lockutils [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.471 2 DEBUG nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] No waiting events found dispatching network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.471 2 WARNING nova.compute.manager [req-0bfcfaa6-659b-4eff-bc37-04a8405411c0 req-dd9d62e3-0b18-4f6b-aa31-db265f525f15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received unexpected event network-vif-plugged-efc7a6d8-d415-4f18-b112-76cadddd0255 for instance with vm_state active and task_state deleting.
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.980 2 DEBUG nova.network.neutron [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updated VIF entry in instance network info cache for port 77fac564-d2ac-47f9-b08d-18f0ed166918. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:50:14 compute-1 nova_compute[192795]: 2025-09-30 21:50:14.981 2 DEBUG nova.network.neutron [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efc7a6d8-d415-4f18-b112-76cadddd0255", "address": "fa:16:3e:de:bf:1e", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fede:bf1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc7a6d8-d4", "ovs_interfaceid": "efc7a6d8-d415-4f18-b112-76cadddd0255", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.023 2 DEBUG oslo_concurrency.lockutils [req-a50af9e8-0497-4831-84d1-ff5cbfca0dda req-a709af3e-0cf2-498a-af76-c3cf81f9bebd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5303fce8-c159-4964-8a55-bc25f0e493e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:15 compute-1 sshd-session[248462]: Failed password for root from 8.210.178.40 port 43776 ssh2
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.214 2 DEBUG nova.compute.manager [req-1325400d-b201-45ad-aab2-fa7c8edbe42f req-f2367d2a-7141-4ae1-b356-9a8d73ea0e31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-deleted-efc7a6d8-d415-4f18-b112-76cadddd0255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.215 2 INFO nova.compute.manager [req-1325400d-b201-45ad-aab2-fa7c8edbe42f req-f2367d2a-7141-4ae1-b356-9a8d73ea0e31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Neutron deleted interface efc7a6d8-d415-4f18-b112-76cadddd0255; detaching it from the instance and deleting it from the info cache
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.215 2 DEBUG nova.network.neutron [req-1325400d-b201-45ad-aab2-fa7c8edbe42f req-f2367d2a-7141-4ae1-b356-9a8d73ea0e31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [{"id": "77fac564-d2ac-47f9-b08d-18f0ed166918", "address": "fa:16:3e:c0:23:78", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fac564-d2", "ovs_interfaceid": "77fac564-d2ac-47f9-b08d-18f0ed166918", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.239 2 DEBUG nova.compute.manager [req-1325400d-b201-45ad-aab2-fa7c8edbe42f req-f2367d2a-7141-4ae1-b356-9a8d73ea0e31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Detach interface failed, port_id=efc7a6d8-d415-4f18-b112-76cadddd0255, reason: Instance 5303fce8-c159-4964-8a55-bc25f0e493e1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.335 2 DEBUG nova.network.neutron [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updated VIF entry in instance network info cache for port afe949d7-0062-4e9f-8390-230f0f7d8f19. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.336 2 DEBUG nova.network.neutron [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updating instance_info_cache with network_info: [{"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:15 compute-1 nova_compute[192795]: 2025-09-30 21:50:15.355 2 DEBUG oslo_concurrency.lockutils [req-1ba65c11-2e85-432e-b6e4-98697ba90a9b req-4719d3ec-de71-41f7-9f1c-f325a9a3bb1e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:16 compute-1 unix_chkpwd[248931]: password check failed for user (root)
Sep 30 21:50:16 compute-1 nova_compute[192795]: 2025-09-30 21:50:16.785 2 DEBUG nova.network.neutron [-] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:16 compute-1 nova_compute[192795]: 2025-09-30 21:50:16.825 2 INFO nova.compute.manager [-] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Took 3.85 seconds to deallocate network for instance.
Sep 30 21:50:16 compute-1 nova_compute[192795]: 2025-09-30 21:50:16.905 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:16 compute-1 nova_compute[192795]: 2025-09-30 21:50:16.906 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:16 compute-1 nova_compute[192795]: 2025-09-30 21:50:16.970 2 DEBUG nova.compute.provider_tree [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:50:16 compute-1 nova_compute[192795]: 2025-09-30 21:50:16.987 2 DEBUG nova.scheduler.client.report [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.019 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.053 2 INFO nova.scheduler.client.report [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 5303fce8-c159-4964-8a55-bc25f0e493e1
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.172 2 DEBUG oslo_concurrency.lockutils [None req-133ebfe5-d1b6-48e2-8ee9-2d1dbd85f777 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "5303fce8-c159-4964-8a55-bc25f0e493e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.334 2 DEBUG nova.compute.manager [req-e7ce3a7e-5cdc-4221-973b-21d7fbabf7f3 req-df0709e6-bef3-43f7-a26f-a4e0c11c8504 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.334 2 DEBUG oslo_concurrency.lockutils [req-e7ce3a7e-5cdc-4221-973b-21d7fbabf7f3 req-df0709e6-bef3-43f7-a26f-a4e0c11c8504 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.335 2 DEBUG oslo_concurrency.lockutils [req-e7ce3a7e-5cdc-4221-973b-21d7fbabf7f3 req-df0709e6-bef3-43f7-a26f-a4e0c11c8504 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.335 2 DEBUG oslo_concurrency.lockutils [req-e7ce3a7e-5cdc-4221-973b-21d7fbabf7f3 req-df0709e6-bef3-43f7-a26f-a4e0c11c8504 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.335 2 DEBUG nova.compute.manager [req-e7ce3a7e-5cdc-4221-973b-21d7fbabf7f3 req-df0709e6-bef3-43f7-a26f-a4e0c11c8504 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Processing event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.336 2 DEBUG nova.compute.manager [req-e7ce3a7e-5cdc-4221-973b-21d7fbabf7f3 req-df0709e6-bef3-43f7-a26f-a4e0c11c8504 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Received event network-vif-deleted-77fac564-d2ac-47f9-b08d-18f0ed166918 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.336 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.340 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269017.3406262, c6f47087-6db5-4113-b9d1-f72e8a71f342 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.341 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] VM Resumed (Lifecycle Event)
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.343 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.347 2 INFO nova.virt.libvirt.driver [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance spawned successfully.
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.348 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.364 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.373 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.378 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.379 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.380 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.381 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.381 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.382 2 DEBUG nova.virt.libvirt.driver [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.407 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.478 2 INFO nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Took 9.92 seconds to spawn the instance on the hypervisor.
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.479 2 DEBUG nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.575 2 INFO nova.compute.manager [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Took 10.59 seconds to build instance.
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.594 2 DEBUG oslo_concurrency.lockutils [None req-5f51d3e7-a285-4da7-9619-33c0442ba173 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.697 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.698 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.715 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:50:17 compute-1 nova_compute[192795]: 2025-09-30 21:50:17.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:18 compute-1 sshd-session[248462]: Failed password for root from 8.210.178.40 port 43776 ssh2
Sep 30 21:50:19 compute-1 nova_compute[192795]: 2025-09-30 21:50:19.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:19 compute-1 nova_compute[192795]: 2025-09-30 21:50:19.441 2 DEBUG nova.compute.manager [req-4f7bde32-24aa-4274-9e29-df6d175642a9 req-694dfa40-99aa-4f93-8b19-23ecbe51a20e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:19 compute-1 nova_compute[192795]: 2025-09-30 21:50:19.442 2 DEBUG oslo_concurrency.lockutils [req-4f7bde32-24aa-4274-9e29-df6d175642a9 req-694dfa40-99aa-4f93-8b19-23ecbe51a20e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:19 compute-1 nova_compute[192795]: 2025-09-30 21:50:19.442 2 DEBUG oslo_concurrency.lockutils [req-4f7bde32-24aa-4274-9e29-df6d175642a9 req-694dfa40-99aa-4f93-8b19-23ecbe51a20e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:19 compute-1 nova_compute[192795]: 2025-09-30 21:50:19.442 2 DEBUG oslo_concurrency.lockutils [req-4f7bde32-24aa-4274-9e29-df6d175642a9 req-694dfa40-99aa-4f93-8b19-23ecbe51a20e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:19 compute-1 nova_compute[192795]: 2025-09-30 21:50:19.442 2 DEBUG nova.compute.manager [req-4f7bde32-24aa-4274-9e29-df6d175642a9 req-694dfa40-99aa-4f93-8b19-23ecbe51a20e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:19 compute-1 nova_compute[192795]: 2025-09-30 21:50:19.442 2 WARNING nova.compute.manager [req-4f7bde32-24aa-4274-9e29-df6d175642a9 req-694dfa40-99aa-4f93-8b19-23ecbe51a20e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state None.
Sep 30 21:50:19 compute-1 unix_chkpwd[248932]: password check failed for user (root)
Sep 30 21:50:20 compute-1 nova_compute[192795]: 2025-09-30 21:50:20.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:20 compute-1 podman[248933]: 2025-09-30 21:50:20.276005558 +0000 UTC m=+0.109516787 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 21:50:21 compute-1 sshd-session[248462]: Failed password for root from 8.210.178.40 port 43776 ssh2
Sep 30 21:50:22 compute-1 nova_compute[192795]: 2025-09-30 21:50:22.845 2 DEBUG nova.compute.manager [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-changed-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:22 compute-1 nova_compute[192795]: 2025-09-30 21:50:22.845 2 DEBUG nova.compute.manager [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Refreshing instance network info cache due to event network-changed-afe949d7-0062-4e9f-8390-230f0f7d8f19. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:50:22 compute-1 nova_compute[192795]: 2025-09-30 21:50:22.846 2 DEBUG oslo_concurrency.lockutils [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:22 compute-1 nova_compute[192795]: 2025-09-30 21:50:22.846 2 DEBUG oslo_concurrency.lockutils [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:22 compute-1 nova_compute[192795]: 2025-09-30 21:50:22.846 2 DEBUG nova.network.neutron [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Refreshing network info cache for port afe949d7-0062-4e9f-8390-230f0f7d8f19 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:50:22 compute-1 nova_compute[192795]: 2025-09-30 21:50:22.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:23 compute-1 unix_chkpwd[248953]: password check failed for user (root)
Sep 30 21:50:23 compute-1 nova_compute[192795]: 2025-09-30 21:50:23.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:23 compute-1 nova_compute[192795]: 2025-09-30 21:50:23.723 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:23 compute-1 nova_compute[192795]: 2025-09-30 21:50:23.723 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:50:23 compute-1 nova_compute[192795]: 2025-09-30 21:50:23.748 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:50:24 compute-1 nova_compute[192795]: 2025-09-30 21:50:24.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:25 compute-1 sshd-session[248462]: Failed password for root from 8.210.178.40 port 43776 ssh2
Sep 30 21:50:26 compute-1 sshd-session[248462]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 43776 ssh2 [preauth]
Sep 30 21:50:26 compute-1 sshd-session[248462]: Disconnecting authenticating user root 8.210.178.40 port 43776: Too many authentication failures [preauth]
Sep 30 21:50:26 compute-1 sshd-session[248462]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:26 compute-1 sshd-session[248462]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:50:26 compute-1 podman[248954]: 2025-09-30 21:50:26.241157762 +0000 UTC m=+0.077219466 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Sep 30 21:50:26 compute-1 podman[248955]: 2025-09-30 21:50:26.241650245 +0000 UTC m=+0.069954279 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:50:26 compute-1 podman[248956]: 2025-09-30 21:50:26.268793528 +0000 UTC m=+0.090908365 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 21:50:26 compute-1 nova_compute[192795]: 2025-09-30 21:50:26.838 2 DEBUG nova.network.neutron [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updated VIF entry in instance network info cache for port afe949d7-0062-4e9f-8390-230f0f7d8f19. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:50:26 compute-1 nova_compute[192795]: 2025-09-30 21:50:26.839 2 DEBUG nova.network.neutron [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updating instance_info_cache with network_info: [{"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:26 compute-1 nova_compute[192795]: 2025-09-30 21:50:26.863 2 DEBUG oslo_concurrency.lockutils [req-a1075f66-bcc1-451d-9d05-4cfb3f845835 req-32429e24-a5aa-4cb6-af19-54412e4ad3c2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:27 compute-1 nova_compute[192795]: 2025-09-30 21:50:27.814 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269012.8124363, 5303fce8-c159-4964-8a55-bc25f0e493e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:27 compute-1 nova_compute[192795]: 2025-09-30 21:50:27.815 2 INFO nova.compute.manager [-] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] VM Stopped (Lifecycle Event)
Sep 30 21:50:27 compute-1 nova_compute[192795]: 2025-09-30 21:50:27.857 2 DEBUG nova.compute.manager [None req-73dbb337-b9c6-46a8-a92e-6a764e66f9ae - - - - - -] [instance: 5303fce8-c159-4964-8a55-bc25f0e493e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:27 compute-1 unix_chkpwd[249018]: password check failed for user (root)
Sep 30 21:50:27 compute-1 sshd-session[249016]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:27 compute-1 nova_compute[192795]: 2025-09-30 21:50:27.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:29 compute-1 nova_compute[192795]: 2025-09-30 21:50:29.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:29 compute-1 nova_compute[192795]: 2025-09-30 21:50:29.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:29 compute-1 sshd-session[249016]: Failed password for root from 8.210.178.40 port 44914 ssh2
Sep 30 21:50:30 compute-1 ovn_controller[94902]: 2025-09-30T21:50:30Z|00682|binding|INFO|Releasing lport bf7b0eda-a578-4475-a0ec-15a8369b7d09 from this chassis (sb_readonly=0)
Sep 30 21:50:30 compute-1 nova_compute[192795]: 2025-09-30 21:50:30.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:30 compute-1 ovn_controller[94902]: 2025-09-30T21:50:30Z|00683|binding|INFO|Releasing lport bf7b0eda-a578-4475-a0ec-15a8369b7d09 from this chassis (sb_readonly=0)
Sep 30 21:50:30 compute-1 nova_compute[192795]: 2025-09-30 21:50:30.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:31 compute-1 unix_chkpwd[249032]: password check failed for user (root)
Sep 30 21:50:31 compute-1 ovn_controller[94902]: 2025-09-30T21:50:31Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:c9:9c 10.100.0.10
Sep 30 21:50:31 compute-1 ovn_controller[94902]: 2025-09-30T21:50:31Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:c9:9c 10.100.0.10
Sep 30 21:50:32 compute-1 nova_compute[192795]: 2025-09-30 21:50:32.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:32 compute-1 nova_compute[192795]: 2025-09-30 21:50:32.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:33 compute-1 sshd-session[249016]: Failed password for root from 8.210.178.40 port 44914 ssh2
Sep 30 21:50:34 compute-1 nova_compute[192795]: 2025-09-30 21:50:34.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:34 compute-1 unix_chkpwd[249033]: password check failed for user (root)
Sep 30 21:50:35 compute-1 nova_compute[192795]: 2025-09-30 21:50:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:36 compute-1 podman[249034]: 2025-09-30 21:50:36.280524008 +0000 UTC m=+0.110619477 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:50:36 compute-1 nova_compute[192795]: 2025-09-30 21:50:36.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:37 compute-1 sshd-session[249016]: Failed password for root from 8.210.178.40 port 44914 ssh2
Sep 30 21:50:37 compute-1 nova_compute[192795]: 2025-09-30 21:50:37.283 2 INFO nova.compute.manager [None req-20cce841-5fcb-4859-92d0-e7adceeb6c59 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Get console output
Sep 30 21:50:37 compute-1 nova_compute[192795]: 2025-09-30 21:50:37.290 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:50:37 compute-1 nova_compute[192795]: 2025-09-30 21:50:37.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:38 compute-1 unix_chkpwd[249054]: password check failed for user (root)
Sep 30 21:50:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:38.708 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:38.710 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:38.711 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.116 2 INFO nova.compute.manager [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Rebuilding instance
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.624 2 DEBUG nova.compute.manager [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.743 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_requests' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.754 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.765 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.781 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.799 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:50:39 compute-1 nova_compute[192795]: 2025-09-30 21:50:39.805 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:50:40 compute-1 sshd-session[249016]: Failed password for root from 8.210.178.40 port 44914 ssh2
Sep 30 21:50:41 compute-1 unix_chkpwd[249055]: password check failed for user (root)
Sep 30 21:50:41 compute-1 kernel: tapafe949d7-00 (unregistering): left promiscuous mode
Sep 30 21:50:42 compute-1 NetworkManager[51724]: <info>  [1759269042.0046] device (tapafe949d7-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00684|binding|INFO|Releasing lport afe949d7-0062-4e9f-8390-230f0f7d8f19 from this chassis (sb_readonly=0)
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00685|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 down in Southbound
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00686|binding|INFO|Removing iface tapafe949d7-00 ovn-installed in OVS
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.034 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:c9:9c 10.100.0.10'], port_security=['fa:16:3e:2b:c9:9c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c6f47087-6db5-4113-b9d1-f72e8a71f342', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def4f612-8ce8-4228-a0e7-0d189e100661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b337c30-4f37-407b-8b6c-508e829086f6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=afe949d7-0062-4e9f-8390-230f0f7d8f19) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.037 103861 INFO neutron.agent.ovn.metadata.agent [-] Port afe949d7-0062-4e9f-8390-230f0f7d8f19 in datapath f39b5b05-2446-4ee5-b89a-a5b71519f1fb unbound from our chassis
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.040 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f39b5b05-2446-4ee5-b89a-a5b71519f1fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.042 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[07bd4a9a-52b4-402b-beb6-8401db294d55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.042 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb namespace which is not needed anymore
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Sep 30 21:50:42 compute-1 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a9.scope: Consumed 14.663s CPU time.
Sep 30 21:50:42 compute-1 systemd-machined[152783]: Machine qemu-77-instance-000000a9 terminated.
Sep 30 21:50:42 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[248916]: [NOTICE]   (248920) : haproxy version is 2.8.14-c23fe91
Sep 30 21:50:42 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[248916]: [NOTICE]   (248920) : path to executable is /usr/sbin/haproxy
Sep 30 21:50:42 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[248916]: [WARNING]  (248920) : Exiting Master process...
Sep 30 21:50:42 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[248916]: [ALERT]    (248920) : Current worker (248922) exited with code 143 (Terminated)
Sep 30 21:50:42 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[248916]: [WARNING]  (248920) : All workers exited. Exiting... (0)
Sep 30 21:50:42 compute-1 systemd[1]: libpod-a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b.scope: Deactivated successfully.
Sep 30 21:50:42 compute-1 conmon[248916]: conmon a459519678c3d6e5a8fb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b.scope/container/memory.events
Sep 30 21:50:42 compute-1 podman[249080]: 2025-09-30 21:50:42.243494293 +0000 UTC m=+0.074511283 container died a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:50:42 compute-1 kernel: tapafe949d7-00: entered promiscuous mode
Sep 30 21:50:42 compute-1 NetworkManager[51724]: <info>  [1759269042.2679] manager: (tapafe949d7-00): new Tun device (/org/freedesktop/NetworkManager/Devices/341)
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00687|binding|INFO|Claiming lport afe949d7-0062-4e9f-8390-230f0f7d8f19 for this chassis.
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00688|binding|INFO|afe949d7-0062-4e9f-8390-230f0f7d8f19: Claiming fa:16:3e:2b:c9:9c 10.100.0.10
Sep 30 21:50:42 compute-1 systemd-udevd[249060]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.281 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:c9:9c 10.100.0.10'], port_security=['fa:16:3e:2b:c9:9c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c6f47087-6db5-4113-b9d1-f72e8a71f342', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def4f612-8ce8-4228-a0e7-0d189e100661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b337c30-4f37-407b-8b6c-508e829086f6, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=afe949d7-0062-4e9f-8390-230f0f7d8f19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:50:42 compute-1 kernel: tapafe949d7-00 (unregistering): left promiscuous mode
Sep 30 21:50:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-131f8393e0387d4750f368491a6a42f46b3d400af5ddba78187b2371c6e0718b-merged.mount: Deactivated successfully.
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00689|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 ovn-installed in OVS
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00690|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 up in Southbound
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00691|binding|INFO|Releasing lport afe949d7-0062-4e9f-8390-230f0f7d8f19 from this chassis (sb_readonly=1)
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: hostname: compute-1
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00692|if_status|INFO|Dropped 1 log messages in last 615 seconds (most recently, 615 seconds ago) due to excessive rate
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00693|if_status|INFO|Not setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 down as sb is readonly
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00694|binding|INFO|Removing iface tapafe949d7-00 ovn-installed in OVS
Sep 30 21:50:42 compute-1 podman[249080]: 2025-09-30 21:50:42.317207673 +0000 UTC m=+0.148224593 container cleanup a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00695|binding|INFO|Releasing lport afe949d7-0062-4e9f-8390-230f0f7d8f19 from this chassis (sb_readonly=0)
Sep 30 21:50:42 compute-1 ovn_controller[94902]: 2025-09-30T21:50:42Z|00696|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 down in Southbound
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.345 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:c9:9c 10.100.0.10'], port_security=['fa:16:3e:2b:c9:9c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c6f47087-6db5-4113-b9d1-f72e8a71f342', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def4f612-8ce8-4228-a0e7-0d189e100661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b337c30-4f37-407b-8b6c-508e829086f6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=afe949d7-0062-4e9f-8390-230f0f7d8f19) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:42 compute-1 systemd[1]: libpod-conmon-a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b.scope: Deactivated successfully.
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 virtnodedevd[192537]: ethtool ioctl error on tapafe949d7-00: No such device
Sep 30 21:50:42 compute-1 podman[249126]: 2025-09-30 21:50:42.416535514 +0000 UTC m=+0.057273357 container remove a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.425 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[32ed9b14-4ad0-4529-aff2-dbc1605160c5]: (4, ('Tue Sep 30 09:50:42 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb (a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b)\na459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b\nTue Sep 30 09:50:42 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb (a459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b)\na459519678c3d6e5a8fb50012f972ec67bb649641b9fdc710480f2900dcfb26b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.427 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[13c67b5b-b1f2-4996-8198-f3775289eeb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.429 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf39b5b05-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 kernel: tapf39b5b05-20: left promiscuous mode
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.465 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[476d751a-811d-4f0a-bddd-b20f6917ebd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.492 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[efeece80-cf24-40e7-8d47-a380330a8880]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.494 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca47824-b566-4a66-978e-9080af41ea3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.521 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0c8015-4538-4239-855e-4715c353fd04]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570921, 'reachable_time': 43939, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249157, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.527 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.527 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[91bac2de-d98f-4681-b6b3-a1bf8ab1860d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 systemd[1]: run-netns-ovnmeta\x2df39b5b05\x2d2446\x2d4ee5\x2db89a\x2da5b71519f1fb.mount: Deactivated successfully.
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.529 103861 INFO neutron.agent.ovn.metadata.agent [-] Port afe949d7-0062-4e9f-8390-230f0f7d8f19 in datapath f39b5b05-2446-4ee5-b89a-a5b71519f1fb unbound from our chassis
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.531 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f39b5b05-2446-4ee5-b89a-a5b71519f1fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.532 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3326d2fe-c186-4c99-89a5-4dfdd53b3122]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.533 103861 INFO neutron.agent.ovn.metadata.agent [-] Port afe949d7-0062-4e9f-8390-230f0f7d8f19 in datapath f39b5b05-2446-4ee5-b89a-a5b71519f1fb unbound from our chassis
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.536 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f39b5b05-2446-4ee5-b89a-a5b71519f1fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:50:42 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:42.537 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb8d163-1957-4d26-bbf5-088619905c36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.563 2 DEBUG nova.compute.manager [req-98ea4d47-00c5-4a42-9f63-47e44a64dcf3 req-424792ae-31b3-41c7-81b4-5b38ffaf9304 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.564 2 DEBUG oslo_concurrency.lockutils [req-98ea4d47-00c5-4a42-9f63-47e44a64dcf3 req-424792ae-31b3-41c7-81b4-5b38ffaf9304 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.564 2 DEBUG oslo_concurrency.lockutils [req-98ea4d47-00c5-4a42-9f63-47e44a64dcf3 req-424792ae-31b3-41c7-81b4-5b38ffaf9304 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.564 2 DEBUG oslo_concurrency.lockutils [req-98ea4d47-00c5-4a42-9f63-47e44a64dcf3 req-424792ae-31b3-41c7-81b4-5b38ffaf9304 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.564 2 DEBUG nova.compute.manager [req-98ea4d47-00c5-4a42-9f63-47e44a64dcf3 req-424792ae-31b3-41c7-81b4-5b38ffaf9304 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.564 2 WARNING nova.compute.manager [req-98ea4d47-00c5-4a42-9f63-47e44a64dcf3 req-424792ae-31b3-41c7-81b4-5b38ffaf9304 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state rebuilding.
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.829 2 INFO nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance shutdown successfully after 3 seconds.
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.838 2 INFO nova.virt.libvirt.driver [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance destroyed successfully.
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.847 2 INFO nova.virt.libvirt.driver [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance destroyed successfully.
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.848 2 DEBUG nova.virt.libvirt.vif [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-565258952',display_name='tempest-TestNetworkAdvancedServerOps-server-565258952',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-565258952',id=169,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOp7qq5pR5n4XGNqI6kSLPTrrkD6Em1jYop0AGJR6ftdSM0oNQoVr+JgCc61Z23g5STfy7N8SKmBL8YamCFCSB1WwpAlW6PwTOkSNAiZLvHWVmc3Z3SkZqG4BYTe7NxzPA==',key_name='tempest-TestNetworkAdvancedServerOps-364751036',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:50:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-q1bnr3be',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:38Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c6f47087-6db5-4113-b9d1-f72e8a71f342,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.850 2 DEBUG nova.network.os_vif_util [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.852 2 DEBUG nova.network.os_vif_util [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.854 2 DEBUG os_vif [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.857 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe949d7-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.865 2 INFO os_vif [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00')
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.866 2 INFO nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Deleting instance files /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342_del
Sep 30 21:50:42 compute-1 nova_compute[192795]: 2025-09-30 21:50:42.868 2 INFO nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Deletion of /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342_del complete
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.187 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.188 2 INFO nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Creating image(s)
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.189 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.189 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.190 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.210 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.311 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.313 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.315 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.347 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.431 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.435 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.476 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.477 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.478 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.544 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.545 2 DEBUG nova.virt.disk.api [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.546 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.610 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.611 2 DEBUG nova.virt.disk.api [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.611 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.612 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Ensure instance console log exists: /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.612 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.613 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.613 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.615 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Start _get_guest_xml network_info=[{"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.622 2 WARNING nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.630 2 DEBUG nova.virt.libvirt.host [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.631 2 DEBUG nova.virt.libvirt.host [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.634 2 DEBUG nova.virt.libvirt.host [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.635 2 DEBUG nova.virt.libvirt.host [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.636 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.636 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.637 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.637 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.637 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.638 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.638 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.638 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.638 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.638 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.639 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.639 2 DEBUG nova.virt.hardware [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.639 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.666 2 DEBUG nova.virt.libvirt.vif [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-565258952',display_name='tempest-TestNetworkAdvancedServerOps-server-565258952',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-565258952',id=169,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOp7qq5pR5n4XGNqI6kSLPTrrkD6Em1jYop0AGJR6ftdSM0oNQoVr+JgCc61Z23g5STfy7N8SKmBL8YamCFCSB1WwpAlW6PwTOkSNAiZLvHWVmc3Z3SkZqG4BYTe7NxzPA==',key_name='tempest-TestNetworkAdvancedServerOps-364751036',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:50:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-q1bnr3be',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:43Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c6f47087-6db5-4113-b9d1-f72e8a71f342,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.667 2 DEBUG nova.network.os_vif_util [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.668 2 DEBUG nova.network.os_vif_util [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.670 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <uuid>c6f47087-6db5-4113-b9d1-f72e8a71f342</uuid>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <name>instance-000000a9</name>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-565258952</nova:name>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:50:43</nova:creationTime>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         <nova:port uuid="afe949d7-0062-4e9f-8390-230f0f7d8f19">
Sep 30 21:50:43 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <system>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <entry name="serial">c6f47087-6db5-4113-b9d1-f72e8a71f342</entry>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <entry name="uuid">c6f47087-6db5-4113-b9d1-f72e8a71f342</entry>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </system>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <os>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   </os>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <features>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   </features>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:2b:c9:9c"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <target dev="tapafe949d7-00"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/console.log" append="off"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <video>
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </video>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:50:43 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:50:43 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:50:43 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:50:43 compute-1 nova_compute[192795]: </domain>
Sep 30 21:50:43 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.671 2 DEBUG nova.virt.libvirt.vif [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-565258952',display_name='tempest-TestNetworkAdvancedServerOps-server-565258952',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-565258952',id=169,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOp7qq5pR5n4XGNqI6kSLPTrrkD6Em1jYop0AGJR6ftdSM0oNQoVr+JgCc61Z23g5STfy7N8SKmBL8YamCFCSB1WwpAlW6PwTOkSNAiZLvHWVmc3Z3SkZqG4BYTe7NxzPA==',key_name='tempest-TestNetworkAdvancedServerOps-364751036',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:50:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-q1bnr3be',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:43Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c6f47087-6db5-4113-b9d1-f72e8a71f342,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.671 2 DEBUG nova.network.os_vif_util [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.672 2 DEBUG nova.network.os_vif_util [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.672 2 DEBUG os_vif [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.673 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.673 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.678 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafe949d7-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.679 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapafe949d7-00, col_values=(('external_ids', {'iface-id': 'afe949d7-0062-4e9f-8390-230f0f7d8f19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:c9:9c', 'vm-uuid': 'c6f47087-6db5-4113-b9d1-f72e8a71f342'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:43 compute-1 NetworkManager[51724]: <info>  [1759269043.6824] manager: (tapafe949d7-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.688 2 INFO os_vif [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00')
Sep 30 21:50:43 compute-1 sshd-session[249016]: Failed password for root from 8.210.178.40 port 44914 ssh2
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.764 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.764 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.764 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:2b:c9:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.765 2 INFO nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Using config drive
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.785 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'ec2_ids' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:43 compute-1 podman[249175]: 2025-09-30 21:50:43.809178409 +0000 UTC m=+0.076097945 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:50:43 compute-1 nova_compute[192795]: 2025-09-30 21:50:43.821 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'keypairs' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:43 compute-1 podman[249177]: 2025-09-30 21:50:43.835364936 +0000 UTC m=+0.077976826 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:50:43 compute-1 podman[249176]: 2025-09-30 21:50:43.880144785 +0000 UTC m=+0.133347611 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.029 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c6f47087-6db5-4113-b9d1-f72e8a71f342', 'name': 'tempest-TestNetworkAdvancedServerOps-server-565258952', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a9', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'hostId': 'd73ba93ddabd5840a9ca9acf49e5b49c1c85506e029a4e9ede473ad6', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.030 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.033 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.035 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.036 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.037 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.038 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.038 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.038 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>]
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.039 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.039 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>]
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.041 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.041 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.042 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>]
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.043 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.043 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.045 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.046 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.048 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.049 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.050 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.050 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-565258952>]
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.051 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.051 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.053 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.054 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.056 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.056 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.057 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.057 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.058 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.058 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.059 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.061 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.062 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.062 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.064 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:50:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:50:44.065 12 DEBUG ceilometer.compute.pollsters [-] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-000000a9, id=c6f47087-6db5-4113-b9d1-f72e8a71f342>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:50:44 compute-1 nova_compute[192795]: 2025-09-30 21:50:44.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:44 compute-1 nova_compute[192795]: 2025-09-30 21:50:44.559 2 INFO nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Creating config drive at /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config
Sep 30 21:50:44 compute-1 nova_compute[192795]: 2025-09-30 21:50:44.564 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpklhso26o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:44 compute-1 nova_compute[192795]: 2025-09-30 21:50:44.708 2 DEBUG oslo_concurrency.processutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpklhso26o" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:44 compute-1 kernel: tapafe949d7-00: entered promiscuous mode
Sep 30 21:50:44 compute-1 NetworkManager[51724]: <info>  [1759269044.8171] manager: (tapafe949d7-00): new Tun device (/org/freedesktop/NetworkManager/Devices/343)
Sep 30 21:50:44 compute-1 ovn_controller[94902]: 2025-09-30T21:50:44Z|00697|binding|INFO|Claiming lport afe949d7-0062-4e9f-8390-230f0f7d8f19 for this chassis.
Sep 30 21:50:44 compute-1 ovn_controller[94902]: 2025-09-30T21:50:44Z|00698|binding|INFO|afe949d7-0062-4e9f-8390-230f0f7d8f19: Claiming fa:16:3e:2b:c9:9c 10.100.0.10
Sep 30 21:50:44 compute-1 nova_compute[192795]: 2025-09-30 21:50:44.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:44 compute-1 nova_compute[192795]: 2025-09-30 21:50:44.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:44 compute-1 NetworkManager[51724]: <info>  [1759269044.8415] device (tapafe949d7-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:50:44 compute-1 NetworkManager[51724]: <info>  [1759269044.8440] device (tapafe949d7-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:50:44 compute-1 nova_compute[192795]: 2025-09-30 21:50:44.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:44 compute-1 NetworkManager[51724]: <info>  [1759269044.8485] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Sep 30 21:50:44 compute-1 NetworkManager[51724]: <info>  [1759269044.8498] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.858 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:c9:9c 10.100.0.10'], port_security=['fa:16:3e:2b:c9:9c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c6f47087-6db5-4113-b9d1-f72e8a71f342', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'def4f612-8ce8-4228-a0e7-0d189e100661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b337c30-4f37-407b-8b6c-508e829086f6, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=afe949d7-0062-4e9f-8390-230f0f7d8f19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.860 103861 INFO neutron.agent.ovn.metadata.agent [-] Port afe949d7-0062-4e9f-8390-230f0f7d8f19 in datapath f39b5b05-2446-4ee5-b89a-a5b71519f1fb bound to our chassis
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.863 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f39b5b05-2446-4ee5-b89a-a5b71519f1fb
Sep 30 21:50:44 compute-1 systemd-machined[152783]: New machine qemu-78-instance-000000a9.
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.881 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba64191-b43a-4fd8-a40e-46c415a4146e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.884 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf39b5b05-21 in ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.886 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf39b5b05-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.886 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6a2bc769-ac18-469b-b274-d842877f1da4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.887 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cbde6b49-b4e8-4d19-ac89-7f4368af7d92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.903 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[02496d58-9043-4c66-b160-f2d98226ae41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.932 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e02e9170-eb68-43a4-a0df-8ba6711a9b40]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:44 compute-1 systemd[1]: Started Virtual Machine qemu-78-instance-000000a9.
Sep 30 21:50:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:44.974 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5617ad-d49f-4ed6-b40a-502fc0495fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 NetworkManager[51724]: <info>  [1759269045.0025] manager: (tapf39b5b05-20): new Veth device (/org/freedesktop/NetworkManager/Devices/346)
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.001 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3ce8d3-9100-4400-9992-ba1c632ea024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.025 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.026 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.026 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.026 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.026 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.027 2 WARNING nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state rebuild_spawning.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.027 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.027 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.027 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.028 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.028 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.029 2 WARNING nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state rebuild_spawning.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.029 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.029 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.030 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.030 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.030 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.030 2 WARNING nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state rebuild_spawning.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.030 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.031 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.032 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.032 2 DEBUG oslo_concurrency.lockutils [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.032 2 DEBUG nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.032 2 WARNING nova.compute.manager [req-d2c60524-1ea6-48fa-bf0c-74d0633c0df7 req-0111d952-cc65-482b-b7bd-540ab35d99ce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state rebuild_spawning.
Sep 30 21:50:45 compute-1 ovn_controller[94902]: 2025-09-30T21:50:45Z|00699|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 ovn-installed in OVS
Sep 30 21:50:45 compute-1 ovn_controller[94902]: 2025-09-30T21:50:45Z|00700|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 up in Southbound
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.052 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[be8b8c43-8321-4b97-bc20-d034b0811351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.056 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[51ac57a4-6b68-409e-a916-6ce19a055816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 unix_chkpwd[249278]: password check failed for user (root)
Sep 30 21:50:45 compute-1 NetworkManager[51724]: <info>  [1759269045.1009] device (tapf39b5b05-20): carrier: link connected
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.109 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9a12b236-1efe-496d-8b15-1571aeea9b5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.137 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cafbcff4-13ec-4e81-ba51-1735769355bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf39b5b05-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:00:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574077, 'reachable_time': 25152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249294, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.162 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8c16a65e-eb50-4ee0-9aee-119e134ae06e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:d8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574077, 'tstamp': 574077}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249295, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.198 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[45dbea9f-20a2-40a9-8daa-f325c36b2d58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf39b5b05-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:00:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574077, 'reachable_time': 25152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249296, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.242 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0c078e13-21f8-4b9d-9684-56d172d6a659]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.319 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[723a2fea-4a6d-456f-868d-d7143eef8ee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.321 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf39b5b05-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.322 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.322 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf39b5b05-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 kernel: tapf39b5b05-20: entered promiscuous mode
Sep 30 21:50:45 compute-1 NetworkManager[51724]: <info>  [1759269045.3305] manager: (tapf39b5b05-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.336 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf39b5b05-20, col_values=(('external_ids', {'iface-id': 'bf7b0eda-a578-4475-a0ec-15a8369b7d09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 ovn_controller[94902]: 2025-09-30T21:50:45Z|00701|binding|INFO|Releasing lport bf7b0eda-a578-4475-a0ec-15a8369b7d09 from this chassis (sb_readonly=0)
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.340 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.341 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4b43b0-5f7a-41e9-b7da-906ac9b04a51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.342 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-f39b5b05-2446-4ee5-b89a-a5b71519f1fb
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.pid.haproxy
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID f39b5b05-2446-4ee5-b89a-a5b71519f1fb
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:50:45 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:45.343 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'env', 'PROCESS_TAG=haproxy-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f39b5b05-2446-4ee5-b89a-a5b71519f1fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 podman[249335]: 2025-09-30 21:50:45.782151421 +0000 UTC m=+0.095353666 container create 05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.792 2 DEBUG nova.compute.manager [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.795 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.796 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for c6f47087-6db5-4113-b9d1-f72e8a71f342 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.797 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269045.7935972, c6f47087-6db5-4113-b9d1-f72e8a71f342 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.797 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] VM Resumed (Lifecycle Event)
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.805 2 INFO nova.virt.libvirt.driver [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance spawned successfully.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.806 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:50:45 compute-1 podman[249335]: 2025-09-30 21:50:45.730157417 +0000 UTC m=+0.043359712 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.828 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:45 compute-1 systemd[1]: Started libpod-conmon-05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5.scope.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.841 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.848 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.849 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.849 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.852 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.852 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.853 2 DEBUG nova.virt.libvirt.driver [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:45 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.895 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.896 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269045.7937276, c6f47087-6db5-4113-b9d1-f72e8a71f342 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.896 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] VM Started (Lifecycle Event)
Sep 30 21:50:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b26c2858af63c41e84bf78a2a4fa8f84aaeb4d7bf5f0e824752ecf0718e92ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:50:45 compute-1 podman[249335]: 2025-09-30 21:50:45.917679999 +0000 UTC m=+0.230882254 container init 05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:50:45 compute-1 podman[249335]: 2025-09-30 21:50:45.927857364 +0000 UTC m=+0.241059589 container start 05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.935 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.940 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:50:45 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[249350]: [NOTICE]   (249354) : New worker (249356) forked
Sep 30 21:50:45 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[249350]: [NOTICE]   (249354) : Loading success.
Sep 30 21:50:45 compute-1 nova_compute[192795]: 2025-09-30 21:50:45.983 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:50:46 compute-1 nova_compute[192795]: 2025-09-30 21:50:46.072 2 DEBUG nova.compute.manager [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:46 compute-1 nova_compute[192795]: 2025-09-30 21:50:46.311 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:46 compute-1 nova_compute[192795]: 2025-09-30 21:50:46.313 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:46 compute-1 nova_compute[192795]: 2025-09-30 21:50:46.314 2 DEBUG nova.objects.instance [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:50:46 compute-1 nova_compute[192795]: 2025-09-30 21:50:46.420 2 DEBUG oslo_concurrency.lockutils [None req-6c76af1f-fddb-4d4f-8437-e7535b5f7fb2 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:47 compute-1 sshd-session[249016]: Failed password for root from 8.210.178.40 port 44914 ssh2
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.459 2 DEBUG nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.460 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.460 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.461 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.461 2 DEBUG nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.461 2 WARNING nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state None.
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.462 2 DEBUG nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.462 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.462 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.463 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.463 2 DEBUG nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.464 2 WARNING nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state None.
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.464 2 DEBUG nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.464 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.464 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.465 2 DEBUG oslo_concurrency.lockutils [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.465 2 DEBUG nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:47 compute-1 nova_compute[192795]: 2025-09-30 21:50:47.465 2 WARNING nova.compute.manager [req-dbffcb9f-48fd-4378-986d-32b2b78717f5 req-86fea276-88ce-45ed-a80b-5d64f3f7135e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state active and task_state None.
Sep 30 21:50:48 compute-1 sshd-session[249016]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 44914 ssh2 [preauth]
Sep 30 21:50:48 compute-1 sshd-session[249016]: Disconnecting authenticating user root 8.210.178.40 port 44914: Too many authentication failures [preauth]
Sep 30 21:50:48 compute-1 sshd-session[249016]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:48 compute-1 sshd-session[249016]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:50:48 compute-1 nova_compute[192795]: 2025-09-30 21:50:48.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:49 compute-1 nova_compute[192795]: 2025-09-30 21:50:49.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:49 compute-1 unix_chkpwd[249367]: password check failed for user (root)
Sep 30 21:50:49 compute-1 sshd-session[249365]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:50 compute-1 nova_compute[192795]: 2025-09-30 21:50:50.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:50.419 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:50 compute-1 nova_compute[192795]: 2025-09-30 21:50:50.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:50 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:50.420 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:50:50 compute-1 nova_compute[192795]: 2025-09-30 21:50:50.821 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:50 compute-1 nova_compute[192795]: 2025-09-30 21:50:50.821 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:50 compute-1 nova_compute[192795]: 2025-09-30 21:50:50.840 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.257 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.258 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.269 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.270 2 INFO nova.compute.claims [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:50:51 compute-1 podman[249368]: 2025-09-30 21:50:51.283630377 +0000 UTC m=+0.118711626 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:50:51 compute-1 sshd-session[249365]: Failed password for root from 8.210.178.40 port 45704 ssh2
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.507 2 DEBUG nova.compute.provider_tree [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.527 2 DEBUG nova.scheduler.client.report [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.550 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.551 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.626 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.627 2 DEBUG nova.network.neutron [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.652 2 INFO nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.688 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:50:51 compute-1 unix_chkpwd[249390]: password check failed for user (root)
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.927 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.930 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.931 2 INFO nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Creating image(s)
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.932 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "/var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.933 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "/var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.935 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "/var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:51 compute-1 nova_compute[192795]: 2025-09-30 21:50:51.968 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.203 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.235s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.204 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.205 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.221 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.284 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.286 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.325 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.327 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.328 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.394 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.395 2 DEBUG nova.virt.disk.api [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Checking if we can resize image /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.395 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.467 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.468 2 DEBUG nova.virt.disk.api [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Cannot resize image /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.469 2 DEBUG nova.objects.instance [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lazy-loading 'migration_context' on Instance uuid dcdbc5a3-6788-4e70-b948-140a8abb3dfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.490 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.491 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Ensure instance console log exists: /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.491 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.492 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:52 compute-1 nova_compute[192795]: 2025-09-30 21:50:52.492 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:53 compute-1 nova_compute[192795]: 2025-09-30 21:50:53.207 2 DEBUG nova.network.neutron [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Successfully created port: fbb18477-f866-4ee6-b569-c606842c92ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:50:53 compute-1 nova_compute[192795]: 2025-09-30 21:50:53.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:53 compute-1 sshd-session[249365]: Failed password for root from 8.210.178.40 port 45704 ssh2
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.248 2 DEBUG nova.network.neutron [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Successfully updated port: fbb18477-f866-4ee6-b569-c606842c92ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.269 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "refresh_cache-dcdbc5a3-6788-4e70-b948-140a8abb3dfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.269 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquired lock "refresh_cache-dcdbc5a3-6788-4e70-b948-140a8abb3dfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.270 2 DEBUG nova.network.neutron [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.456 2 DEBUG nova.compute.manager [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received event network-changed-fbb18477-f866-4ee6-b569-c606842c92ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.456 2 DEBUG nova.compute.manager [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Refreshing instance network info cache due to event network-changed-fbb18477-f866-4ee6-b569-c606842c92ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.457 2 DEBUG oslo_concurrency.lockutils [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-dcdbc5a3-6788-4e70-b948-140a8abb3dfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.584 2 DEBUG nova.network.neutron [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.652 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.686 2 WARNING nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.686 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid dcdbc5a3-6788-4e70-b948-140a8abb3dfa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.687 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Triggering sync for uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.687 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.688 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.688 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:54 compute-1 nova_compute[192795]: 2025-09-30 21:50:54.723 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:55 compute-1 nova_compute[192795]: 2025-09-30 21:50:55.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:55 compute-1 unix_chkpwd[249406]: password check failed for user (root)
Sep 30 21:50:57 compute-1 sshd-session[249365]: Failed password for root from 8.210.178.40 port 45704 ssh2
Sep 30 21:50:57 compute-1 podman[249407]: 2025-09-30 21:50:57.239540872 +0000 UTC m=+0.068165832 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal)
Sep 30 21:50:57 compute-1 podman[249408]: 2025-09-30 21:50:57.253268581 +0000 UTC m=+0.072028935 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:50:57 compute-1 podman[249413]: 2025-09-30 21:50:57.25875379 +0000 UTC m=+0.074092302 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.804 2 DEBUG nova.network.neutron [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Updating instance_info_cache with network_info: [{"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.846 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Releasing lock "refresh_cache-dcdbc5a3-6788-4e70-b948-140a8abb3dfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.849 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Instance network_info: |[{"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.852 2 DEBUG oslo_concurrency.lockutils [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-dcdbc5a3-6788-4e70-b948-140a8abb3dfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.854 2 DEBUG nova.network.neutron [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Refreshing network info cache for port fbb18477-f866-4ee6-b569-c606842c92ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.859 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Start _get_guest_xml network_info=[{"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.866 2 WARNING nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.873 2 DEBUG nova.virt.libvirt.host [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.874 2 DEBUG nova.virt.libvirt.host [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.878 2 DEBUG nova.virt.libvirt.host [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.879 2 DEBUG nova.virt.libvirt.host [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.880 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.881 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.881 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.882 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.882 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.883 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.883 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.883 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.884 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.884 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.885 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.885 2 DEBUG nova.virt.hardware [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.891 2 DEBUG nova.virt.libvirt.vif [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:50:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1112960763',display_name='tempest-TestServerMultinode-server-1112960763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1112960763',id=172,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='03a9c17c97284fddad18a4babc1ac469',ramdisk_id='',reservation_id='r-yxqjm2tg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-104702739',owner_user_name='tempest-TestServerMultinode-104702739
-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:51Z,user_data=None,user_id='39fdea9ccc694a1aa18451c9c7b3bdcc',uuid=dcdbc5a3-6788-4e70-b948-140a8abb3dfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.891 2 DEBUG nova.network.os_vif_util [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converting VIF {"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.892 2 DEBUG nova.network.os_vif_util [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=fbb18477-f866-4ee6-b569-c606842c92ad,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb18477-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.893 2 DEBUG nova.objects.instance [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lazy-loading 'pci_devices' on Instance uuid dcdbc5a3-6788-4e70-b948-140a8abb3dfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.917 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <uuid>dcdbc5a3-6788-4e70-b948-140a8abb3dfa</uuid>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <name>instance-000000ac</name>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <nova:name>tempest-TestServerMultinode-server-1112960763</nova:name>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:50:57</nova:creationTime>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:user uuid="39fdea9ccc694a1aa18451c9c7b3bdcc">tempest-TestServerMultinode-104702739-project-admin</nova:user>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:project uuid="03a9c17c97284fddad18a4babc1ac469">tempest-TestServerMultinode-104702739</nova:project>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         <nova:port uuid="fbb18477-f866-4ee6-b569-c606842c92ad">
Sep 30 21:50:57 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <system>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <entry name="serial">dcdbc5a3-6788-4e70-b948-140a8abb3dfa</entry>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <entry name="uuid">dcdbc5a3-6788-4e70-b948-140a8abb3dfa</entry>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </system>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <os>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   </os>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <features>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   </features>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk.config"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:52:5e:54"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <target dev="tapfbb18477-f8"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/console.log" append="off"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <video>
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </video>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:50:57 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:50:57 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:50:57 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:50:57 compute-1 nova_compute[192795]: </domain>
Sep 30 21:50:57 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.918 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Preparing to wait for external event network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.919 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.919 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.919 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.920 2 DEBUG nova.virt.libvirt.vif [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:50:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1112960763',display_name='tempest-TestServerMultinode-server-1112960763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1112960763',id=172,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='03a9c17c97284fddad18a4babc1ac469',ramdisk_id='',reservation_id='r-yxqjm2tg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-104702739',owner_user_name='tempest-TestServerMultinode
-104702739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:51Z,user_data=None,user_id='39fdea9ccc694a1aa18451c9c7b3bdcc',uuid=dcdbc5a3-6788-4e70-b948-140a8abb3dfa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.920 2 DEBUG nova.network.os_vif_util [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converting VIF {"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.921 2 DEBUG nova.network.os_vif_util [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=fbb18477-f866-4ee6-b569-c606842c92ad,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb18477-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.921 2 DEBUG os_vif [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=fbb18477-f866-4ee6-b569-c606842c92ad,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb18477-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.922 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.922 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.925 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbb18477-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.925 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfbb18477-f8, col_values=(('external_ids', {'iface-id': 'fbb18477-f866-4ee6-b569-c606842c92ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:5e:54', 'vm-uuid': 'dcdbc5a3-6788-4e70-b948-140a8abb3dfa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:57 compute-1 NetworkManager[51724]: <info>  [1759269057.9284] manager: (tapfbb18477-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:57 compute-1 nova_compute[192795]: 2025-09-30 21:50:57.938 2 INFO os_vif [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=fbb18477-f866-4ee6-b569-c606842c92ad,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb18477-f8')
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.032 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.033 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.033 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] No VIF found with MAC fa:16:3e:52:5e:54, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.034 2 INFO nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Using config drive
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.731 2 INFO nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Creating config drive at /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk.config
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.741 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwlc46cxu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:58 compute-1 unix_chkpwd[249489]: password check failed for user (root)
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.876 2 DEBUG oslo_concurrency.processutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwlc46cxu" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:58 compute-1 kernel: tapfbb18477-f8: entered promiscuous mode
Sep 30 21:50:58 compute-1 NetworkManager[51724]: <info>  [1759269058.9741] manager: (tapfbb18477-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:58 compute-1 ovn_controller[94902]: 2025-09-30T21:50:58Z|00702|binding|INFO|Claiming lport fbb18477-f866-4ee6-b569-c606842c92ad for this chassis.
Sep 30 21:50:58 compute-1 ovn_controller[94902]: 2025-09-30T21:50:58Z|00703|binding|INFO|fbb18477-f866-4ee6-b569-c606842c92ad: Claiming fa:16:3e:52:5e:54 10.100.0.10
Sep 30 21:50:58 compute-1 nova_compute[192795]: 2025-09-30 21:50:58.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:59 compute-1 ovn_controller[94902]: 2025-09-30T21:50:58Z|00704|binding|INFO|Setting lport fbb18477-f866-4ee6-b569-c606842c92ad ovn-installed in OVS
Sep 30 21:50:59 compute-1 ovn_controller[94902]: 2025-09-30T21:50:59Z|00705|binding|INFO|Setting lport fbb18477-f866-4ee6-b569-c606842c92ad up in Southbound
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.002 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:5e:54 10.100.0.10'], port_security=['fa:16:3e:52:5e:54 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'dcdbc5a3-6788-4e70-b948-140a8abb3dfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '03a9c17c97284fddad18a4babc1ac469', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ea43e653-1ffa-4770-a68a-2b8fd3d4fb27', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2923dc02-5672-47c9-85d4-ce6c19aec13e, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=fbb18477-f866-4ee6-b569-c606842c92ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.006 103861 INFO neutron.agent.ovn.metadata.agent [-] Port fbb18477-f866-4ee6-b569-c606842c92ad in datapath 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff bound to our chassis
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.011 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff
Sep 30 21:50:59 compute-1 systemd-udevd[249504]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.029 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f7790144-e71b-4b96-8eee-7fd6fc7e43ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.031 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b3b9e17-81 in ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:50:59 compute-1 NetworkManager[51724]: <info>  [1759269059.0341] device (tapfbb18477-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:50:59 compute-1 NetworkManager[51724]: <info>  [1759269059.0351] device (tapfbb18477-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.036 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b3b9e17-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.037 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5752c7-b848-4ffe-bd1d-539fa9efde52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 systemd-machined[152783]: New machine qemu-79-instance-000000ac.
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.041 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1f27c904-1ff4-4b53-91dc-5a37e08c07de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 systemd[1]: Started Virtual Machine qemu-79-instance-000000ac.
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.059 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[34312a74-ae20-4f4b-ac6a-2141cbc25d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.079 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[837b3443-606f-4234-9975-b996096fff78]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.115 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ca96bd-1d46-4ed7-9bba-0eb6cbc2e756]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 systemd-udevd[249508]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.125 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0fdce45a-9d54-4772-8f32-f42d0503dfbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 NetworkManager[51724]: <info>  [1759269059.1279] manager: (tap7b3b9e17-80): new Veth device (/org/freedesktop/NetworkManager/Devices/350)
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.171 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4a484d68-639a-4ced-acb8-c8500ba9748b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.175 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[df2b956b-98b9-4831-b049-2c6f0ee7141d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 NetworkManager[51724]: <info>  [1759269059.2079] device (tap7b3b9e17-80): carrier: link connected
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.218 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[10c400e8-4e4b-4686-a643-13e67d9d4d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.244 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[91246c1e-5957-4dec-9566-462e7d93aad6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b3b9e17-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:22:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575488, 'reachable_time': 32512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249538, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.269 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f904bc49-b273-4eb3-85e8-f658cf26dea1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:229d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575488, 'tstamp': 575488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249539, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.299 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcaa107-a3b7-4d69-ba0e-6719a2522c2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b3b9e17-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:22:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575488, 'reachable_time': 32512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249542, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.350 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed443bc-6d15-4c45-b242-57acb1b627ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.423 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.440 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee702fd-73ee-4220-8133-69897be7c541]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.442 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b3b9e17-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.442 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.443 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b3b9e17-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:59 compute-1 kernel: tap7b3b9e17-80: entered promiscuous mode
Sep 30 21:50:59 compute-1 NetworkManager[51724]: <info>  [1759269059.4473] manager: (tap7b3b9e17-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.452 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b3b9e17-80, col_values=(('external_ids', {'iface-id': '2454d4c5-8981-43c0-9c8f-e79a575dc065'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:59 compute-1 ovn_controller[94902]: 2025-09-30T21:50:59Z|00706|binding|INFO|Releasing lport 2454d4c5-8981-43c0-9c8f-e79a575dc065 from this chassis (sb_readonly=0)
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.455 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.456 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c8016d74-67c9-4b55-a358-540812bf09c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.457 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.pid.haproxy
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:50:59 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:50:59.458 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'env', 'PROCESS_TAG=haproxy-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:59 compute-1 ovn_controller[94902]: 2025-09-30T21:50:59Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:c9:9c 10.100.0.10
Sep 30 21:50:59 compute-1 ovn_controller[94902]: 2025-09-30T21:50:59Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:c9:9c 10.100.0.10
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.604 2 DEBUG nova.compute.manager [req-43a04df0-f48c-415e-85dd-e665be40ead4 req-fd449c59-8183-4f33-a02a-72b379ff0095 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received event network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.605 2 DEBUG oslo_concurrency.lockutils [req-43a04df0-f48c-415e-85dd-e665be40ead4 req-fd449c59-8183-4f33-a02a-72b379ff0095 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.605 2 DEBUG oslo_concurrency.lockutils [req-43a04df0-f48c-415e-85dd-e665be40ead4 req-fd449c59-8183-4f33-a02a-72b379ff0095 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.606 2 DEBUG oslo_concurrency.lockutils [req-43a04df0-f48c-415e-85dd-e665be40ead4 req-fd449c59-8183-4f33-a02a-72b379ff0095 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.606 2 DEBUG nova.compute.manager [req-43a04df0-f48c-415e-85dd-e665be40ead4 req-fd449c59-8183-4f33-a02a-72b379ff0095 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Processing event network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.842 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269059.842175, dcdbc5a3-6788-4e70-b948-140a8abb3dfa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.843 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] VM Started (Lifecycle Event)
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.845 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.848 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.854 2 INFO nova.virt.libvirt.driver [-] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Instance spawned successfully.
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.854 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.875 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.880 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:50:59 compute-1 podman[249579]: 2025-09-30 21:50:59.887121614 +0000 UTC m=+0.074067460 container create 1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.897 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.898 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.899 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.899 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.900 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.901 2 DEBUG nova.virt.libvirt.driver [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:59 compute-1 systemd[1]: Started libpod-conmon-1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54.scope.
Sep 30 21:50:59 compute-1 podman[249579]: 2025-09-30 21:50:59.843404864 +0000 UTC m=+0.030350710 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.939 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.940 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269059.8423316, dcdbc5a3-6788-4e70-b948-140a8abb3dfa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.940 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] VM Paused (Lifecycle Event)
Sep 30 21:50:59 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:50:59 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d7e4ff844eb6f2f2ec24bde0f2f2e9a0bd26de4848d17bc276101e592e537f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.980 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.986 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269059.847769, dcdbc5a3-6788-4e70-b948-140a8abb3dfa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:59 compute-1 nova_compute[192795]: 2025-09-30 21:50:59.987 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] VM Resumed (Lifecycle Event)
Sep 30 21:50:59 compute-1 podman[249579]: 2025-09-30 21:50:59.991928554 +0000 UTC m=+0.178874400 container init 1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:50:59 compute-1 podman[249579]: 2025-09-30 21:50:59.997570666 +0000 UTC m=+0.184516492 container start 1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:51:00 compute-1 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[249594]: [NOTICE]   (249598) : New worker (249600) forked
Sep 30 21:51:00 compute-1 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[249594]: [NOTICE]   (249598) : Loading success.
Sep 30 21:51:00 compute-1 nova_compute[192795]: 2025-09-30 21:51:00.027 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:00 compute-1 nova_compute[192795]: 2025-09-30 21:51:00.033 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:00 compute-1 nova_compute[192795]: 2025-09-30 21:51:00.069 2 INFO nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Took 8.14 seconds to spawn the instance on the hypervisor.
Sep 30 21:51:00 compute-1 nova_compute[192795]: 2025-09-30 21:51:00.069 2 DEBUG nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:00 compute-1 nova_compute[192795]: 2025-09-30 21:51:00.076 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:51:00 compute-1 nova_compute[192795]: 2025-09-30 21:51:00.725 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:00 compute-1 sshd-session[249365]: Failed password for root from 8.210.178.40 port 45704 ssh2
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.116 2 INFO nova.compute.manager [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Took 10.18 seconds to build instance.
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.205 2 DEBUG oslo_concurrency.lockutils [None req-8e612d12-9b40-4bdd-84df-c3b981c8f16f 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.206 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 6.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.206 2 INFO nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.207 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.925 2 DEBUG nova.network.neutron [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Updated VIF entry in instance network info cache for port fbb18477-f866-4ee6-b569-c606842c92ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.926 2 DEBUG nova.network.neutron [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Updating instance_info_cache with network_info: [{"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:01 compute-1 nova_compute[192795]: 2025-09-30 21:51:01.964 2 DEBUG oslo_concurrency.lockutils [req-1911b4a3-f73e-42e8-af63-109c038d3295 req-debde45c-84bc-4172-9223-b234c58169b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-dcdbc5a3-6788-4e70-b948-140a8abb3dfa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:02 compute-1 unix_chkpwd[249610]: password check failed for user (root)
Sep 30 21:51:02 compute-1 nova_compute[192795]: 2025-09-30 21:51:02.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.574 2 DEBUG nova.compute.manager [req-e51dddf6-b451-416a-b99d-0cbff04a753e req-3ed5b95e-e08e-4069-8fd4-ae3e615e6e2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received event network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.575 2 DEBUG oslo_concurrency.lockutils [req-e51dddf6-b451-416a-b99d-0cbff04a753e req-3ed5b95e-e08e-4069-8fd4-ae3e615e6e2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.575 2 DEBUG oslo_concurrency.lockutils [req-e51dddf6-b451-416a-b99d-0cbff04a753e req-3ed5b95e-e08e-4069-8fd4-ae3e615e6e2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.576 2 DEBUG oslo_concurrency.lockutils [req-e51dddf6-b451-416a-b99d-0cbff04a753e req-3ed5b95e-e08e-4069-8fd4-ae3e615e6e2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.576 2 DEBUG nova.compute.manager [req-e51dddf6-b451-416a-b99d-0cbff04a753e req-3ed5b95e-e08e-4069-8fd4-ae3e615e6e2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] No waiting events found dispatching network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.576 2 WARNING nova.compute.manager [req-e51dddf6-b451-416a-b99d-0cbff04a753e req-3ed5b95e-e08e-4069-8fd4-ae3e615e6e2d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received unexpected event network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad for instance with vm_state active and task_state None.
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.738 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.739 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.739 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.740 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.873 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.975 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:03 compute-1 nova_compute[192795]: 2025-09-30 21:51:03.977 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.075 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.087 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.165 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.166 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.240 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.512 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.514 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5358MB free_disk=73.27132415771484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.515 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:04 compute-1 nova_compute[192795]: 2025-09-30 21:51:04.515 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:04 compute-1 sshd-session[249365]: Failed password for root from 8.210.178.40 port 45704 ssh2
Sep 30 21:51:05 compute-1 unix_chkpwd[249624]: password check failed for user (root)
Sep 30 21:51:06 compute-1 nova_compute[192795]: 2025-09-30 21:51:06.841 2 INFO nova.compute.manager [None req-43e81619-20fc-44fc-ba1d-7373e9397351 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Get console output
Sep 30 21:51:06 compute-1 nova_compute[192795]: 2025-09-30 21:51:06.852 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:51:06 compute-1 nova_compute[192795]: 2025-09-30 21:51:06.896 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:06 compute-1 nova_compute[192795]: 2025-09-30 21:51:06.897 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:06 compute-1 nova_compute[192795]: 2025-09-30 21:51:06.898 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:06 compute-1 nova_compute[192795]: 2025-09-30 21:51:06.898 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:06 compute-1 nova_compute[192795]: 2025-09-30 21:51:06.900 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.080 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.081 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance dcdbc5a3-6788-4e70-b948-140a8abb3dfa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.081 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.082 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.133 2 INFO nova.compute.manager [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Terminating instance
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.167 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:07 compute-1 podman[249625]: 2025-09-30 21:51:07.251785059 +0000 UTC m=+0.082927900 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.305 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.535 2 DEBUG nova.compute.manager [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:51:07 compute-1 kernel: tapfbb18477-f8 (unregistering): left promiscuous mode
Sep 30 21:51:07 compute-1 NetworkManager[51724]: <info>  [1759269067.5612] device (tapfbb18477-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.563 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.564 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:07 compute-1 ovn_controller[94902]: 2025-09-30T21:51:07Z|00707|binding|INFO|Releasing lport fbb18477-f866-4ee6-b569-c606842c92ad from this chassis (sb_readonly=0)
Sep 30 21:51:07 compute-1 ovn_controller[94902]: 2025-09-30T21:51:07Z|00708|binding|INFO|Setting lport fbb18477-f866-4ee6-b569-c606842c92ad down in Southbound
Sep 30 21:51:07 compute-1 ovn_controller[94902]: 2025-09-30T21:51:07Z|00709|binding|INFO|Removing iface tapfbb18477-f8 ovn-installed in OVS
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:07 compute-1 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ac.scope: Deactivated successfully.
Sep 30 21:51:07 compute-1 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ac.scope: Consumed 8.494s CPU time.
Sep 30 21:51:07 compute-1 systemd-machined[152783]: Machine qemu-79-instance-000000ac terminated.
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.616 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:5e:54 10.100.0.10'], port_security=['fa:16:3e:52:5e:54 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'dcdbc5a3-6788-4e70-b948-140a8abb3dfa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '03a9c17c97284fddad18a4babc1ac469', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ea43e653-1ffa-4770-a68a-2b8fd3d4fb27', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2923dc02-5672-47c9-85d4-ce6c19aec13e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=fbb18477-f866-4ee6-b569-c606842c92ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.619 103861 INFO neutron.agent.ovn.metadata.agent [-] Port fbb18477-f866-4ee6-b569-c606842c92ad in datapath 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff unbound from our chassis
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.621 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.622 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[75826204-4d92-43be-bb86-44e36f1c83a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.623 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff namespace which is not needed anymore
Sep 30 21:51:07 compute-1 sshd-session[249365]: Failed password for root from 8.210.178.40 port 45704 ssh2
Sep 30 21:51:07 compute-1 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[249594]: [NOTICE]   (249598) : haproxy version is 2.8.14-c23fe91
Sep 30 21:51:07 compute-1 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[249594]: [NOTICE]   (249598) : path to executable is /usr/sbin/haproxy
Sep 30 21:51:07 compute-1 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[249594]: [WARNING]  (249598) : Exiting Master process...
Sep 30 21:51:07 compute-1 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[249594]: [ALERT]    (249598) : Current worker (249600) exited with code 143 (Terminated)
Sep 30 21:51:07 compute-1 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[249594]: [WARNING]  (249598) : All workers exited. Exiting... (0)
Sep 30 21:51:07 compute-1 systemd[1]: libpod-1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54.scope: Deactivated successfully.
Sep 30 21:51:07 compute-1 podman[249671]: 2025-09-30 21:51:07.792561918 +0000 UTC m=+0.059013055 container died 1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:51:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54-userdata-shm.mount: Deactivated successfully.
Sep 30 21:51:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-5d7e4ff844eb6f2f2ec24bde0f2f2e9a0bd26de4848d17bc276101e592e537f0-merged.mount: Deactivated successfully.
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.854 2 INFO nova.virt.libvirt.driver [-] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Instance destroyed successfully.
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.856 2 DEBUG nova.objects.instance [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lazy-loading 'resources' on Instance uuid dcdbc5a3-6788-4e70-b948-140a8abb3dfa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:07 compute-1 podman[249671]: 2025-09-30 21:51:07.863110312 +0000 UTC m=+0.129561449 container cleanup 1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:51:07 compute-1 systemd[1]: libpod-conmon-1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54.scope: Deactivated successfully.
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.930 2 DEBUG nova.virt.libvirt.vif [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:50:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1112960763',display_name='tempest-TestServerMultinode-server-1112960763',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1112960763',id=172,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='03a9c17c97284fddad18a4babc1ac469',ramdisk_id='',reservation_id='r-yxqjm2tg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-104702739',owner_user_name='tempest-TestServerMultinode-104702739-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:51:00Z,user_data=None,user_id='39fdea9ccc694a1aa18451c9c7b3bdcc',uuid=dcdbc5a3-6788-4e70-b948-140a8abb3dfa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.931 2 DEBUG nova.network.os_vif_util [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converting VIF {"id": "fbb18477-f866-4ee6-b569-c606842c92ad", "address": "fa:16:3e:52:5e:54", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb18477-f8", "ovs_interfaceid": "fbb18477-f866-4ee6-b569-c606842c92ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.932 2 DEBUG nova.network.os_vif_util [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=fbb18477-f866-4ee6-b569-c606842c92ad,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb18477-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.932 2 DEBUG os_vif [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=fbb18477-f866-4ee6-b569-c606842c92ad,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb18477-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.935 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbb18477-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.943 2 INFO os_vif [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=fbb18477-f866-4ee6-b569-c606842c92ad,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb18477-f8')
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.944 2 INFO nova.virt.libvirt.driver [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Deleting instance files /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa_del
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.945 2 INFO nova.virt.libvirt.driver [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Deletion of /var/lib/nova/instances/dcdbc5a3-6788-4e70-b948-140a8abb3dfa_del complete
Sep 30 21:51:07 compute-1 podman[249711]: 2025-09-30 21:51:07.967965073 +0000 UTC m=+0.067258437 container remove 1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.977 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b62b20dd-1bf2-4ca0-90eb-1a85641973b9]: (4, ('Tue Sep 30 09:51:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff (1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54)\n1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54\nTue Sep 30 09:51:07 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff (1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54)\n1138394d4b949edaffe8916b05a29dd6bd585ace1ef3f23d9e89452129000d54\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.980 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a98caa-9666-4956-ad7f-63540fa0bce0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:07 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:07.982 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b3b9e17-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:07 compute-1 nova_compute[192795]: 2025-09-30 21:51:07.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:07 compute-1 kernel: tap7b3b9e17-80: left promiscuous mode
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.008 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc1634b-ea01-40e2-aa93-a02063ee4389]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.037 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2677589e-4bc7-49af-b0d1-71adad9ebb4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.038 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[844e363d-0d95-4064-a73c-4917e0e0da55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.062 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b061e013-b89e-4e62-a0ce-764c7b5e8ee0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575478, 'reachable_time': 18684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249725, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.066 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.066 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[1a578e56-1916-4796-9b56-7b0dfd89209d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:08 compute-1 systemd[1]: run-netns-ovnmeta\x2d7b3b9e17\x2d84a1\x2d4306\x2d91a0\x2d7da7cacfb7ff.mount: Deactivated successfully.
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.204 2 DEBUG nova.compute.manager [req-a9c1baa3-14f6-4867-bcc4-963fbd6c4542 req-482e6e10-f17a-4f2e-9cf1-0883a45271e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received event network-vif-unplugged-fbb18477-f866-4ee6-b569-c606842c92ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.204 2 DEBUG oslo_concurrency.lockutils [req-a9c1baa3-14f6-4867-bcc4-963fbd6c4542 req-482e6e10-f17a-4f2e-9cf1-0883a45271e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.204 2 DEBUG oslo_concurrency.lockutils [req-a9c1baa3-14f6-4867-bcc4-963fbd6c4542 req-482e6e10-f17a-4f2e-9cf1-0883a45271e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.205 2 DEBUG oslo_concurrency.lockutils [req-a9c1baa3-14f6-4867-bcc4-963fbd6c4542 req-482e6e10-f17a-4f2e-9cf1-0883a45271e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.205 2 DEBUG nova.compute.manager [req-a9c1baa3-14f6-4867-bcc4-963fbd6c4542 req-482e6e10-f17a-4f2e-9cf1-0883a45271e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] No waiting events found dispatching network-vif-unplugged-fbb18477-f866-4ee6-b569-c606842c92ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.205 2 DEBUG nova.compute.manager [req-a9c1baa3-14f6-4867-bcc4-963fbd6c4542 req-482e6e10-f17a-4f2e-9cf1-0883a45271e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received event network-vif-unplugged-fbb18477-f866-4ee6-b569-c606842c92ad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.246 2 INFO nova.compute.manager [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Took 0.71 seconds to destroy the instance on the hypervisor.
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.247 2 DEBUG oslo.service.loopingcall [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.248 2 DEBUG nova.compute.manager [-] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.248 2 DEBUG nova.network.neutron [-] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.400 2 DEBUG nova.compute.manager [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-changed-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.400 2 DEBUG nova.compute.manager [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Refreshing instance network info cache due to event network-changed-afe949d7-0062-4e9f-8390-230f0f7d8f19. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.401 2 DEBUG oslo_concurrency.lockutils [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.401 2 DEBUG oslo_concurrency.lockutils [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.402 2 DEBUG nova.network.neutron [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Refreshing network info cache for port afe949d7-0062-4e9f-8390-230f0f7d8f19 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.422 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.422 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.423 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.423 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.424 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.480 2 INFO nova.compute.manager [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Terminating instance
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.545 2 DEBUG nova.compute.manager [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:51:08 compute-1 kernel: tapafe949d7-00 (unregistering): left promiscuous mode
Sep 30 21:51:08 compute-1 NetworkManager[51724]: <info>  [1759269068.6151] device (tapafe949d7-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:51:08 compute-1 ovn_controller[94902]: 2025-09-30T21:51:08Z|00710|binding|INFO|Releasing lport afe949d7-0062-4e9f-8390-230f0f7d8f19 from this chassis (sb_readonly=0)
Sep 30 21:51:08 compute-1 ovn_controller[94902]: 2025-09-30T21:51:08Z|00711|binding|INFO|Setting lport afe949d7-0062-4e9f-8390-230f0f7d8f19 down in Southbound
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-1 ovn_controller[94902]: 2025-09-30T21:51:08Z|00712|binding|INFO|Removing iface tapafe949d7-00 ovn-installed in OVS
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-1 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Sep 30 21:51:08 compute-1 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a9.scope: Consumed 14.102s CPU time.
Sep 30 21:51:08 compute-1 systemd-machined[152783]: Machine qemu-78-instance-000000a9 terminated.
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.744 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:c9:9c 10.100.0.10'], port_security=['fa:16:3e:2b:c9:9c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c6f47087-6db5-4113-b9d1-f72e8a71f342', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'def4f612-8ce8-4228-a0e7-0d189e100661', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b337c30-4f37-407b-8b6c-508e829086f6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=afe949d7-0062-4e9f-8390-230f0f7d8f19) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.745 103861 INFO neutron.agent.ovn.metadata.agent [-] Port afe949d7-0062-4e9f-8390-230f0f7d8f19 in datapath f39b5b05-2446-4ee5-b89a-a5b71519f1fb unbound from our chassis
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.747 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f39b5b05-2446-4ee5-b89a-a5b71519f1fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.748 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef3f382-ed69-4f6e-a9e8-9295d4b3ad45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:08.749 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb namespace which is not needed anymore
Sep 30 21:51:08 compute-1 NetworkManager[51724]: <info>  [1759269068.7681] manager: (tapafe949d7-00): new Tun device (/org/freedesktop/NetworkManager/Devices/352)
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.816 2 INFO nova.virt.libvirt.driver [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance destroyed successfully.
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.817 2 DEBUG nova.objects.instance [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid c6f47087-6db5-4113-b9d1-f72e8a71f342 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:08 compute-1 sshd-session[249365]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 45704 ssh2 [preauth]
Sep 30 21:51:08 compute-1 sshd-session[249365]: Disconnecting authenticating user root 8.210.178.40 port 45704: Too many authentication failures [preauth]
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.835 2 DEBUG nova.virt.libvirt.vif [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:50:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-565258952',display_name='tempest-TestNetworkAdvancedServerOps-server-565258952',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-565258952',id=169,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOp7qq5pR5n4XGNqI6kSLPTrrkD6Em1jYop0AGJR6ftdSM0oNQoVr+JgCc61Z23g5STfy7N8SKmBL8YamCFCSB1WwpAlW6PwTOkSNAiZLvHWVmc3Z3SkZqG4BYTe7NxzPA==',key_name='tempest-TestNetworkAdvancedServerOps-364751036',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:50:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-q1bnr3be',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:50:46Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c6f47087-6db5-4113-b9d1-f72e8a71f342,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:51:08 compute-1 sshd-session[249365]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.836 2 DEBUG nova.network.os_vif_util [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:08 compute-1 sshd-session[249365]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.837 2 DEBUG nova.network.os_vif_util [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.837 2 DEBUG os_vif [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafe949d7-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.852 2 INFO os_vif [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:c9:9c,bridge_name='br-int',has_traffic_filtering=True,id=afe949d7-0062-4e9f-8390-230f0f7d8f19,network=Network(f39b5b05-2446-4ee5-b89a-a5b71519f1fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafe949d7-00')
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.852 2 INFO nova.virt.libvirt.driver [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Deleting instance files /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342_del
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.854 2 INFO nova.virt.libvirt.driver [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Deletion of /var/lib/nova/instances/c6f47087-6db5-4113-b9d1-f72e8a71f342_del complete
Sep 30 21:51:08 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[249350]: [NOTICE]   (249354) : haproxy version is 2.8.14-c23fe91
Sep 30 21:51:08 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[249350]: [NOTICE]   (249354) : path to executable is /usr/sbin/haproxy
Sep 30 21:51:08 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[249350]: [WARNING]  (249354) : Exiting Master process...
Sep 30 21:51:08 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[249350]: [ALERT]    (249354) : Current worker (249356) exited with code 143 (Terminated)
Sep 30 21:51:08 compute-1 neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb[249350]: [WARNING]  (249354) : All workers exited. Exiting... (0)
Sep 30 21:51:08 compute-1 systemd[1]: libpod-05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5.scope: Deactivated successfully.
Sep 30 21:51:08 compute-1 podman[249764]: 2025-09-30 21:51:08.932510772 +0000 UTC m=+0.050796342 container died 05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.936 2 INFO nova.compute.manager [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Took 0.39 seconds to destroy the instance on the hypervisor.
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.937 2 DEBUG oslo.service.loopingcall [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.938 2 DEBUG nova.compute.manager [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:51:08 compute-1 nova_compute[192795]: 2025-09-30 21:51:08.938 2 DEBUG nova.network.neutron [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:51:08 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5-userdata-shm.mount: Deactivated successfully.
Sep 30 21:51:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-9b26c2858af63c41e84bf78a2a4fa8f84aaeb4d7bf5f0e824752ecf0718e92ec-merged.mount: Deactivated successfully.
Sep 30 21:51:08 compute-1 podman[249764]: 2025-09-30 21:51:08.97801377 +0000 UTC m=+0.096299370 container cleanup 05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:51:08 compute-1 systemd[1]: libpod-conmon-05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5.scope: Deactivated successfully.
Sep 30 21:51:09 compute-1 podman[249793]: 2025-09-30 21:51:09.066458937 +0000 UTC m=+0.053514065 container remove 05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.075 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab17dcc-5cba-449d-bb8f-9a96ee1d22f4]: (4, ('Tue Sep 30 09:51:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb (05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5)\n05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5\nTue Sep 30 09:51:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb (05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5)\n05790d7ec628e3cbed2e7bfd652f27d68c8cfc4887c61c2d789e02dc99fc79d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.077 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f290332d-6b7a-49e7-b947-55b38826490b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.078 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf39b5b05-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:09 compute-1 nova_compute[192795]: 2025-09-30 21:51:09.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:09 compute-1 kernel: tapf39b5b05-20: left promiscuous mode
Sep 30 21:51:09 compute-1 nova_compute[192795]: 2025-09-30 21:51:09.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.110 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e1dd4101-3909-44c2-9a14-1712c17f81fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.139 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[52bacdb0-dabd-4787-93b0-b39e7a40d8dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.140 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[2e634f50-e7ce-4c05-94eb-2641aa096772]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:09 compute-1 nova_compute[192795]: 2025-09-30 21:51:09.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.165 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dd446bb0-6616-44c6-a0a9-43a8a90c0780]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574063, 'reachable_time': 23387, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249808, 'error': None, 'target': 'ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.168 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f39b5b05-2446-4ee5-b89a-a5b71519f1fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:51:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:09.168 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[a83ef8fc-7b6d-425b-b5b4-0de67c9b4355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:09 compute-1 systemd[1]: run-netns-ovnmeta\x2df39b5b05\x2d2446\x2d4ee5\x2db89a\x2da5b71519f1fb.mount: Deactivated successfully.
Sep 30 21:51:09 compute-1 nova_compute[192795]: 2025-09-30 21:51:09.564 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:09 compute-1 nova_compute[192795]: 2025-09-30 21:51:09.565 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:09 compute-1 nova_compute[192795]: 2025-09-30 21:51:09.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:09 compute-1 nova_compute[192795]: 2025-09-30 21:51:09.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.148 2 DEBUG nova.network.neutron [-] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.191 2 INFO nova.compute.manager [-] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Took 1.94 seconds to deallocate network for instance.
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.377 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.378 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.440 2 DEBUG nova.compute.manager [req-5f57bb47-38da-44f5-846e-d93565df993d req-acf2c516-19b1-496d-bc3e-5d8e56629952 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.441 2 DEBUG oslo_concurrency.lockutils [req-5f57bb47-38da-44f5-846e-d93565df993d req-acf2c516-19b1-496d-bc3e-5d8e56629952 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.441 2 DEBUG oslo_concurrency.lockutils [req-5f57bb47-38da-44f5-846e-d93565df993d req-acf2c516-19b1-496d-bc3e-5d8e56629952 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.441 2 DEBUG oslo_concurrency.lockutils [req-5f57bb47-38da-44f5-846e-d93565df993d req-acf2c516-19b1-496d-bc3e-5d8e56629952 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.442 2 DEBUG nova.compute.manager [req-5f57bb47-38da-44f5-846e-d93565df993d req-acf2c516-19b1-496d-bc3e-5d8e56629952 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.442 2 DEBUG nova.compute.manager [req-5f57bb47-38da-44f5-846e-d93565df993d req-acf2c516-19b1-496d-bc3e-5d8e56629952 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-unplugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.446 2 DEBUG nova.compute.manager [req-c3ea9f67-2875-4f69-a805-f9dd91916774 req-ced56873-69de-4cb1-9618-6f398c45d29d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received event network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.447 2 DEBUG oslo_concurrency.lockutils [req-c3ea9f67-2875-4f69-a805-f9dd91916774 req-ced56873-69de-4cb1-9618-6f398c45d29d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.447 2 DEBUG oslo_concurrency.lockutils [req-c3ea9f67-2875-4f69-a805-f9dd91916774 req-ced56873-69de-4cb1-9618-6f398c45d29d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.447 2 DEBUG oslo_concurrency.lockutils [req-c3ea9f67-2875-4f69-a805-f9dd91916774 req-ced56873-69de-4cb1-9618-6f398c45d29d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.447 2 DEBUG nova.compute.manager [req-c3ea9f67-2875-4f69-a805-f9dd91916774 req-ced56873-69de-4cb1-9618-6f398c45d29d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] No waiting events found dispatching network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.447 2 WARNING nova.compute.manager [req-c3ea9f67-2875-4f69-a805-f9dd91916774 req-ced56873-69de-4cb1-9618-6f398c45d29d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received unexpected event network-vif-plugged-fbb18477-f866-4ee6-b569-c606842c92ad for instance with vm_state deleted and task_state None.
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.474 2 DEBUG nova.compute.provider_tree [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.495 2 DEBUG nova.scheduler.client.report [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.523 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.537 2 DEBUG nova.compute.manager [req-2c314c41-93f0-4f51-ac0e-49901135fb1d req-5ddc48d9-1bad-4145-88a7-bad499288141 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Received event network-vif-deleted-fbb18477-f866-4ee6-b569-c606842c92ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.552 2 INFO nova.scheduler.client.report [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Deleted allocations for instance dcdbc5a3-6788-4e70-b948-140a8abb3dfa
Sep 30 21:51:10 compute-1 nova_compute[192795]: 2025-09-30 21:51:10.655 2 DEBUG oslo_concurrency.lockutils [None req-70391868-f72a-4c10-add0-cf19c663ae31 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "dcdbc5a3-6788-4e70-b948-140a8abb3dfa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:10 compute-1 unix_chkpwd[249811]: password check failed for user (root)
Sep 30 21:51:10 compute-1 sshd-session[249809]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.147 2 DEBUG nova.network.neutron [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.176 2 INFO nova.compute.manager [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Took 2.24 seconds to deallocate network for instance.
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.321 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.321 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.408 2 DEBUG nova.compute.provider_tree [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.422 2 DEBUG nova.scheduler.client.report [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.432 2 DEBUG nova.network.neutron [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updated VIF entry in instance network info cache for port afe949d7-0062-4e9f-8390-230f0f7d8f19. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.433 2 DEBUG nova.network.neutron [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Updating instance_info_cache with network_info: [{"id": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "address": "fa:16:3e:2b:c9:9c", "network": {"id": "f39b5b05-2446-4ee5-b89a-a5b71519f1fb", "bridge": "br-int", "label": "tempest-network-smoke--753703675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafe949d7-00", "ovs_interfaceid": "afe949d7-0062-4e9f-8390-230f0f7d8f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.463 2 DEBUG oslo_concurrency.lockutils [req-c24982e6-bc0d-4ee3-89ff-ad9db612f947 req-da147167-25ee-4c44-ae5b-0792760a5a63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c6f47087-6db5-4113-b9d1-f72e8a71f342" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.465 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.507 2 INFO nova.scheduler.client.report [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocations for instance c6f47087-6db5-4113-b9d1-f72e8a71f342
Sep 30 21:51:11 compute-1 nova_compute[192795]: 2025-09-30 21:51:11.635 2 DEBUG oslo_concurrency.lockutils [None req-09604b4c-1e6d-4fe9-a9bd-77b0b1e96462 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.559 2 DEBUG nova.compute.manager [req-10137666-3716-4d8f-a9da-c76d05f8573e req-351c898d-0fc6-4be4-bf10-4d8f32195863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.559 2 DEBUG oslo_concurrency.lockutils [req-10137666-3716-4d8f-a9da-c76d05f8573e req-351c898d-0fc6-4be4-bf10-4d8f32195863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.560 2 DEBUG oslo_concurrency.lockutils [req-10137666-3716-4d8f-a9da-c76d05f8573e req-351c898d-0fc6-4be4-bf10-4d8f32195863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.560 2 DEBUG oslo_concurrency.lockutils [req-10137666-3716-4d8f-a9da-c76d05f8573e req-351c898d-0fc6-4be4-bf10-4d8f32195863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c6f47087-6db5-4113-b9d1-f72e8a71f342-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.560 2 DEBUG nova.compute.manager [req-10137666-3716-4d8f-a9da-c76d05f8573e req-351c898d-0fc6-4be4-bf10-4d8f32195863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] No waiting events found dispatching network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.560 2 WARNING nova.compute.manager [req-10137666-3716-4d8f-a9da-c76d05f8573e req-351c898d-0fc6-4be4-bf10-4d8f32195863 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received unexpected event network-vif-plugged-afe949d7-0062-4e9f-8390-230f0f7d8f19 for instance with vm_state deleted and task_state None.
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.672 2 DEBUG nova.compute.manager [req-0b222310-275a-4a9d-b455-f1964aacbff1 req-5f929c82-535b-4bce-a555-0a3cb1b004f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Received event network-vif-deleted-afe949d7-0062-4e9f-8390-230f0f7d8f19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.672 2 INFO nova.compute.manager [req-0b222310-275a-4a9d-b455-f1964aacbff1 req-5f929c82-535b-4bce-a555-0a3cb1b004f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Neutron deleted interface afe949d7-0062-4e9f-8390-230f0f7d8f19; detaching it from the instance and deleting it from the info cache
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.673 2 DEBUG nova.network.neutron [req-0b222310-275a-4a9d-b455-f1964aacbff1 req-5f929c82-535b-4bce-a555-0a3cb1b004f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Sep 30 21:51:12 compute-1 nova_compute[192795]: 2025-09-30 21:51:12.676 2 DEBUG nova.compute.manager [req-0b222310-275a-4a9d-b455-f1964aacbff1 req-5f929c82-535b-4bce-a555-0a3cb1b004f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Detach interface failed, port_id=afe949d7-0062-4e9f-8390-230f0f7d8f19, reason: Instance c6f47087-6db5-4113-b9d1-f72e8a71f342 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:51:12 compute-1 sshd-session[249809]: Failed password for root from 8.210.178.40 port 46332 ssh2
Sep 30 21:51:13 compute-1 nova_compute[192795]: 2025-09-30 21:51:13.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:14 compute-1 nova_compute[192795]: 2025-09-30 21:51:14.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:14 compute-1 podman[249814]: 2025-09-30 21:51:14.263469434 +0000 UTC m=+0.089674661 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:51:14 compute-1 podman[249812]: 2025-09-30 21:51:14.265636573 +0000 UTC m=+0.091658186 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:51:14 compute-1 unix_chkpwd[249869]: password check failed for user (root)
Sep 30 21:51:14 compute-1 podman[249813]: 2025-09-30 21:51:14.335495089 +0000 UTC m=+0.160614837 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:51:15 compute-1 nova_compute[192795]: 2025-09-30 21:51:15.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:16 compute-1 sshd-session[249809]: Failed password for root from 8.210.178.40 port 46332 ssh2
Sep 30 21:51:16 compute-1 nova_compute[192795]: 2025-09-30 21:51:16.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:16 compute-1 nova_compute[192795]: 2025-09-30 21:51:16.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:17 compute-1 unix_chkpwd[249880]: password check failed for user (root)
Sep 30 21:51:18 compute-1 nova_compute[192795]: 2025-09-30 21:51:18.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-1 nova_compute[192795]: 2025-09-30 21:51:19.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-1 nova_compute[192795]: 2025-09-30 21:51:19.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:19 compute-1 nova_compute[192795]: 2025-09-30 21:51:19.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:51:19 compute-1 nova_compute[192795]: 2025-09-30 21:51:19.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:51:19 compute-1 nova_compute[192795]: 2025-09-30 21:51:19.721 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:51:20 compute-1 sshd-session[249809]: Failed password for root from 8.210.178.40 port 46332 ssh2
Sep 30 21:51:21 compute-1 unix_chkpwd[249881]: password check failed for user (root)
Sep 30 21:51:22 compute-1 podman[249882]: 2025-09-30 21:51:22.291086777 +0000 UTC m=+0.120739851 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:51:22 compute-1 nova_compute[192795]: 2025-09-30 21:51:22.851 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269067.8476524, dcdbc5a3-6788-4e70-b948-140a8abb3dfa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:22 compute-1 nova_compute[192795]: 2025-09-30 21:51:22.852 2 INFO nova.compute.manager [-] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] VM Stopped (Lifecycle Event)
Sep 30 21:51:22 compute-1 nova_compute[192795]: 2025-09-30 21:51:22.877 2 DEBUG nova.compute.manager [None req-8b20e00c-b39a-44e2-ab12-8aa5eb1a7169 - - - - - -] [instance: dcdbc5a3-6788-4e70-b948-140a8abb3dfa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:23 compute-1 sshd-session[249809]: Failed password for root from 8.210.178.40 port 46332 ssh2
Sep 30 21:51:23 compute-1 nova_compute[192795]: 2025-09-30 21:51:23.813 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269068.8123562, c6f47087-6db5-4113-b9d1-f72e8a71f342 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:23 compute-1 nova_compute[192795]: 2025-09-30 21:51:23.814 2 INFO nova.compute.manager [-] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] VM Stopped (Lifecycle Event)
Sep 30 21:51:23 compute-1 nova_compute[192795]: 2025-09-30 21:51:23.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:23 compute-1 nova_compute[192795]: 2025-09-30 21:51:23.854 2 DEBUG nova.compute.manager [None req-9c4cd1ab-cf78-4894-b4c1-63890350cd0f - - - - - -] [instance: c6f47087-6db5-4113-b9d1-f72e8a71f342] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:24 compute-1 nova_compute[192795]: 2025-09-30 21:51:24.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:24 compute-1 unix_chkpwd[249904]: password check failed for user (root)
Sep 30 21:51:26 compute-1 sshd-session[249809]: Failed password for root from 8.210.178.40 port 46332 ssh2
Sep 30 21:51:27 compute-1 unix_chkpwd[249905]: password check failed for user (root)
Sep 30 21:51:28 compute-1 podman[249908]: 2025-09-30 21:51:28.215184713 +0000 UTC m=+0.053959648 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 21:51:28 compute-1 podman[249907]: 2025-09-30 21:51:28.215250605 +0000 UTC m=+0.057705648 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:51:28 compute-1 podman[249906]: 2025-09-30 21:51:28.230228889 +0000 UTC m=+0.072392675 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Sep 30 21:51:28 compute-1 nova_compute[192795]: 2025-09-30 21:51:28.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:29 compute-1 nova_compute[192795]: 2025-09-30 21:51:29.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:30 compute-1 sshd-session[249809]: Failed password for root from 8.210.178.40 port 46332 ssh2
Sep 30 21:51:31 compute-1 sshd-session[249809]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 46332 ssh2 [preauth]
Sep 30 21:51:31 compute-1 sshd-session[249809]: Disconnecting authenticating user root 8.210.178.40 port 46332: Too many authentication failures [preauth]
Sep 30 21:51:31 compute-1 sshd-session[249809]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:31 compute-1 sshd-session[249809]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:51:32 compute-1 unix_chkpwd[249972]: password check failed for user (root)
Sep 30 21:51:32 compute-1 sshd-session[249970]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:33 compute-1 nova_compute[192795]: 2025-09-30 21:51:33.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:34 compute-1 nova_compute[192795]: 2025-09-30 21:51:34.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:34 compute-1 sshd-session[249970]: Failed password for root from 8.210.178.40 port 46944 ssh2
Sep 30 21:51:36 compute-1 unix_chkpwd[249973]: password check failed for user (root)
Sep 30 21:51:38 compute-1 podman[249974]: 2025-09-30 21:51:38.214140161 +0000 UTC m=+0.054811741 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:51:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:38.710 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:38.711 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:38.711 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:38 compute-1 nova_compute[192795]: 2025-09-30 21:51:38.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:38 compute-1 sshd-session[249970]: Failed password for root from 8.210.178.40 port 46944 ssh2
Sep 30 21:51:39 compute-1 nova_compute[192795]: 2025-09-30 21:51:39.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:39 compute-1 unix_chkpwd[249995]: password check failed for user (root)
Sep 30 21:51:42 compute-1 sshd-session[249970]: Failed password for root from 8.210.178.40 port 46944 ssh2
Sep 30 21:51:43 compute-1 unix_chkpwd[249996]: password check failed for user (root)
Sep 30 21:51:43 compute-1 nova_compute[192795]: 2025-09-30 21:51:43.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:44 compute-1 nova_compute[192795]: 2025-09-30 21:51:44.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:44 compute-1 sshd-session[249970]: Failed password for root from 8.210.178.40 port 46944 ssh2
Sep 30 21:51:44 compute-1 unix_chkpwd[249997]: password check failed for user (root)
Sep 30 21:51:45 compute-1 podman[250000]: 2025-09-30 21:51:45.246025712 +0000 UTC m=+0.067491483 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:51:45 compute-1 podman[249998]: 2025-09-30 21:51:45.252123977 +0000 UTC m=+0.079048085 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:51:45 compute-1 podman[249999]: 2025-09-30 21:51:45.29190091 +0000 UTC m=+0.117486032 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:51:46 compute-1 sshd-session[249970]: Failed password for root from 8.210.178.40 port 46944 ssh2
Sep 30 21:51:48 compute-1 unix_chkpwd[250066]: password check failed for user (root)
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.496 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.497 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.519 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.643 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.644 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.654 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.654 2 INFO nova.compute.claims [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.853 2 DEBUG nova.compute.provider_tree [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.903 2 DEBUG nova.scheduler.client.report [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.946 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:48 compute-1 nova_compute[192795]: 2025-09-30 21:51:48.948 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.061 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.062 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.090 2 INFO nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.127 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.268 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.271 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.272 2 INFO nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Creating image(s)
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.273 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.273 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.275 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.303 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.407 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.408 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.409 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.423 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.485 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.487 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.546 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.547 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.548 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.609 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.611 2 DEBUG nova.virt.disk.api [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.611 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.679 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.681 2 DEBUG nova.virt.disk.api [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.681 2 DEBUG nova.objects.instance [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.700 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.700 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Ensure instance console log exists: /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.701 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.701 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.702 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:49 compute-1 nova_compute[192795]: 2025-09-30 21:51:49.972 2 DEBUG nova.policy [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:51:50 compute-1 sshd-session[249970]: Failed password for root from 8.210.178.40 port 46944 ssh2
Sep 30 21:51:51 compute-1 sshd-session[249970]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 46944 ssh2 [preauth]
Sep 30 21:51:51 compute-1 sshd-session[249970]: Disconnecting authenticating user root 8.210.178.40 port 46944: Too many authentication failures [preauth]
Sep 30 21:51:51 compute-1 sshd-session[249970]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:51 compute-1 sshd-session[249970]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:51:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:51.582 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:51 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:51.583 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:51:51 compute-1 nova_compute[192795]: 2025-09-30 21:51:51.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:51 compute-1 nova_compute[192795]: 2025-09-30 21:51:51.849 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Successfully created port: 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:51:52 compute-1 nova_compute[192795]: 2025-09-30 21:51:52.837 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Successfully created port: caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:51:53 compute-1 unix_chkpwd[250094]: password check failed for user (root)
Sep 30 21:51:53 compute-1 sshd-session[250082]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:53 compute-1 podman[250084]: 2025-09-30 21:51:53.233206012 +0000 UTC m=+0.074502672 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:51:53 compute-1 nova_compute[192795]: 2025-09-30 21:51:53.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:54 compute-1 nova_compute[192795]: 2025-09-30 21:51:54.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:54 compute-1 nova_compute[192795]: 2025-09-30 21:51:54.738 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Successfully updated port: 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:51:54 compute-1 nova_compute[192795]: 2025-09-30 21:51:54.880 2 DEBUG nova.compute.manager [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-changed-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:54 compute-1 nova_compute[192795]: 2025-09-30 21:51:54.880 2 DEBUG nova.compute.manager [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing instance network info cache due to event network-changed-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:51:54 compute-1 nova_compute[192795]: 2025-09-30 21:51:54.881 2 DEBUG oslo_concurrency.lockutils [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:54 compute-1 nova_compute[192795]: 2025-09-30 21:51:54.881 2 DEBUG oslo_concurrency.lockutils [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:54 compute-1 nova_compute[192795]: 2025-09-30 21:51:54.881 2 DEBUG nova.network.neutron [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing network info cache for port 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:51:55 compute-1 nova_compute[192795]: 2025-09-30 21:51:55.153 2 DEBUG nova.network.neutron [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:51:55 compute-1 sshd-session[250082]: Failed password for root from 8.210.178.40 port 47512 ssh2
Sep 30 21:51:55 compute-1 nova_compute[192795]: 2025-09-30 21:51:55.683 2 DEBUG nova.network.neutron [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:55 compute-1 nova_compute[192795]: 2025-09-30 21:51:55.732 2 DEBUG oslo_concurrency.lockutils [req-83ef8c95-bebc-40db-8587-58aa5722db42 req-970411d6-5310-4072-87df-37ffc38c72a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:55 compute-1 nova_compute[192795]: 2025-09-30 21:51:55.760 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Successfully updated port: caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:51:55 compute-1 nova_compute[192795]: 2025-09-30 21:51:55.816 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:55 compute-1 nova_compute[192795]: 2025-09-30 21:51:55.816 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:55 compute-1 nova_compute[192795]: 2025-09-30 21:51:55.816 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:51:56 compute-1 nova_compute[192795]: 2025-09-30 21:51:56.024 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:51:56 compute-1 unix_chkpwd[250105]: password check failed for user (root)
Sep 30 21:51:57 compute-1 nova_compute[192795]: 2025-09-30 21:51:57.017 2 DEBUG nova.compute.manager [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-changed-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:57 compute-1 nova_compute[192795]: 2025-09-30 21:51:57.018 2 DEBUG nova.compute.manager [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing instance network info cache due to event network-changed-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:51:57 compute-1 nova_compute[192795]: 2025-09-30 21:51:57.019 2 DEBUG oslo_concurrency.lockutils [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:58 compute-1 sshd-session[250082]: Failed password for root from 8.210.178.40 port 47512 ssh2
Sep 30 21:51:58 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:51:58.586 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:58 compute-1 nova_compute[192795]: 2025-09-30 21:51:58.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:59 compute-1 nova_compute[192795]: 2025-09-30 21:51:59.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:59 compute-1 podman[250108]: 2025-09-30 21:51:59.214324376 +0000 UTC m=+0.053185937 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent)
Sep 30 21:51:59 compute-1 podman[250106]: 2025-09-30 21:51:59.227538642 +0000 UTC m=+0.063989999 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Sep 30 21:51:59 compute-1 podman[250107]: 2025-09-30 21:51:59.237309506 +0000 UTC m=+0.075396757 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:52:00 compute-1 unix_chkpwd[250165]: password check failed for user (root)
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.480 2 DEBUG nova.network.neutron [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updating instance_info_cache with network_info: [{"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.515 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.515 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Instance network_info: |[{"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.516 2 DEBUG oslo_concurrency.lockutils [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.516 2 DEBUG nova.network.neutron [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing network info cache for port caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.519 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Start _get_guest_xml network_info=[{"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.524 2 WARNING nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.531 2 DEBUG nova.virt.libvirt.host [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.532 2 DEBUG nova.virt.libvirt.host [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.535 2 DEBUG nova.virt.libvirt.host [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.536 2 DEBUG nova.virt.libvirt.host [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.537 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.538 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.538 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.538 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.539 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.539 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.539 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.540 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.540 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.540 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.540 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.541 2 DEBUG nova.virt.hardware [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.545 2 DEBUG nova.virt.libvirt.vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-568444683',display_name='tempest-TestGettingAddress-server-568444683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-568444683',id=176,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-dg3z9z13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:49Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.545 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.546 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:cc:8d,bridge_name='br-int',has_traffic_filtering=True,id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f5bcdeb-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.547 2 DEBUG nova.virt.libvirt.vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-568444683',display_name='tempest-TestGettingAddress-server-568444683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-568444683',id=176,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-dg3z9z13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:49Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.547 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.547 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ca:29,bridge_name='br-int',has_traffic_filtering=True,id=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba4afa-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.548 2 DEBUG nova.objects.instance [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.578 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <uuid>3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37</uuid>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <name>instance-000000b0</name>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <nova:name>tempest-TestGettingAddress-server-568444683</nova:name>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:52:00</nova:creationTime>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:port uuid="8f5bcdeb-0e1f-4339-9b7e-10aa78da6494">
Sep 30 21:52:00 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         <nova:port uuid="caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df">
Sep 30 21:52:00 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:feef:ca29" ipVersion="6"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <system>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <entry name="serial">3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37</entry>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <entry name="uuid">3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37</entry>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </system>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <os>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   </os>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <features>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   </features>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk.config"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:5f:cc:8d"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <target dev="tap8f5bcdeb-0e"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:ef:ca:29"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <target dev="tapcaba4afa-56"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/console.log" append="off"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <video>
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </video>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:52:00 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:52:00 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:52:00 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:52:00 compute-1 nova_compute[192795]: </domain>
Sep 30 21:52:00 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.580 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Preparing to wait for external event network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.580 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.580 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.581 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.581 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Preparing to wait for external event network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.581 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.581 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.582 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.582 2 DEBUG nova.virt.libvirt.vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-568444683',display_name='tempest-TestGettingAddress-server-568444683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-568444683',id=176,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-dg3z9z13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:49Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.583 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.583 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:cc:8d,bridge_name='br-int',has_traffic_filtering=True,id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f5bcdeb-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.584 2 DEBUG os_vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:cc:8d,bridge_name='br-int',has_traffic_filtering=True,id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f5bcdeb-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.584 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.585 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.589 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f5bcdeb-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.590 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8f5bcdeb-0e, col_values=(('external_ids', {'iface-id': '8f5bcdeb-0e1f-4339-9b7e-10aa78da6494', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:cc:8d', 'vm-uuid': '3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 NetworkManager[51724]: <info>  [1759269120.5928] manager: (tap8f5bcdeb-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.601 2 INFO os_vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:cc:8d,bridge_name='br-int',has_traffic_filtering=True,id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f5bcdeb-0e')
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.602 2 DEBUG nova.virt.libvirt.vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-568444683',display_name='tempest-TestGettingAddress-server-568444683',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-568444683',id=176,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-dg3z9z13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:49Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.602 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.603 2 DEBUG nova.network.os_vif_util [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ca:29,bridge_name='br-int',has_traffic_filtering=True,id=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba4afa-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.603 2 DEBUG os_vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ca:29,bridge_name='br-int',has_traffic_filtering=True,id=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba4afa-56') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.604 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.605 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.607 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcaba4afa-56, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.607 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcaba4afa-56, col_values=(('external_ids', {'iface-id': 'caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:ca:29', 'vm-uuid': '3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 NetworkManager[51724]: <info>  [1759269120.6096] manager: (tapcaba4afa-56): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.616 2 INFO os_vif [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ca:29,bridge_name='br-int',has_traffic_filtering=True,id=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba4afa-56')
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.716 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.780 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.781 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.781 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:5f:cc:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.781 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:ef:ca:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:52:00 compute-1 nova_compute[192795]: 2025-09-30 21:52:00.782 2 INFO nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Using config drive
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.203 2 INFO nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Creating config drive at /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk.config
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.212 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd2kjup_3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.347 2 DEBUG oslo_concurrency.processutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd2kjup_3" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4348] manager: (tap8f5bcdeb-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Sep 30 21:52:01 compute-1 kernel: tap8f5bcdeb-0e: entered promiscuous mode
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00713|binding|INFO|Claiming lport 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 for this chassis.
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00714|binding|INFO|8f5bcdeb-0e1f-4339-9b7e-10aa78da6494: Claiming fa:16:3e:5f:cc:8d 10.100.0.3
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4590] manager: (tapcaba4afa-56): new Tun device (/org/freedesktop/NetworkManager/Devices/356)
Sep 30 21:52:01 compute-1 kernel: tapcaba4afa-56: entered promiscuous mode
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4630] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4637] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Sep 30 21:52:01 compute-1 systemd-udevd[250192]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:01 compute-1 systemd-udevd[250193]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4928] device (tapcaba4afa-56): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4937] device (tap8f5bcdeb-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4944] device (tapcaba4afa-56): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.4949] device (tap8f5bcdeb-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:52:01 compute-1 systemd-machined[152783]: New machine qemu-80-instance-000000b0.
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.528 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:cc:8d 10.100.0.3'], port_security=['fa:16:3e:5f:cc:8d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79474bd3-ac2c-4f66-83f8-3a487e22d9d3, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.530 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 in datapath 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc bound to our chassis
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.531 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc
Sep 30 21:52:01 compute-1 systemd[1]: Started Virtual Machine qemu-80-instance-000000b0.
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.547 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[03c26f1b-9c14-480c-9c64-3f6294014332]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.549 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4c85d4d4-31 in ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.553 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4c85d4d4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.553 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0af064-c596-4c0f-beb2-bdd94f303fed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.554 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3015bfea-8446-4000-8b5a-8170835dd13c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.569 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[183dc02f-8bde-4643-aff0-ffc5e203b972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.602 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9ae81e-4b7b-4ee3-b929-37cf70284ed2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.635 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[965d8762-e658-472a-a9c7-e8c7f2f38a39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.6522] manager: (tap4c85d4d4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/359)
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.660 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9f0286-46ed-43aa-9d78-6c8f373a9320]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.708 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[755502f6-47bf-44ee-a24a-4f2ad59bdc1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.735 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9a0252-5465-497f-bc96-478f7a9b1958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00715|binding|INFO|Setting lport 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 up in Southbound
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00716|binding|INFO|Claiming lport caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df for this chassis.
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00717|binding|INFO|caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df: Claiming fa:16:3e:ef:ca:29 2001:db8::f816:3eff:feef:ca29
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00718|binding|INFO|Setting lport 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 ovn-installed in OVS
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.7724] device (tap4c85d4d4-30): carrier: link connected
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00719|binding|INFO|Setting lport caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df ovn-installed in OVS
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.781 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[263f7176-2bd7-4bc0-a4c5-cd59ba9e7191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.803 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e066c268-4a1f-4f86-822b-ad1914b47a81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c85d4d4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:b3:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581744, 'reachable_time': 41295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250229, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.823 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ee030ec2-aef3-44a9-b9ca-91ed94b48004]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:b3ff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581744, 'tstamp': 581744}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250230, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.826 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ca:29 2001:db8::f816:3eff:feef:ca29'], port_security=['fa:16:3e:ef:ca:29 2001:db8::f816:3eff:feef:ca29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feef:ca29/64', 'neutron:device_id': '3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ebdd42c-51b5-4b83-8a22-52988a595a24, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00720|binding|INFO|Setting lport caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df up in Southbound
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.849 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[693e4e8a-d491-47c0-9f21-f304104122cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c85d4d4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:b3:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581744, 'reachable_time': 41295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250231, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.889 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e8569859-c694-4451-a17f-c49e5938378b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.957 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1a0bd8-8a9e-4451-b111-9936af082a19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.958 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c85d4d4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.958 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.959 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c85d4d4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 kernel: tap4c85d4d4-30: entered promiscuous mode
Sep 30 21:52:01 compute-1 NetworkManager[51724]: <info>  [1759269121.9628] manager: (tap4c85d4d4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.966 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c85d4d4-30, col_values=(('external_ids', {'iface-id': 'edbccbea-ad79-490c-a6b0-46510606db95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 ovn_controller[94902]: 2025-09-30T21:52:01Z|00721|binding|INFO|Releasing lport edbccbea-ad79-490c-a6b0-46510606db95 from this chassis (sb_readonly=0)
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.969 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.970 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ec3143-cf60-4f32-8dbe-860eb190674b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.970 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.pid.haproxy
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:52:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:01.973 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'env', 'PROCESS_TAG=haproxy-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:52:01 compute-1 nova_compute[192795]: 2025-09-30 21:52:01.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.060 2 DEBUG nova.compute.manager [req-83584ac1-ce51-4130-b522-4cf2e35bf36e req-c8ec8878-ba9a-4b58-a9fa-2019bd5a50e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.060 2 DEBUG oslo_concurrency.lockutils [req-83584ac1-ce51-4130-b522-4cf2e35bf36e req-c8ec8878-ba9a-4b58-a9fa-2019bd5a50e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.061 2 DEBUG oslo_concurrency.lockutils [req-83584ac1-ce51-4130-b522-4cf2e35bf36e req-c8ec8878-ba9a-4b58-a9fa-2019bd5a50e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.061 2 DEBUG oslo_concurrency.lockutils [req-83584ac1-ce51-4130-b522-4cf2e35bf36e req-c8ec8878-ba9a-4b58-a9fa-2019bd5a50e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.061 2 DEBUG nova.compute.manager [req-83584ac1-ce51-4130-b522-4cf2e35bf36e req-c8ec8878-ba9a-4b58-a9fa-2019bd5a50e1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Processing event network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:52:02 compute-1 podman[250271]: 2025-09-30 21:52:02.387929979 +0000 UTC m=+0.055113368 container create 4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:52:02 compute-1 systemd[1]: Started libpod-conmon-4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c.scope.
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.437 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269122.4367092, 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.438 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] VM Started (Lifecycle Event)
Sep 30 21:52:02 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:52:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b1ac97407be1f8fa2e516e26bc25bd7b6c20af341535a9c9d1b355772125b4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:52:02 compute-1 podman[250271]: 2025-09-30 21:52:02.357979701 +0000 UTC m=+0.025163130 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.461 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:02 compute-1 podman[250271]: 2025-09-30 21:52:02.466545442 +0000 UTC m=+0.133728851 container init 4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.468 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269122.436865, 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.468 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] VM Paused (Lifecycle Event)
Sep 30 21:52:02 compute-1 podman[250271]: 2025-09-30 21:52:02.473839729 +0000 UTC m=+0.141023118 container start 4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.493 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:02 compute-1 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[250287]: [NOTICE]   (250291) : New worker (250293) forked
Sep 30 21:52:02 compute-1 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[250287]: [NOTICE]   (250291) : Loading success.
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.499 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.531 103861 INFO neutron.agent.ovn.metadata.agent [-] Port caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df in datapath cd6da069-7a88-49b7-bea7-1ceb7132f614 unbound from our chassis
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.534 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.536 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd6da069-7a88-49b7-bea7-1ceb7132f614
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.554 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[00a24ed5-ac14-43fb-ba5c-41b43e1f81ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.555 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd6da069-71 in ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.557 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd6da069-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.557 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f24e68cb-ced8-4f07-8702-2292306f0daa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.558 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[05569ede-876b-44c6-9544-d0b5d8531148]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.570 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[71b4f016-b5cf-45a8-a41b-124dd22dc4d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.597 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a865cf88-b385-4a75-aaf8-7e3777489233]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 sshd-session[250082]: Failed password for root from 8.210.178.40 port 47512 ssh2
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.624 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[13fae539-0d41-4252-a870-0b09dd7cd219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.631 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[48959942-c4b8-4903-9fde-511acfbb88e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 NetworkManager[51724]: <info>  [1759269122.6333] manager: (tapcd6da069-70): new Veth device (/org/freedesktop/NetworkManager/Devices/361)
Sep 30 21:52:02 compute-1 systemd-udevd[250215]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.662 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3463c1-885b-4e0a-8d84-0529b28545f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.666 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[235d2daa-7fcf-41c6-ba92-56ecdacbbb6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 NetworkManager[51724]: <info>  [1759269122.6848] device (tapcd6da069-70): carrier: link connected
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.688 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[60e23426-62e8-4489-9679-7402c1d3638b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.706 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[48fc5a5a-040a-48bc-9fb0-a2f8c19d45ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd6da069-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:39:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581835, 'reachable_time': 41460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250312, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.720 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fe585cc9-42df-420d-9e05-7d51bf2554eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:3971'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581835, 'tstamp': 581835}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250313, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.734 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c453fe76-cd62-4a01-8103-a1eeb09d51b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd6da069-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:39:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 225], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581835, 'reachable_time': 41460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250314, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.767 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fd829471-005b-4511-8bc8-bee32ce6609b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.797 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6c90c5-3f2d-467a-9d6f-907f190c25db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.799 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd6da069-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.799 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.799 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd6da069-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:02 compute-1 NetworkManager[51724]: <info>  [1759269122.8022] manager: (tapcd6da069-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Sep 30 21:52:02 compute-1 kernel: tapcd6da069-70: entered promiscuous mode
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.806 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd6da069-70, col_values=(('external_ids', {'iface-id': 'e5655641-a4f8-4024-be6c-065dbfac4615'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:02 compute-1 ovn_controller[94902]: 2025-09-30T21:52:02Z|00722|binding|INFO|Releasing lport e5655641-a4f8-4024-be6c-065dbfac4615 from this chassis (sb_readonly=0)
Sep 30 21:52:02 compute-1 nova_compute[192795]: 2025-09-30 21:52:02.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.818 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd6da069-7a88-49b7-bea7-1ceb7132f614.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd6da069-7a88-49b7-bea7-1ceb7132f614.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.819 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[30e31f05-5565-4e7e-b28f-43c04c8f67ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.820 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-cd6da069-7a88-49b7-bea7-1ceb7132f614
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/cd6da069-7a88-49b7-bea7-1ceb7132f614.pid.haproxy
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID cd6da069-7a88-49b7-bea7-1ceb7132f614
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:52:02 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:02.820 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'env', 'PROCESS_TAG=haproxy-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd6da069-7a88-49b7-bea7-1ceb7132f614.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:52:03 compute-1 nova_compute[192795]: 2025-09-30 21:52:03.020 2 DEBUG nova.network.neutron [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updated VIF entry in instance network info cache for port caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:03 compute-1 nova_compute[192795]: 2025-09-30 21:52:03.021 2 DEBUG nova.network.neutron [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updating instance_info_cache with network_info: [{"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:03 compute-1 nova_compute[192795]: 2025-09-30 21:52:03.054 2 DEBUG oslo_concurrency.lockutils [req-d1e6f25e-dbc9-4174-bece-f925aba3f0f9 req-0df2e7a2-d8c9-4ef3-b6b2-0f87a028057e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:03 compute-1 podman[250344]: 2025-09-30 21:52:03.200277029 +0000 UTC m=+0.050696679 container create 36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:52:03 compute-1 systemd[1]: Started libpod-conmon-36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832.scope.
Sep 30 21:52:03 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:52:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/051a6de123c1fc68de44ee6639d25de6fd6e0ee1919ea2afbfa54984fdd94399/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:52:03 compute-1 podman[250344]: 2025-09-30 21:52:03.177802793 +0000 UTC m=+0.028222463 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:52:03 compute-1 podman[250344]: 2025-09-30 21:52:03.278493101 +0000 UTC m=+0.128912771 container init 36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:52:03 compute-1 podman[250344]: 2025-09-30 21:52:03.285779408 +0000 UTC m=+0.136199078 container start 36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:52:03 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [NOTICE]   (250363) : New worker (250365) forked
Sep 30 21:52:03 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [NOTICE]   (250363) : Loading success.
Sep 30 21:52:03 compute-1 unix_chkpwd[250374]: password check failed for user (root)
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.691 2 DEBUG nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.692 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.692 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.693 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.694 2 DEBUG nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] No event matching network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 in dict_keys([('network-vif-plugged', 'caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.694 2 WARNING nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received unexpected event network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 for instance with vm_state building and task_state spawning.
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.695 2 DEBUG nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.695 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.696 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.696 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.697 2 DEBUG nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Processing event network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.697 2 DEBUG nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.698 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.699 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.699 2 DEBUG oslo_concurrency.lockutils [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.700 2 DEBUG nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] No waiting events found dispatching network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.700 2 WARNING nova.compute.manager [req-e07d2388-2b07-4f8b-b1b6-5dfcb8aac92f req-6980b3ed-d3d1-4632-b63c-dda18211dd6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received unexpected event network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df for instance with vm_state building and task_state spawning.
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.701 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.710 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269124.7102087, 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.710 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] VM Resumed (Lifecycle Event)
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.713 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.717 2 INFO nova.virt.libvirt.driver [-] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Instance spawned successfully.
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.719 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.795 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.801 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.869 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.874 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.874 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.874 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.875 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.875 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:04 compute-1 nova_compute[192795]: 2025-09-30 21:52:04.876 2 DEBUG nova.virt.libvirt.driver [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.175 2 INFO nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Took 15.91 seconds to spawn the instance on the hypervisor.
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.176 2 DEBUG nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.531 2 INFO nova.compute.manager [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Took 16.95 seconds to build instance.
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.565 2 DEBUG oslo_concurrency.lockutils [None req-b053c84b-7ec1-4283-9eba-f6826150cbb9 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.739 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.739 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.740 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.740 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:52:05 compute-1 sshd-session[250082]: Failed password for root from 8.210.178.40 port 47512 ssh2
Sep 30 21:52:05 compute-1 nova_compute[192795]: 2025-09-30 21:52:05.927 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.029 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.031 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.095 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.330 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.332 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5563MB free_disk=73.29587936401367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.333 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.333 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.425 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.426 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.426 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.456 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.490 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.491 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.511 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.542 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.597 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.665 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.760 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:52:06 compute-1 nova_compute[192795]: 2025-09-30 21:52:06.761 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.428s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:06 compute-1 unix_chkpwd[250382]: password check failed for user (root)
Sep 30 21:52:07 compute-1 nova_compute[192795]: 2025-09-30 21:52:07.858 2 DEBUG nova.compute.manager [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.022 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.023 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.079 2 DEBUG nova.objects.instance [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5e3582bd-9296-455b-8ecc-fd02bf833409 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.105 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.106 2 INFO nova.compute.claims [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.107 2 DEBUG nova.objects.instance [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid 5e3582bd-9296-455b-8ecc-fd02bf833409 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.150 2 DEBUG nova.objects.instance [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5e3582bd-9296-455b-8ecc-fd02bf833409 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.259 2 INFO nova.compute.resource_tracker [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updating resource usage from migration 4430a82d-d558-4782-8f49-ef3733161e1d
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.260 2 DEBUG nova.compute.resource_tracker [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Starting to track incoming migration 4430a82d-d558-4782-8f49-ef3733161e1d with flavor c9779bca-1eb6-4567-a36c-b452abeafc70 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.372 2 DEBUG nova.compute.provider_tree [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.394 2 DEBUG nova.scheduler.client.report [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.438 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.439 2 INFO nova.compute.manager [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Migrating
Sep 30 21:52:08 compute-1 nova_compute[192795]: 2025-09-30 21:52:08.764 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:08 compute-1 sshd-session[250082]: Failed password for root from 8.210.178.40 port 47512 ssh2
Sep 30 21:52:09 compute-1 nova_compute[192795]: 2025-09-30 21:52:09.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:09 compute-1 podman[250383]: 2025-09-30 21:52:09.294434556 +0000 UTC m=+0.119255130 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid)
Sep 30 21:52:09 compute-1 nova_compute[192795]: 2025-09-30 21:52:09.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:09 compute-1 nova_compute[192795]: 2025-09-30 21:52:09.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:10 compute-1 unix_chkpwd[250403]: password check failed for user (root)
Sep 30 21:52:10 compute-1 nova_compute[192795]: 2025-09-30 21:52:10.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:10 compute-1 nova_compute[192795]: 2025-09-30 21:52:10.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:10 compute-1 nova_compute[192795]: 2025-09-30 21:52:10.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:10 compute-1 sshd-session[250404]: Accepted publickey for nova from 192.168.122.102 port 35466 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:52:10 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:52:10 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:52:10 compute-1 systemd-logind[793]: New session 62 of user nova.
Sep 30 21:52:10 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:52:10 compute-1 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:52:10 compute-1 systemd[250408]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:52:11 compute-1 systemd[250408]: Queued start job for default target Main User Target.
Sep 30 21:52:11 compute-1 systemd[250408]: Created slice User Application Slice.
Sep 30 21:52:11 compute-1 systemd[250408]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:52:11 compute-1 systemd[250408]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:52:11 compute-1 systemd[250408]: Reached target Paths.
Sep 30 21:52:11 compute-1 systemd[250408]: Reached target Timers.
Sep 30 21:52:11 compute-1 systemd[250408]: Starting D-Bus User Message Bus Socket...
Sep 30 21:52:11 compute-1 systemd[250408]: Starting Create User's Volatile Files and Directories...
Sep 30 21:52:11 compute-1 systemd[250408]: Finished Create User's Volatile Files and Directories.
Sep 30 21:52:11 compute-1 systemd[250408]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:52:11 compute-1 systemd[250408]: Reached target Sockets.
Sep 30 21:52:11 compute-1 systemd[250408]: Reached target Basic System.
Sep 30 21:52:11 compute-1 systemd[250408]: Reached target Main User Target.
Sep 30 21:52:11 compute-1 systemd[250408]: Startup finished in 161ms.
Sep 30 21:52:11 compute-1 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:52:11 compute-1 systemd[1]: Started Session 62 of User nova.
Sep 30 21:52:11 compute-1 sshd-session[250404]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:52:11 compute-1 sshd-session[250423]: Received disconnect from 192.168.122.102 port 35466:11: disconnected by user
Sep 30 21:52:11 compute-1 sshd-session[250423]: Disconnected from user nova 192.168.122.102 port 35466
Sep 30 21:52:11 compute-1 sshd-session[250404]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:52:11 compute-1 systemd[1]: session-62.scope: Deactivated successfully.
Sep 30 21:52:11 compute-1 systemd-logind[793]: Session 62 logged out. Waiting for processes to exit.
Sep 30 21:52:11 compute-1 systemd-logind[793]: Removed session 62.
Sep 30 21:52:11 compute-1 nova_compute[192795]: 2025-09-30 21:52:11.345 2 DEBUG nova.compute.manager [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-changed-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:11 compute-1 nova_compute[192795]: 2025-09-30 21:52:11.345 2 DEBUG nova.compute.manager [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing instance network info cache due to event network-changed-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:11 compute-1 nova_compute[192795]: 2025-09-30 21:52:11.346 2 DEBUG oslo_concurrency.lockutils [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:11 compute-1 nova_compute[192795]: 2025-09-30 21:52:11.346 2 DEBUG oslo_concurrency.lockutils [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:11 compute-1 nova_compute[192795]: 2025-09-30 21:52:11.346 2 DEBUG nova.network.neutron [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing network info cache for port 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:11 compute-1 sshd-session[250425]: Accepted publickey for nova from 192.168.122.102 port 35470 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:52:11 compute-1 systemd-logind[793]: New session 64 of user nova.
Sep 30 21:52:11 compute-1 systemd[1]: Started Session 64 of User nova.
Sep 30 21:52:11 compute-1 sshd-session[250425]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:52:11 compute-1 sshd-session[250428]: Received disconnect from 192.168.122.102 port 35470:11: disconnected by user
Sep 30 21:52:11 compute-1 sshd-session[250428]: Disconnected from user nova 192.168.122.102 port 35470
Sep 30 21:52:11 compute-1 sshd-session[250425]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:52:11 compute-1 systemd[1]: session-64.scope: Deactivated successfully.
Sep 30 21:52:11 compute-1 systemd-logind[793]: Session 64 logged out. Waiting for processes to exit.
Sep 30 21:52:11 compute-1 systemd-logind[793]: Removed session 64.
Sep 30 21:52:12 compute-1 sshd-session[250082]: Failed password for root from 8.210.178.40 port 47512 ssh2
Sep 30 21:52:13 compute-1 sshd-session[250082]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 47512 ssh2 [preauth]
Sep 30 21:52:13 compute-1 sshd-session[250082]: Disconnecting authenticating user root 8.210.178.40 port 47512: Too many authentication failures [preauth]
Sep 30 21:52:13 compute-1 sshd-session[250082]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:13 compute-1 sshd-session[250082]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:52:13 compute-1 nova_compute[192795]: 2025-09-30 21:52:13.687 2 DEBUG nova.network.neutron [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updated VIF entry in instance network info cache for port 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:13 compute-1 nova_compute[192795]: 2025-09-30 21:52:13.689 2 DEBUG nova.network.neutron [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updating instance_info_cache with network_info: [{"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:13 compute-1 nova_compute[192795]: 2025-09-30 21:52:13.727 2 DEBUG oslo_concurrency.lockutils [req-050e9241-59bd-42b3-af5c-f7f256512d42 req-432a5ccd-03f2-459d-b64e-e76aae4aa19a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:14 compute-1 nova_compute[192795]: 2025-09-30 21:52:14.135 2 DEBUG nova.compute.manager [req-15ca6746-e60a-4506-857a-b0dfabafeb5f req-50f76b44-70f2-4cd4-89eb-840d1e9d508a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-unplugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:14 compute-1 nova_compute[192795]: 2025-09-30 21:52:14.136 2 DEBUG oslo_concurrency.lockutils [req-15ca6746-e60a-4506-857a-b0dfabafeb5f req-50f76b44-70f2-4cd4-89eb-840d1e9d508a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:14 compute-1 nova_compute[192795]: 2025-09-30 21:52:14.136 2 DEBUG oslo_concurrency.lockutils [req-15ca6746-e60a-4506-857a-b0dfabafeb5f req-50f76b44-70f2-4cd4-89eb-840d1e9d508a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:14 compute-1 nova_compute[192795]: 2025-09-30 21:52:14.136 2 DEBUG oslo_concurrency.lockutils [req-15ca6746-e60a-4506-857a-b0dfabafeb5f req-50f76b44-70f2-4cd4-89eb-840d1e9d508a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:14 compute-1 nova_compute[192795]: 2025-09-30 21:52:14.137 2 DEBUG nova.compute.manager [req-15ca6746-e60a-4506-857a-b0dfabafeb5f req-50f76b44-70f2-4cd4-89eb-840d1e9d508a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] No waiting events found dispatching network-vif-unplugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:14 compute-1 nova_compute[192795]: 2025-09-30 21:52:14.137 2 WARNING nova.compute.manager [req-15ca6746-e60a-4506-857a-b0dfabafeb5f req-50f76b44-70f2-4cd4-89eb-840d1e9d508a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received unexpected event network-vif-unplugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e for instance with vm_state active and task_state resize_migrating.
Sep 30 21:52:14 compute-1 nova_compute[192795]: 2025-09-30 21:52:14.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:14 compute-1 sshd-session[250432]: Accepted publickey for nova from 192.168.122.102 port 51470 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:52:14 compute-1 systemd-logind[793]: New session 65 of user nova.
Sep 30 21:52:14 compute-1 systemd[1]: Started Session 65 of User nova.
Sep 30 21:52:14 compute-1 sshd-session[250432]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:52:15 compute-1 sshd-session[250435]: Received disconnect from 192.168.122.102 port 51470:11: disconnected by user
Sep 30 21:52:15 compute-1 sshd-session[250435]: Disconnected from user nova 192.168.122.102 port 51470
Sep 30 21:52:15 compute-1 sshd-session[250432]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:52:15 compute-1 systemd[1]: session-65.scope: Deactivated successfully.
Sep 30 21:52:15 compute-1 systemd-logind[793]: Session 65 logged out. Waiting for processes to exit.
Sep 30 21:52:15 compute-1 systemd-logind[793]: Removed session 65.
Sep 30 21:52:15 compute-1 sshd-session[250437]: Accepted publickey for nova from 192.168.122.102 port 51472 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:52:15 compute-1 systemd-logind[793]: New session 66 of user nova.
Sep 30 21:52:15 compute-1 systemd[1]: Started Session 66 of User nova.
Sep 30 21:52:15 compute-1 sshd-session[250437]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:52:15 compute-1 unix_chkpwd[250470]: password check failed for user (root)
Sep 30 21:52:15 compute-1 sshd-session[250430]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:15 compute-1 podman[250439]: 2025-09-30 21:52:15.43446379 +0000 UTC m=+0.086207137 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:52:15 compute-1 sshd-session[250469]: Received disconnect from 192.168.122.102 port 51472:11: disconnected by user
Sep 30 21:52:15 compute-1 sshd-session[250469]: Disconnected from user nova 192.168.122.102 port 51472
Sep 30 21:52:15 compute-1 podman[250442]: 2025-09-30 21:52:15.45519181 +0000 UTC m=+0.102768175 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:52:15 compute-1 sshd-session[250437]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:52:15 compute-1 systemd[1]: session-66.scope: Deactivated successfully.
Sep 30 21:52:15 compute-1 systemd-logind[793]: Session 66 logged out. Waiting for processes to exit.
Sep 30 21:52:15 compute-1 systemd-logind[793]: Removed session 66.
Sep 30 21:52:15 compute-1 podman[250441]: 2025-09-30 21:52:15.474607674 +0000 UTC m=+0.126789874 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:52:15 compute-1 sshd-session[250508]: Accepted publickey for nova from 192.168.122.102 port 51474 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:52:15 compute-1 systemd-logind[793]: New session 67 of user nova.
Sep 30 21:52:15 compute-1 nova_compute[192795]: 2025-09-30 21:52:15.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:15 compute-1 systemd[1]: Started Session 67 of User nova.
Sep 30 21:52:15 compute-1 sshd-session[250508]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:52:15 compute-1 sshd-session[250511]: Received disconnect from 192.168.122.102 port 51474:11: disconnected by user
Sep 30 21:52:15 compute-1 sshd-session[250511]: Disconnected from user nova 192.168.122.102 port 51474
Sep 30 21:52:15 compute-1 sshd-session[250508]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:52:15 compute-1 systemd[1]: session-67.scope: Deactivated successfully.
Sep 30 21:52:15 compute-1 systemd-logind[793]: Session 67 logged out. Waiting for processes to exit.
Sep 30 21:52:15 compute-1 systemd-logind[793]: Removed session 67.
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.307 2 DEBUG nova.compute.manager [req-fb80130d-2e33-4292-a63f-846469ce3230 req-30932a3d-c661-43a2-a107-8897699acb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.307 2 DEBUG oslo_concurrency.lockutils [req-fb80130d-2e33-4292-a63f-846469ce3230 req-30932a3d-c661-43a2-a107-8897699acb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.307 2 DEBUG oslo_concurrency.lockutils [req-fb80130d-2e33-4292-a63f-846469ce3230 req-30932a3d-c661-43a2-a107-8897699acb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.307 2 DEBUG oslo_concurrency.lockutils [req-fb80130d-2e33-4292-a63f-846469ce3230 req-30932a3d-c661-43a2-a107-8897699acb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.308 2 DEBUG nova.compute.manager [req-fb80130d-2e33-4292-a63f-846469ce3230 req-30932a3d-c661-43a2-a107-8897699acb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] No waiting events found dispatching network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.308 2 WARNING nova.compute.manager [req-fb80130d-2e33-4292-a63f-846469ce3230 req-30932a3d-c661-43a2-a107-8897699acb5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received unexpected event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e for instance with vm_state active and task_state resize_migrated.
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:16 compute-1 nova_compute[192795]: 2025-09-30 21:52:16.915 2 INFO nova.network.neutron [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updating port ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:52:17 compute-1 ovn_controller[94902]: 2025-09-30T21:52:17Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:cc:8d 10.100.0.3
Sep 30 21:52:17 compute-1 ovn_controller[94902]: 2025-09-30T21:52:17Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:cc:8d 10.100.0.3
Sep 30 21:52:17 compute-1 sshd-session[250430]: Failed password for root from 8.210.178.40 port 48168 ssh2
Sep 30 21:52:18 compute-1 nova_compute[192795]: 2025-09-30 21:52:18.003 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:18 compute-1 nova_compute[192795]: 2025-09-30 21:52:18.004 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:18 compute-1 nova_compute[192795]: 2025-09-30 21:52:18.004 2 DEBUG nova.network.neutron [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:52:18 compute-1 nova_compute[192795]: 2025-09-30 21:52:18.201 2 DEBUG nova.compute.manager [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-changed-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:18 compute-1 nova_compute[192795]: 2025-09-30 21:52:18.201 2 DEBUG nova.compute.manager [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Refreshing instance network info cache due to event network-changed-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:18 compute-1 nova_compute[192795]: 2025-09-30 21:52:18.202 2 DEBUG oslo_concurrency.lockutils [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:18 compute-1 unix_chkpwd[250528]: password check failed for user (root)
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.722 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.940 2 DEBUG nova.network.neutron [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updating instance_info_cache with network_info: [{"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.986 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.989 2 DEBUG oslo_concurrency.lockutils [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:19 compute-1 nova_compute[192795]: 2025-09-30 21:52:19.989 2 DEBUG nova.network.neutron [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Refreshing network info cache for port ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.148 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.150 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.150 2 INFO nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Creating image(s)
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.151 2 DEBUG nova.objects.instance [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5e3582bd-9296-455b-8ecc-fd02bf833409 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.182 2 DEBUG oslo_concurrency.processutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.246 2 DEBUG oslo_concurrency.processutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.247 2 DEBUG nova.virt.disk.api [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.247 2 DEBUG oslo_concurrency.processutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.312 2 DEBUG oslo_concurrency.processutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.313 2 DEBUG nova.virt.disk.api [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.347 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.347 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Ensure instance console log exists: /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.348 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.348 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.349 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.351 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Start _get_guest_xml network_info=[{"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--695170609", "vif_mac": "fa:16:3e:6d:67:d1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.356 2 WARNING nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.362 2 DEBUG nova.virt.libvirt.host [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.362 2 DEBUG nova.virt.libvirt.host [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.366 2 DEBUG nova.virt.libvirt.host [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.366 2 DEBUG nova.virt.libvirt.host [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.367 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.368 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9779bca-1eb6-4567-a36c-b452abeafc70',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.368 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.368 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.369 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.369 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.369 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.369 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.369 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.370 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.370 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.370 2 DEBUG nova.virt.hardware [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.370 2 DEBUG nova.objects.instance [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5e3582bd-9296-455b-8ecc-fd02bf833409 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.397 2 DEBUG oslo_concurrency.processutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.458 2 DEBUG oslo_concurrency.processutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.460 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.460 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.462 2 DEBUG oslo_concurrency.lockutils [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.463 2 DEBUG nova.virt.libvirt.vif [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1538252087',display_name='tempest-TestNetworkAdvancedServerOps-server-1538252087',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1538252087',id=175,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK64bE32ze2bid9dTwEhXemRPMHSKtSVUl9lfzt4XAJoS5dSKgAmI3qPCfSTAYLYKMKvsuZ8HypOczDqbqsWuJ67Tb4znaS44MoU7MaaI4ew894i8cIyDlKPTNVSQKAcBg==',key_name='tempest-TestNetworkAdvancedServerOps-471135388',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-a5qtxdyq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:16Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=5e3582bd-9296-455b-8ecc-fd02bf833409,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--695170609", "vif_mac": "fa:16:3e:6d:67:d1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.464 2 DEBUG nova.network.os_vif_util [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--695170609", "vif_mac": "fa:16:3e:6d:67:d1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.465 2 DEBUG nova.network.os_vif_util [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:67:d1,bridge_name='br-int',has_traffic_filtering=True,id=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e,network=Network(a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae5d6d4d-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.468 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <uuid>5e3582bd-9296-455b-8ecc-fd02bf833409</uuid>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <name>instance-000000af</name>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <memory>196608</memory>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1538252087</nova:name>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:52:20</nova:creationTime>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <nova:flavor name="m1.micro">
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:memory>192</nova:memory>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         <nova:port uuid="ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e">
Sep 30 21:52:20 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <system>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <entry name="serial">5e3582bd-9296-455b-8ecc-fd02bf833409</entry>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <entry name="uuid">5e3582bd-9296-455b-8ecc-fd02bf833409</entry>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </system>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <os>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   </os>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <features>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   </features>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/disk.config"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:6d:67:d1"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <target dev="tapae5d6d4d-3f"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409/console.log" append="off"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <video>
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </video>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:52:20 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:52:20 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:52:20 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:52:20 compute-1 nova_compute[192795]: </domain>
Sep 30 21:52:20 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.470 2 DEBUG nova.virt.libvirt.vif [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1538252087',display_name='tempest-TestNetworkAdvancedServerOps-server-1538252087',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1538252087',id=175,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK64bE32ze2bid9dTwEhXemRPMHSKtSVUl9lfzt4XAJoS5dSKgAmI3qPCfSTAYLYKMKvsuZ8HypOczDqbqsWuJ67Tb4znaS44MoU7MaaI4ew894i8cIyDlKPTNVSQKAcBg==',key_name='tempest-TestNetworkAdvancedServerOps-471135388',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-a5qtxdyq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:16Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=5e3582bd-9296-455b-8ecc-fd02bf833409,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--695170609", "vif_mac": "fa:16:3e:6d:67:d1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.471 2 DEBUG nova.network.os_vif_util [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--695170609", "vif_mac": "fa:16:3e:6d:67:d1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.472 2 DEBUG nova.network.os_vif_util [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:67:d1,bridge_name='br-int',has_traffic_filtering=True,id=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e,network=Network(a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae5d6d4d-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.473 2 DEBUG os_vif [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:67:d1,bridge_name='br-int',has_traffic_filtering=True,id=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e,network=Network(a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae5d6d4d-3f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.475 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.480 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae5d6d4d-3f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.481 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae5d6d4d-3f, col_values=(('external_ids', {'iface-id': 'ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:67:d1', 'vm-uuid': '5e3582bd-9296-455b-8ecc-fd02bf833409'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:20 compute-1 sshd-session[250430]: Failed password for root from 8.210.178.40 port 48168 ssh2
Sep 30 21:52:20 compute-1 NetworkManager[51724]: <info>  [1759269140.4845] manager: (tapae5d6d4d-3f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.493 2 INFO os_vif [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:67:d1,bridge_name='br-int',has_traffic_filtering=True,id=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e,network=Network(a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae5d6d4d-3f')
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.554 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.554 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.555 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:6d:67:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.555 2 INFO nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Using config drive
Sep 30 21:52:20 compute-1 kernel: tapae5d6d4d-3f: entered promiscuous mode
Sep 30 21:52:20 compute-1 NetworkManager[51724]: <info>  [1759269140.6270] manager: (tapae5d6d4d-3f): new Tun device (/org/freedesktop/NetworkManager/Devices/364)
Sep 30 21:52:20 compute-1 ovn_controller[94902]: 2025-09-30T21:52:20Z|00723|binding|INFO|Claiming lport ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e for this chassis.
Sep 30 21:52:20 compute-1 ovn_controller[94902]: 2025-09-30T21:52:20Z|00724|binding|INFO|ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e: Claiming fa:16:3e:6d:67:d1 10.100.0.3
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:20 compute-1 ovn_controller[94902]: 2025-09-30T21:52:20Z|00725|binding|INFO|Setting lport ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e ovn-installed in OVS
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:20 compute-1 ovn_controller[94902]: 2025-09-30T21:52:20Z|00726|binding|INFO|Setting lport ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e up in Southbound
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.646 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:67:d1 10.100.0.3'], port_security=['fa:16:3e:6d:67:d1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5e3582bd-9296-455b-8ecc-fd02bf833409', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ecdc4bda-6319-47ff-b06c-4aaecbba1494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76968323-ccfb-4fa7-a1e9-1797fe98366d, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.647 103861 INFO neutron.agent.ovn.metadata.agent [-] Port ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e in datapath a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2 bound to our chassis
Sep 30 21:52:20 compute-1 nova_compute[192795]: 2025-09-30 21:52:20.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.650 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2
Sep 30 21:52:20 compute-1 systemd-udevd[250556]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.665 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f90f713a-2e74-4e8e-ae5d-d7b2c8d2235a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.665 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1db7ad3-41 in ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.667 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1db7ad3-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.667 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[dcce673a-1639-4aff-b87d-10e448409a6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.668 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ec9ce787-9bf1-4343-889b-1badcbd82837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 systemd-machined[152783]: New machine qemu-81-instance-000000af.
Sep 30 21:52:20 compute-1 NetworkManager[51724]: <info>  [1759269140.6746] device (tapae5d6d4d-3f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:52:20 compute-1 NetworkManager[51724]: <info>  [1759269140.6757] device (tapae5d6d4d-3f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.680 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[878e3b7d-dbcb-4828-8eb9-7f379952a5f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 systemd[1]: Started Virtual Machine qemu-81-instance-000000af.
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.704 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb484a1-4d95-4db5-b79b-ca99d7727d02]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.733 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[263dc565-912b-4c6e-86c2-059541f1be0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 NetworkManager[51724]: <info>  [1759269140.7411] manager: (tapa1db7ad3-40): new Veth device (/org/freedesktop/NetworkManager/Devices/365)
Sep 30 21:52:20 compute-1 systemd-udevd[250560]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.741 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f75c7509-c4f1-44ff-b753-af823ba4d307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.778 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[99270985-373d-40a0-b716-e9e7b5538e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.782 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[01c01fe1-9243-4473-aa79-ffd291bde9f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 NetworkManager[51724]: <info>  [1759269140.8097] device (tapa1db7ad3-40): carrier: link connected
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.822 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[270aa73c-f0b9-4e21-8456-612b8c188c5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.841 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3b4b40-a357-4ae0-8773-7616a5dda36a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1db7ad3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:05:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583648, 'reachable_time': 20304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250589, 'error': None, 'target': 'ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.860 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6b04e608-dbe9-42c4-adf7-a9768da6ab77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:511'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583648, 'tstamp': 583648}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250590, 'error': None, 'target': 'ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.877 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cd3970-13e7-4e55-a226-e10bf87ba42d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1db7ad3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:05:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583648, 'reachable_time': 20304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250591, 'error': None, 'target': 'ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:20 compute-1 unix_chkpwd[250593]: password check failed for user (root)
Sep 30 21:52:20 compute-1 sshd-session[250529]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 21:52:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:20.914 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[728a7f0c-568c-438b-a68b-f90b256b7762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.015 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c96e1672-a739-4d9c-a2eb-4856679a8722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.017 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1db7ad3-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.017 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.018 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1db7ad3-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.029 2 DEBUG nova.compute.manager [req-6c8b2072-3574-49ef-83f5-8b9c20cd160a req-2cdc172f-25dd-47c2-82fc-9b248ebd200d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.030 2 DEBUG oslo_concurrency.lockutils [req-6c8b2072-3574-49ef-83f5-8b9c20cd160a req-2cdc172f-25dd-47c2-82fc-9b248ebd200d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.030 2 DEBUG oslo_concurrency.lockutils [req-6c8b2072-3574-49ef-83f5-8b9c20cd160a req-2cdc172f-25dd-47c2-82fc-9b248ebd200d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.030 2 DEBUG oslo_concurrency.lockutils [req-6c8b2072-3574-49ef-83f5-8b9c20cd160a req-2cdc172f-25dd-47c2-82fc-9b248ebd200d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.030 2 DEBUG nova.compute.manager [req-6c8b2072-3574-49ef-83f5-8b9c20cd160a req-2cdc172f-25dd-47c2-82fc-9b248ebd200d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] No waiting events found dispatching network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.031 2 WARNING nova.compute.manager [req-6c8b2072-3574-49ef-83f5-8b9c20cd160a req-2cdc172f-25dd-47c2-82fc-9b248ebd200d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received unexpected event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e for instance with vm_state active and task_state resize_finish.
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:21 compute-1 kernel: tapa1db7ad3-40: entered promiscuous mode
Sep 30 21:52:21 compute-1 NetworkManager[51724]: <info>  [1759269141.0609] manager: (tapa1db7ad3-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.064 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1db7ad3-40, col_values=(('external_ids', {'iface-id': '036a214f-99a1-419d-ac57-f1fcb0c213a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:21 compute-1 ovn_controller[94902]: 2025-09-30T21:52:21Z|00727|binding|INFO|Releasing lport 036a214f-99a1-419d-ac57-f1fcb0c213a5 from this chassis (sb_readonly=0)
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.068 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.069 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[be9ebd98-46d1-43b6-a920-d719cd957975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.069 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2.pid.haproxy
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:52:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:21.070 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'env', 'PROCESS_TAG=haproxy-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.336 2 DEBUG nova.network.neutron [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updated VIF entry in instance network info cache for port ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.339 2 DEBUG nova.network.neutron [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updating instance_info_cache with network_info: [{"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.365 2 DEBUG oslo_concurrency.lockutils [req-78dc99d4-06d8-48b3-adb8-feb1001226aa req-72d6d7ff-b671-4bbe-a007-a4c8471a372d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.367 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.367 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.369 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5e3582bd-9296-455b-8ecc-fd02bf833409 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:21 compute-1 podman[250631]: 2025-09-30 21:52:21.474405244 +0000 UTC m=+0.049439845 container create 92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:52:21 compute-1 systemd[1]: Started libpod-conmon-92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0.scope.
Sep 30 21:52:21 compute-1 podman[250631]: 2025-09-30 21:52:21.448015181 +0000 UTC m=+0.023049802 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:52:21 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:52:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61c57b028eee152626e8354c31002b857ed95743cc675be12e9fd2e3f75c392f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:52:21 compute-1 podman[250631]: 2025-09-30 21:52:21.569018469 +0000 UTC m=+0.144053090 container init 92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:52:21 compute-1 podman[250631]: 2025-09-30 21:52:21.575071102 +0000 UTC m=+0.150105703 container start 92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:52:21 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [NOTICE]   (250651) : New worker (250653) forked
Sep 30 21:52:21 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [NOTICE]   (250651) : Loading success.
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.643 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.644 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.645 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269141.643407, 5e3582bd-9296-455b-8ecc-fd02bf833409 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.645 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] VM Resumed (Lifecycle Event)
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.646 2 DEBUG nova.compute.manager [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.651 2 INFO nova.virt.libvirt.driver [-] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Instance running successfully.
Sep 30 21:52:21 compute-1 virtqemud[192217]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.653 2 DEBUG nova.virt.libvirt.guest [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.653 2 DEBUG nova.virt.libvirt.driver [None req-2f784ad6-e38a-4c41-90bf-74c62436be13 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.674 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.677 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.681 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.747 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.748 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269141.6440408, 5e3582bd-9296-455b-8ecc-fd02bf833409 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.748 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] VM Started (Lifecycle Event)
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.780 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.784 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.872 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.873 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.881 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:52:21 compute-1 nova_compute[192795]: 2025-09-30 21:52:21.881 2 INFO nova.compute.claims [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.049 2 DEBUG nova.compute.provider_tree [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.070 2 DEBUG nova.scheduler.client.report [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.106 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.107 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:52:22 compute-1 unix_chkpwd[250663]: password check failed for user (root)
Sep 30 21:52:22 compute-1 sshd-session[250529]: Failed password for root from 185.156.73.233 port 63344 ssh2
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.239 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.241 2 DEBUG nova.network.neutron [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.266 2 INFO nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.292 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.484 2 DEBUG nova.policy [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.512 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.514 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.515 2 INFO nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Creating image(s)
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.516 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "/var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.516 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "/var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.517 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "/var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.550 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:22 compute-1 sshd-session[250529]: Connection closed by authenticating user root 185.156.73.233 port 63344 [preauth]
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.653 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.654 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.655 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.672 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.711 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updating instance_info_cache with network_info: [{"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.755 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.755 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.765 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.765 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.809 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.810 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.811 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.873 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.875 2 DEBUG nova.virt.disk.api [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Checking if we can resize image /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.875 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.934 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.936 2 DEBUG nova.virt.disk.api [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Cannot resize image /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:52:22 compute-1 nova_compute[192795]: 2025-09-30 21:52:22.936 2 DEBUG nova.objects.instance [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lazy-loading 'migration_context' on Instance uuid 014e9b2f-936f-480e-a320-2d20f6fa98ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.031 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.032 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Ensure instance console log exists: /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.033 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.033 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.034 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.225 2 DEBUG nova.compute.manager [req-9b3a39f8-4d2e-41d4-bef6-433839b63ca2 req-43d2aff7-ab9a-4382-aeff-2fc9308277a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.225 2 DEBUG oslo_concurrency.lockutils [req-9b3a39f8-4d2e-41d4-bef6-433839b63ca2 req-43d2aff7-ab9a-4382-aeff-2fc9308277a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.226 2 DEBUG oslo_concurrency.lockutils [req-9b3a39f8-4d2e-41d4-bef6-433839b63ca2 req-43d2aff7-ab9a-4382-aeff-2fc9308277a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.226 2 DEBUG oslo_concurrency.lockutils [req-9b3a39f8-4d2e-41d4-bef6-433839b63ca2 req-43d2aff7-ab9a-4382-aeff-2fc9308277a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.227 2 DEBUG nova.compute.manager [req-9b3a39f8-4d2e-41d4-bef6-433839b63ca2 req-43d2aff7-ab9a-4382-aeff-2fc9308277a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] No waiting events found dispatching network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.227 2 WARNING nova.compute.manager [req-9b3a39f8-4d2e-41d4-bef6-433839b63ca2 req-43d2aff7-ab9a-4382-aeff-2fc9308277a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received unexpected event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e for instance with vm_state resized and task_state None.
Sep 30 21:52:23 compute-1 nova_compute[192795]: 2025-09-30 21:52:23.798 2 DEBUG nova.network.neutron [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Successfully created port: 8806e5a6-3463-43ee-aef5-01483480da59 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:24 compute-1 podman[250679]: 2025-09-30 21:52:24.21800677 +0000 UTC m=+0.057347359 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Sep 30 21:52:24 compute-1 sshd-session[250430]: Failed password for root from 8.210.178.40 port 48168 ssh2
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.827 2 DEBUG nova.network.neutron [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Successfully updated port: 8806e5a6-3463-43ee-aef5-01483480da59 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.914 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.914 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquired lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.915 2 DEBUG nova.network.neutron [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.972 2 DEBUG nova.compute.manager [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received event network-changed-8806e5a6-3463-43ee-aef5-01483480da59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.972 2 DEBUG nova.compute.manager [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Refreshing instance network info cache due to event network-changed-8806e5a6-3463-43ee-aef5-01483480da59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:24 compute-1 nova_compute[192795]: 2025-09-30 21:52:24.973 2 DEBUG oslo_concurrency.lockutils [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:25 compute-1 nova_compute[192795]: 2025-09-30 21:52:25.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:25 compute-1 unix_chkpwd[250699]: password check failed for user (root)
Sep 30 21:52:25 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:52:25 compute-1 systemd[250408]: Activating special unit Exit the Session...
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped target Main User Target.
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped target Basic System.
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped target Paths.
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped target Sockets.
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped target Timers.
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:52:25 compute-1 systemd[250408]: Closed D-Bus User Message Bus Socket.
Sep 30 21:52:25 compute-1 systemd[250408]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:52:25 compute-1 systemd[250408]: Removed slice User Application Slice.
Sep 30 21:52:25 compute-1 systemd[250408]: Reached target Shutdown.
Sep 30 21:52:25 compute-1 systemd[250408]: Finished Exit the Session.
Sep 30 21:52:25 compute-1 systemd[250408]: Reached target Exit the Session.
Sep 30 21:52:25 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:52:25 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:52:25 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:52:25 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:52:25 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:52:25 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:52:25 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:52:25 compute-1 nova_compute[192795]: 2025-09-30 21:52:25.959 2 DEBUG nova.network.neutron [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:52:26 compute-1 nova_compute[192795]: 2025-09-30 21:52:26.750 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:27 compute-1 sshd-session[250430]: Failed password for root from 8.210.178.40 port 48168 ssh2
Sep 30 21:52:27 compute-1 nova_compute[192795]: 2025-09-30 21:52:27.955 2 DEBUG nova.network.neutron [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Updating instance_info_cache with network_info: [{"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:27 compute-1 nova_compute[192795]: 2025-09-30 21:52:27.992 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Releasing lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:27 compute-1 nova_compute[192795]: 2025-09-30 21:52:27.992 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Instance network_info: |[{"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:52:27 compute-1 nova_compute[192795]: 2025-09-30 21:52:27.993 2 DEBUG oslo_concurrency.lockutils [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:27 compute-1 nova_compute[192795]: 2025-09-30 21:52:27.993 2 DEBUG nova.network.neutron [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Refreshing network info cache for port 8806e5a6-3463-43ee-aef5-01483480da59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:27 compute-1 nova_compute[192795]: 2025-09-30 21:52:27.996 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Start _get_guest_xml network_info=[{"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.003 2 WARNING nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.010 2 DEBUG nova.virt.libvirt.host [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.012 2 DEBUG nova.virt.libvirt.host [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.016 2 DEBUG nova.virt.libvirt.host [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.017 2 DEBUG nova.virt.libvirt.host [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.018 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.019 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.019 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.019 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.020 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.020 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.020 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.020 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.021 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.021 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.021 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.021 2 DEBUG nova.virt.hardware [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.025 2 DEBUG nova.virt.libvirt.vif [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:52:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-70187108',display_name='tempest-TestServerBasicOps-server-70187108',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-70187108',id=178,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIyzriIScYOmBfIK6W7fMuNgPa5v7NwPylJUkiBv2Gd8HXqCDL8rRINEcrCbJXn/NhtVhKHpMMvZmHxLI0nQZPUJoIp7f9iKXMPElhWdDUN1uySAtHG0Gra6rGJdPHWsuQ==',key_name='tempest-TestServerBasicOps-1989430457',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf9b66a62c37489792c3bdff7dfdb47f',ramdisk_id='',reservation_id='r-0pneisxq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1078278889',owner_user_name='tempest-TestServerBasicOps-1078278889-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d5901e3d1f454c3ebc6d467bb263431f',uuid=014e9b2f-936f-480e-a320-2d20f6fa98ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.026 2 DEBUG nova.network.os_vif_util [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Converting VIF {"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.026 2 DEBUG nova.network.os_vif_util [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:7f:ec,bridge_name='br-int',has_traffic_filtering=True,id=8806e5a6-3463-43ee-aef5-01483480da59,network=Network(2ea0f3f8-3465-4da1-824e-159a41073230),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8806e5a6-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.027 2 DEBUG nova.objects.instance [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lazy-loading 'pci_devices' on Instance uuid 014e9b2f-936f-480e-a320-2d20f6fa98ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.050 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <uuid>014e9b2f-936f-480e-a320-2d20f6fa98ce</uuid>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <name>instance-000000b2</name>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <nova:name>tempest-TestServerBasicOps-server-70187108</nova:name>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:52:28</nova:creationTime>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:user uuid="d5901e3d1f454c3ebc6d467bb263431f">tempest-TestServerBasicOps-1078278889-project-member</nova:user>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:project uuid="bf9b66a62c37489792c3bdff7dfdb47f">tempest-TestServerBasicOps-1078278889</nova:project>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         <nova:port uuid="8806e5a6-3463-43ee-aef5-01483480da59">
Sep 30 21:52:28 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <system>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <entry name="serial">014e9b2f-936f-480e-a320-2d20f6fa98ce</entry>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <entry name="uuid">014e9b2f-936f-480e-a320-2d20f6fa98ce</entry>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </system>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <os>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   </os>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <features>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   </features>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.config"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:32:7f:ec"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <target dev="tap8806e5a6-34"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/console.log" append="off"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <video>
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </video>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:52:28 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:52:28 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:52:28 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:52:28 compute-1 nova_compute[192795]: </domain>
Sep 30 21:52:28 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.051 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Preparing to wait for external event network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.052 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.052 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.052 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.053 2 DEBUG nova.virt.libvirt.vif [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:52:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-70187108',display_name='tempest-TestServerBasicOps-server-70187108',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-70187108',id=178,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIyzriIScYOmBfIK6W7fMuNgPa5v7NwPylJUkiBv2Gd8HXqCDL8rRINEcrCbJXn/NhtVhKHpMMvZmHxLI0nQZPUJoIp7f9iKXMPElhWdDUN1uySAtHG0Gra6rGJdPHWsuQ==',key_name='tempest-TestServerBasicOps-1989430457',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf9b66a62c37489792c3bdff7dfdb47f',ramdisk_id='',reservation_id='r-0pneisxq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1078278889',owner_user_name='tempest-TestServerBasicOps-1078278889-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d5901e3d1f454c3ebc6d467bb263431f',uuid=014e9b2f-936f-480e-a320-2d20f6fa98ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.053 2 DEBUG nova.network.os_vif_util [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Converting VIF {"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.054 2 DEBUG nova.network.os_vif_util [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:7f:ec,bridge_name='br-int',has_traffic_filtering=True,id=8806e5a6-3463-43ee-aef5-01483480da59,network=Network(2ea0f3f8-3465-4da1-824e-159a41073230),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8806e5a6-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.054 2 DEBUG os_vif [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:7f:ec,bridge_name='br-int',has_traffic_filtering=True,id=8806e5a6-3463-43ee-aef5-01483480da59,network=Network(2ea0f3f8-3465-4da1-824e-159a41073230),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8806e5a6-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.055 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.055 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.058 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8806e5a6-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.059 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8806e5a6-34, col_values=(('external_ids', {'iface-id': '8806e5a6-3463-43ee-aef5-01483480da59', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:7f:ec', 'vm-uuid': '014e9b2f-936f-480e-a320-2d20f6fa98ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:28 compute-1 NetworkManager[51724]: <info>  [1759269148.0628] manager: (tap8806e5a6-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.069 2 INFO os_vif [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:7f:ec,bridge_name='br-int',has_traffic_filtering=True,id=8806e5a6-3463-43ee-aef5-01483480da59,network=Network(2ea0f3f8-3465-4da1-824e-159a41073230),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8806e5a6-34')
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.323 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.324 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.325 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] No VIF found with MAC fa:16:3e:32:7f:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:52:28 compute-1 nova_compute[192795]: 2025-09-30 21:52:28.326 2 INFO nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Using config drive
Sep 30 21:52:29 compute-1 unix_chkpwd[250704]: password check failed for user (root)
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.198 2 INFO nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Creating config drive at /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.config
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.205 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiehs4s96 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.339 2 DEBUG oslo_concurrency.processutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiehs4s96" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:29 compute-1 kernel: tap8806e5a6-34: entered promiscuous mode
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-1 ovn_controller[94902]: 2025-09-30T21:52:29Z|00728|binding|INFO|Claiming lport 8806e5a6-3463-43ee-aef5-01483480da59 for this chassis.
Sep 30 21:52:29 compute-1 ovn_controller[94902]: 2025-09-30T21:52:29Z|00729|binding|INFO|8806e5a6-3463-43ee-aef5-01483480da59: Claiming fa:16:3e:32:7f:ec 10.100.0.12
Sep 30 21:52:29 compute-1 NetworkManager[51724]: <info>  [1759269149.4454] manager: (tap8806e5a6-34): new Tun device (/org/freedesktop/NetworkManager/Devices/368)
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.461 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:7f:ec 10.100.0.12'], port_security=['fa:16:3e:32:7f:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ea0f3f8-3465-4da1-824e-159a41073230', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5aaeaec0-d649-4040-b726-c1b3a3e877c9 6eb38129-ddeb-4fde-9f6f-50f70e37a068', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f762d1d8-ecd8-4f73-b47e-4f506fe1de65, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8806e5a6-3463-43ee-aef5-01483480da59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:29 compute-1 ovn_controller[94902]: 2025-09-30T21:52:29Z|00730|binding|INFO|Setting lport 8806e5a6-3463-43ee-aef5-01483480da59 ovn-installed in OVS
Sep 30 21:52:29 compute-1 ovn_controller[94902]: 2025-09-30T21:52:29Z|00731|binding|INFO|Setting lport 8806e5a6-3463-43ee-aef5-01483480da59 up in Southbound
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.465 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8806e5a6-3463-43ee-aef5-01483480da59 in datapath 2ea0f3f8-3465-4da1-824e-159a41073230 bound to our chassis
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.467 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ea0f3f8-3465-4da1-824e-159a41073230
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.486 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ab71ffd3-c7a4-414c-86e5-afe040767dd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.488 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2ea0f3f8-31 in ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.491 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2ea0f3f8-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.492 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[abdb41a0-5ecf-4008-be05-7733ad3d16a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.493 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[430c6836-0304-49e7-9c9f-23a3ea6c4110]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 systemd-udevd[250767]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.509 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7e9580-fb83-4829-b645-baa4e0808df2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 systemd-machined[152783]: New machine qemu-82-instance-000000b2.
Sep 30 21:52:29 compute-1 NetworkManager[51724]: <info>  [1759269149.5200] device (tap8806e5a6-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:52:29 compute-1 NetworkManager[51724]: <info>  [1759269149.5215] device (tap8806e5a6-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:52:29 compute-1 systemd[1]: Started Virtual Machine qemu-82-instance-000000b2.
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.527 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c22d3d-6a12-43b5-a4c9-d3124a3dca09]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 podman[250715]: 2025-09-30 21:52:29.535372406 +0000 UTC m=+0.112239611 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Sep 30 21:52:29 compute-1 podman[250716]: 2025-09-30 21:52:29.543459464 +0000 UTC m=+0.119343342 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:52:29 compute-1 podman[250717]: 2025-09-30 21:52:29.552807647 +0000 UTC m=+0.110801592 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.571 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb3dc20-f875-4d1d-872e-cd9f8b798ccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 NetworkManager[51724]: <info>  [1759269149.5822] manager: (tap2ea0f3f8-30): new Veth device (/org/freedesktop/NetworkManager/Devices/369)
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.580 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b3793553-461b-4ec9-b1bf-75346212e783]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 systemd-udevd[250783]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.614 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[5e0c6921-e4a0-4f95-9d47-b761b1c15365]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.618 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[210eabc5-5008-4ca5-96f6-1027024a39e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 NetworkManager[51724]: <info>  [1759269149.6456] device (tap2ea0f3f8-30): carrier: link connected
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.653 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[e5828f70-0222-4c27-94bb-d8d8a9711a23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.671 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c6cff617-49a6-4b42-90cc-3c0f4f17e7da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ea0f3f8-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:93:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584531, 'reachable_time': 16051, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250813, 'error': None, 'target': 'ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.695 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[934d489e-f685-4236-a76d-59596a789b9b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:9314'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 584531, 'tstamp': 584531}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250814, 'error': None, 'target': 'ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.733 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c57ad180-06c8-40c0-9b06-6c8e5bab07e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ea0f3f8-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:93:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584531, 'reachable_time': 16051, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250816, 'error': None, 'target': 'ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.784 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[107ec995-8b4e-494b-88d0-c5ca751097e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.899 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4be91e15-b10c-46fd-ac2b-36b095189a75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.901 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ea0f3f8-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.901 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.901 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ea0f3f8-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-1 NetworkManager[51724]: <info>  [1759269149.9047] manager: (tap2ea0f3f8-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Sep 30 21:52:29 compute-1 kernel: tap2ea0f3f8-30: entered promiscuous mode
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.908 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ea0f3f8-30, col_values=(('external_ids', {'iface-id': 'f1ad069a-d0ee-45c6-8d17-4565ef1a2c70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:29 compute-1 ovn_controller[94902]: 2025-09-30T21:52:29Z|00732|binding|INFO|Releasing lport f1ad069a-d0ee-45c6-8d17-4565ef1a2c70 from this chassis (sb_readonly=0)
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.938 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2ea0f3f8-3465-4da1-824e-159a41073230.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2ea0f3f8-3465-4da1-824e-159a41073230.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.944 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8a809e-ef99-430c-abd3-a69dbb09ccc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.945 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-2ea0f3f8-3465-4da1-824e-159a41073230
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/2ea0f3f8-3465-4da1-824e-159a41073230.pid.haproxy
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 2ea0f3f8-3465-4da1-824e-159a41073230
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.946 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230', 'env', 'PROCESS_TAG=haproxy-2ea0f3f8-3465-4da1-824e-159a41073230', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2ea0f3f8-3465-4da1-824e-159a41073230.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:52:29 compute-1 nova_compute[192795]: 2025-09-30 21:52:29.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:29.986 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.276 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269150.276243, 014e9b2f-936f-480e-a320-2d20f6fa98ce => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.277 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] VM Started (Lifecycle Event)
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.320 2 DEBUG nova.compute.manager [req-36517617-d14b-426a-9751-488a6c7ace86 req-b375236e-f145-465b-868f-4c507431af5b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received event network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.321 2 DEBUG oslo_concurrency.lockutils [req-36517617-d14b-426a-9751-488a6c7ace86 req-b375236e-f145-465b-868f-4c507431af5b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.321 2 DEBUG oslo_concurrency.lockutils [req-36517617-d14b-426a-9751-488a6c7ace86 req-b375236e-f145-465b-868f-4c507431af5b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.321 2 DEBUG oslo_concurrency.lockutils [req-36517617-d14b-426a-9751-488a6c7ace86 req-b375236e-f145-465b-868f-4c507431af5b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.321 2 DEBUG nova.compute.manager [req-36517617-d14b-426a-9751-488a6c7ace86 req-b375236e-f145-465b-868f-4c507431af5b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Processing event network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.323 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.325 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.330 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.334 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.339 2 INFO nova.virt.libvirt.driver [-] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Instance spawned successfully.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.340 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.363 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.364 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269150.2763965, 014e9b2f-936f-480e-a320-2d20f6fa98ce => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.364 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] VM Paused (Lifecycle Event)
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.381 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.382 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.382 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.383 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.383 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.385 2 DEBUG nova.virt.libvirt.driver [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.405 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:30 compute-1 podman[250851]: 2025-09-30 21:52:30.406113772 +0000 UTC m=+0.080389961 container create 93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.419 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269150.329099, 014e9b2f-936f-480e-a320-2d20f6fa98ce => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.420 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] VM Resumed (Lifecycle Event)
Sep 30 21:52:30 compute-1 systemd[1]: Started libpod-conmon-93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e.scope.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.460 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:30 compute-1 podman[250851]: 2025-09-30 21:52:30.376090102 +0000 UTC m=+0.050366321 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.471 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.474 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.474 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.474 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.475 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:30 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.486 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb2ee61a0b6e72faafb0f269f5f3787cccf2dd964ab23777f44061cfad8bd2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:52:30 compute-1 podman[250851]: 2025-09-30 21:52:30.501472596 +0000 UTC m=+0.175748815 container init 93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:52:30 compute-1 podman[250851]: 2025-09-30 21:52:30.50865485 +0000 UTC m=+0.182931039 container start 93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.521 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.541 2 INFO nova.compute.manager [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Terminating instance
Sep 30 21:52:30 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [NOTICE]   (250868) : New worker (250870) forked
Sep 30 21:52:30 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [NOTICE]   (250868) : Loading success.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.557 2 INFO nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Took 8.04 seconds to spawn the instance on the hypervisor.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.559 2 DEBUG nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.572 2 DEBUG nova.compute.manager [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.593 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:52:30 compute-1 kernel: tap8f5bcdeb-0e (unregistering): left promiscuous mode
Sep 30 21:52:30 compute-1 NetworkManager[51724]: <info>  [1759269150.6056] device (tap8f5bcdeb-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:52:30 compute-1 ovn_controller[94902]: 2025-09-30T21:52:30Z|00733|binding|INFO|Releasing lport 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 from this chassis (sb_readonly=0)
Sep 30 21:52:30 compute-1 ovn_controller[94902]: 2025-09-30T21:52:30Z|00734|binding|INFO|Setting lport 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 down in Southbound
Sep 30 21:52:30 compute-1 ovn_controller[94902]: 2025-09-30T21:52:30Z|00735|binding|INFO|Removing iface tap8f5bcdeb-0e ovn-installed in OVS
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.623 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:cc:8d 10.100.0.3'], port_security=['fa:16:3e:5f:cc:8d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79474bd3-ac2c-4f66-83f8-3a487e22d9d3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.625 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 in datapath 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc unbound from our chassis
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.629 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.631 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d1721110-4548-464f-9b54-354a2d7d702d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.634 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc namespace which is not needed anymore
Sep 30 21:52:30 compute-1 kernel: tapcaba4afa-56 (unregistering): left promiscuous mode
Sep 30 21:52:30 compute-1 NetworkManager[51724]: <info>  [1759269150.6402] device (tapcaba4afa-56): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:52:30 compute-1 ovn_controller[94902]: 2025-09-30T21:52:30Z|00736|binding|INFO|Releasing lport caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df from this chassis (sb_readonly=0)
Sep 30 21:52:30 compute-1 ovn_controller[94902]: 2025-09-30T21:52:30Z|00737|binding|INFO|Setting lport caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df down in Southbound
Sep 30 21:52:30 compute-1 ovn_controller[94902]: 2025-09-30T21:52:30Z|00738|binding|INFO|Removing iface tapcaba4afa-56 ovn-installed in OVS
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.683 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ca:29 2001:db8::f816:3eff:feef:ca29'], port_security=['fa:16:3e:ef:ca:29 2001:db8::f816:3eff:feef:ca29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feef:ca29/64', 'neutron:device_id': '3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ebdd42c-51b5-4b83-8a22-52988a595a24, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:30 compute-1 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Sep 30 21:52:30 compute-1 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000b0.scope: Consumed 14.532s CPU time.
Sep 30 21:52:30 compute-1 systemd-machined[152783]: Machine qemu-80-instance-000000b0 terminated.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.700 2 INFO nova.compute.manager [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Took 8.87 seconds to build instance.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.731 2 DEBUG oslo_concurrency.lockutils [None req-6898837a-141b-47d4-b7d6-2e72a077832b d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:30 compute-1 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[250287]: [NOTICE]   (250291) : haproxy version is 2.8.14-c23fe91
Sep 30 21:52:30 compute-1 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[250287]: [NOTICE]   (250291) : path to executable is /usr/sbin/haproxy
Sep 30 21:52:30 compute-1 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[250287]: [WARNING]  (250291) : Exiting Master process...
Sep 30 21:52:30 compute-1 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[250287]: [ALERT]    (250291) : Current worker (250293) exited with code 143 (Terminated)
Sep 30 21:52:30 compute-1 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[250287]: [WARNING]  (250291) : All workers exited. Exiting... (0)
Sep 30 21:52:30 compute-1 systemd[1]: libpod-4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c.scope: Deactivated successfully.
Sep 30 21:52:30 compute-1 podman[250902]: 2025-09-30 21:52:30.801984319 +0000 UTC m=+0.057551955 container died 4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 21:52:30 compute-1 NetworkManager[51724]: <info>  [1759269150.8116] manager: (tapcaba4afa-56): new Tun device (/org/freedesktop/NetworkManager/Devices/371)
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-4b1ac97407be1f8fa2e516e26bc25bd7b6c20af341535a9c9d1b355772125b4a-merged.mount: Deactivated successfully.
Sep 30 21:52:30 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:52:30 compute-1 podman[250902]: 2025-09-30 21:52:30.862111972 +0000 UTC m=+0.117679588 container cleanup 4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.875 2 INFO nova.virt.libvirt.driver [-] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Instance destroyed successfully.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.877 2 DEBUG nova.objects.instance [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:30 compute-1 systemd[1]: libpod-conmon-4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c.scope: Deactivated successfully.
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.898 2 DEBUG nova.virt.libvirt.vif [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-568444683',display_name='tempest-TestGettingAddress-server-568444683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-568444683',id=176,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:52:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-dg3z9z13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:52:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.898 2 DEBUG nova.network.os_vif_util [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "address": "fa:16:3e:5f:cc:8d", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f5bcdeb-0e", "ovs_interfaceid": "8f5bcdeb-0e1f-4339-9b7e-10aa78da6494", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.899 2 DEBUG nova.network.os_vif_util [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:cc:8d,bridge_name='br-int',has_traffic_filtering=True,id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f5bcdeb-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.899 2 DEBUG os_vif [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:cc:8d,bridge_name='br-int',has_traffic_filtering=True,id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f5bcdeb-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.901 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f5bcdeb-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.910 2 INFO os_vif [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:cc:8d,bridge_name='br-int',has_traffic_filtering=True,id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f5bcdeb-0e')
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.911 2 DEBUG nova.virt.libvirt.vif [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-568444683',display_name='tempest-TestGettingAddress-server-568444683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-568444683',id=176,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:52:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-dg3z9z13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:52:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.911 2 DEBUG nova.network.os_vif_util [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.912 2 DEBUG nova.network.os_vif_util [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ca:29,bridge_name='br-int',has_traffic_filtering=True,id=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba4afa-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.912 2 DEBUG os_vif [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ca:29,bridge_name='br-int',has_traffic_filtering=True,id=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba4afa-56') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.913 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcaba4afa-56, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.920 2 INFO os_vif [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ca:29,bridge_name='br-int',has_traffic_filtering=True,id=caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcaba4afa-56')
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.921 2 INFO nova.virt.libvirt.driver [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Deleting instance files /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37_del
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.921 2 INFO nova.virt.libvirt.driver [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Deletion of /var/lib/nova/instances/3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37_del complete
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.936 2 DEBUG nova.network.neutron [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Updated VIF entry in instance network info cache for port 8806e5a6-3463-43ee-aef5-01483480da59. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.937 2 DEBUG nova.network.neutron [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Updating instance_info_cache with network_info: [{"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:30 compute-1 podman[250956]: 2025-09-30 21:52:30.951945947 +0000 UTC m=+0.055138379 container remove 4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.958 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3998fa5c-7bb8-4329-a238-7942b8af4145]: (4, ('Tue Sep 30 09:52:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc (4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c)\n4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c\nTue Sep 30 09:52:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc (4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c)\n4b0a4068e0bdb2a0ecf032e59f2d3ce94609cd04a40f324a2789c7a747b3a91c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.960 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bc654f9e-0ac8-48dd-bd42-883ea7cf5846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.962 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c85d4d4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:30 compute-1 kernel: tap4c85d4d4-30: left promiscuous mode
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.979 2 DEBUG oslo_concurrency.lockutils [req-7bc2d840-4886-4a79-ab10-39ab36a9e560 req-7af80cab-7823-486c-a2ed-b93472fe0fd3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:30 compute-1 nova_compute[192795]: 2025-09-30 21:52:30.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:30.989 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[815257c8-7c71-44ff-b7dc-ea89b869ad48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.020 2 INFO nova.compute.manager [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Took 0.45 seconds to destroy the instance on the hypervisor.
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.020 2 DEBUG oslo.service.loopingcall [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.021 2 DEBUG nova.compute.manager [-] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.021 2 DEBUG nova.network.neutron [-] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.024 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0c589975-525d-4991-8dce-fe79e0f2f957]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.026 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b335f747-0a2a-49fc-b68a-9a97938c64b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.051 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[426d5c29-9e22-4e3f-8bf2-7203f6bebf0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581730, 'reachable_time': 22076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250972, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 systemd[1]: run-netns-ovnmeta\x2d4c85d4d4\x2d3eb7\x2d42ce\x2dbfd6\x2d6de03fb781cc.mount: Deactivated successfully.
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.070 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.070 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[aae5d861-0534-4d0d-ab7a-7b7e711e5b17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.071 103861 INFO neutron.agent.ovn.metadata.agent [-] Port caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df in datapath cd6da069-7a88-49b7-bea7-1ceb7132f614 unbound from our chassis
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.073 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd6da069-7a88-49b7-bea7-1ceb7132f614, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.074 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcc24d1-42a4-487d-b34a-8a6376924adf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.075 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 namespace which is not needed anymore
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.184 2 DEBUG nova.compute.manager [req-cdb8870c-91fd-439a-b737-75f4f604cf37 req-1f951611-7deb-4350-b41a-73347b27f334 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-unplugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.184 2 DEBUG oslo_concurrency.lockutils [req-cdb8870c-91fd-439a-b737-75f4f604cf37 req-1f951611-7deb-4350-b41a-73347b27f334 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.185 2 DEBUG oslo_concurrency.lockutils [req-cdb8870c-91fd-439a-b737-75f4f604cf37 req-1f951611-7deb-4350-b41a-73347b27f334 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.185 2 DEBUG oslo_concurrency.lockutils [req-cdb8870c-91fd-439a-b737-75f4f604cf37 req-1f951611-7deb-4350-b41a-73347b27f334 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.185 2 DEBUG nova.compute.manager [req-cdb8870c-91fd-439a-b737-75f4f604cf37 req-1f951611-7deb-4350-b41a-73347b27f334 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] No waiting events found dispatching network-vif-unplugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.185 2 DEBUG nova.compute.manager [req-cdb8870c-91fd-439a-b737-75f4f604cf37 req-1f951611-7deb-4350-b41a-73347b27f334 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-unplugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:52:31 compute-1 sshd-session[250430]: Failed password for root from 8.210.178.40 port 48168 ssh2
Sep 30 21:52:31 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [NOTICE]   (250363) : haproxy version is 2.8.14-c23fe91
Sep 30 21:52:31 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [NOTICE]   (250363) : path to executable is /usr/sbin/haproxy
Sep 30 21:52:31 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [WARNING]  (250363) : Exiting Master process...
Sep 30 21:52:31 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [WARNING]  (250363) : Exiting Master process...
Sep 30 21:52:31 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [ALERT]    (250363) : Current worker (250365) exited with code 143 (Terminated)
Sep 30 21:52:31 compute-1 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[250359]: [WARNING]  (250363) : All workers exited. Exiting... (0)
Sep 30 21:52:31 compute-1 systemd[1]: libpod-36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832.scope: Deactivated successfully.
Sep 30 21:52:31 compute-1 conmon[250359]: conmon 36a08452d6c3a2f2efc1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832.scope/container/memory.events
Sep 30 21:52:31 compute-1 podman[250990]: 2025-09-30 21:52:31.245680407 +0000 UTC m=+0.062944130 container died 36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:52:31 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832-userdata-shm.mount: Deactivated successfully.
Sep 30 21:52:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-051a6de123c1fc68de44ee6639d25de6fd6e0ee1919ea2afbfa54984fdd94399-merged.mount: Deactivated successfully.
Sep 30 21:52:31 compute-1 podman[250990]: 2025-09-30 21:52:31.320468986 +0000 UTC m=+0.137732689 container cleanup 36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:52:31 compute-1 systemd[1]: libpod-conmon-36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832.scope: Deactivated successfully.
Sep 30 21:52:31 compute-1 podman[251018]: 2025-09-30 21:52:31.397767382 +0000 UTC m=+0.050082432 container remove 36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.404 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cc22edf2-0fa9-4e39-b5ee-ec2cbfc69bd3]: (4, ('Tue Sep 30 09:52:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 (36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832)\n36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832\nTue Sep 30 09:52:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 (36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832)\n36a08452d6c3a2f2efc1bf46badded1a13bacd27d076bcc2ab1de6491ead3832\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.406 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a17eb24a-2cfd-4ad9-a923-d22d56413f26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.408 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd6da069-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:31 compute-1 kernel: tapcd6da069-70: left promiscuous mode
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:31 compute-1 nova_compute[192795]: 2025-09-30 21:52:31.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.431 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1a762002-3dc6-47c9-8bfe-868f019b8a80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.457 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a59727a3-0348-4e0c-97cb-2bafb243e7ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.459 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0c8f45-b395-4433-9cda-f82195800213]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.482 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e568786-4ae7-4489-900a-dcd81c1ba8fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581829, 'reachable_time': 39955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251035, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.484 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:52:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:31.484 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[98f8a3bf-47cb-43d1-97d9-7949d8a9fdbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:31 compute-1 systemd[1]: run-netns-ovnmeta\x2dcd6da069\x2d7a88\x2d49b7\x2dbea7\x2d1ceb7132f614.mount: Deactivated successfully.
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.029 2 DEBUG nova.compute.manager [req-447ae777-ad68-4e79-97cd-07584424510d req-bc6cbc13-5874-4b22-b002-cf2cbc3c4061 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-deleted-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.029 2 INFO nova.compute.manager [req-447ae777-ad68-4e79-97cd-07584424510d req-bc6cbc13-5874-4b22-b002-cf2cbc3c4061 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Neutron deleted interface 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494; detaching it from the instance and deleting it from the info cache
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.030 2 DEBUG nova.network.neutron [req-447ae777-ad68-4e79-97cd-07584424510d req-bc6cbc13-5874-4b22-b002-cf2cbc3c4061 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updating instance_info_cache with network_info: [{"id": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "address": "fa:16:3e:ef:ca:29", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feef:ca29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcaba4afa-56", "ovs_interfaceid": "caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.071 2 DEBUG nova.compute.manager [req-447ae777-ad68-4e79-97cd-07584424510d req-bc6cbc13-5874-4b22-b002-cf2cbc3c4061 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Detach interface failed, port_id=8f5bcdeb-0e1f-4339-9b7e-10aa78da6494, reason: Instance 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.362 2 DEBUG nova.network.neutron [-] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.392 2 INFO nova.compute.manager [-] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Took 1.37 seconds to deallocate network for instance.
Sep 30 21:52:32 compute-1 unix_chkpwd[251036]: password check failed for user (root)
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.430 2 DEBUG nova.compute.manager [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-changed-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.431 2 DEBUG nova.compute.manager [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing instance network info cache due to event network-changed-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.431 2 DEBUG oslo_concurrency.lockutils [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.431 2 DEBUG oslo_concurrency.lockutils [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.432 2 DEBUG nova.network.neutron [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Refreshing network info cache for port 8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.517 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.518 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.567 2 DEBUG nova.network.neutron [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:52:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:32.595 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.607 2 DEBUG nova.compute.provider_tree [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.627 2 DEBUG nova.scheduler.client.report [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.655 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.714 2 INFO nova.scheduler.client.report [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.867 2 DEBUG oslo_concurrency.lockutils [None req-245892ba-f1e1-4c2a-a413-dc93c3dfcf27 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.393s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.952 2 DEBUG nova.network.neutron [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.991 2 DEBUG oslo_concurrency.lockutils [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.992 2 DEBUG nova.compute.manager [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received event network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.992 2 DEBUG oslo_concurrency.lockutils [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.992 2 DEBUG oslo_concurrency.lockutils [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.993 2 DEBUG oslo_concurrency.lockutils [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.993 2 DEBUG nova.compute.manager [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] No waiting events found dispatching network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:32 compute-1 nova_compute[192795]: 2025-09-30 21:52:32.993 2 WARNING nova.compute.manager [req-573818f6-f798-4080-b81e-65695e568edb req-3d8da7c8-7194-4625-b300-aaa07f413fde dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received unexpected event network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 for instance with vm_state active and task_state None.
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.260 2 DEBUG nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.261 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.264 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.264 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.264 2 DEBUG nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] No waiting events found dispatching network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.265 2 WARNING nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received unexpected event network-vif-plugged-8f5bcdeb-0e1f-4339-9b7e-10aa78da6494 for instance with vm_state deleted and task_state None.
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.265 2 DEBUG nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-unplugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.266 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.266 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.266 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.267 2 DEBUG nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] No waiting events found dispatching network-vif-unplugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.267 2 WARNING nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received unexpected event network-vif-unplugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df for instance with vm_state deleted and task_state None.
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.267 2 DEBUG nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.268 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.268 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.268 2 DEBUG oslo_concurrency.lockutils [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.269 2 DEBUG nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] No waiting events found dispatching network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:33 compute-1 nova_compute[192795]: 2025-09-30 21:52:33.269 2 WARNING nova.compute.manager [req-466a5415-2e98-44c7-ba8a-9012dbe574ec req-5de8834c-0cdf-44d1-abc9-b3ec48da8c09 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received unexpected event network-vif-plugged-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df for instance with vm_state deleted and task_state None.
Sep 30 21:52:33 compute-1 sshd-session[250430]: Failed password for root from 8.210.178.40 port 48168 ssh2
Sep 30 21:52:33 compute-1 ovn_controller[94902]: 2025-09-30T21:52:33Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:67:d1 10.100.0.3
Sep 30 21:52:34 compute-1 sshd-session[250430]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 48168 ssh2 [preauth]
Sep 30 21:52:34 compute-1 sshd-session[250430]: Disconnecting authenticating user root 8.210.178.40 port 48168: Too many authentication failures [preauth]
Sep 30 21:52:34 compute-1 sshd-session[250430]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:34 compute-1 sshd-session[250430]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:52:34 compute-1 nova_compute[192795]: 2025-09-30 21:52:34.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:34 compute-1 nova_compute[192795]: 2025-09-30 21:52:34.247 2 DEBUG nova.compute.manager [req-f845c3b1-3152-4d36-a7b1-f311befb3ebc req-6f64efb2-c3f0-40c8-8ff1-c3094550a844 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Received event network-vif-deleted-caba4afa-56c2-4b6b-bc7f-0c7e94b0f4df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:35 compute-1 unix_chkpwd[251046]: password check failed for user (root)
Sep 30 21:52:35 compute-1 sshd-session[251044]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:35 compute-1 nova_compute[192795]: 2025-09-30 21:52:35.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:36 compute-1 nova_compute[192795]: 2025-09-30 21:52:36.343 2 DEBUG nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received event network-changed-8806e5a6-3463-43ee-aef5-01483480da59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:36 compute-1 nova_compute[192795]: 2025-09-30 21:52:36.343 2 DEBUG nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Refreshing instance network info cache due to event network-changed-8806e5a6-3463-43ee-aef5-01483480da59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:36 compute-1 nova_compute[192795]: 2025-09-30 21:52:36.344 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:36 compute-1 nova_compute[192795]: 2025-09-30 21:52:36.344 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:36 compute-1 nova_compute[192795]: 2025-09-30 21:52:36.344 2 DEBUG nova.network.neutron [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Refreshing network info cache for port 8806e5a6-3463-43ee-aef5-01483480da59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:37 compute-1 nova_compute[192795]: 2025-09-30 21:52:37.982 2 DEBUG nova.network.neutron [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Updated VIF entry in instance network info cache for port 8806e5a6-3463-43ee-aef5-01483480da59. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:37 compute-1 nova_compute[192795]: 2025-09-30 21:52:37.983 2 DEBUG nova.network.neutron [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Updating instance_info_cache with network_info: [{"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:38 compute-1 nova_compute[192795]: 2025-09-30 21:52:38.035 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-014e9b2f-936f-480e-a320-2d20f6fa98ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:38 compute-1 sshd-session[251044]: Failed password for root from 8.210.178.40 port 48828 ssh2
Sep 30 21:52:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:38.712 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:38.714 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:38.717 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:39 compute-1 unix_chkpwd[251047]: password check failed for user (root)
Sep 30 21:52:39 compute-1 nova_compute[192795]: 2025-09-30 21:52:39.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:39 compute-1 nova_compute[192795]: 2025-09-30 21:52:39.728 2 INFO nova.compute.manager [None req-3bfc2386-64d9-4c39-a7f5-c44853a9ae06 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Get console output
Sep 30 21:52:39 compute-1 nova_compute[192795]: 2025-09-30 21:52:39.736 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:52:40 compute-1 podman[251048]: 2025-09-30 21:52:40.249887332 +0000 UTC m=+0.082675493 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.510 2 DEBUG nova.compute.manager [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-changed-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.511 2 DEBUG nova.compute.manager [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Refreshing instance network info cache due to event network-changed-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.512 2 DEBUG oslo_concurrency.lockutils [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.512 2 DEBUG oslo_concurrency.lockutils [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.513 2 DEBUG nova.network.neutron [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Refreshing network info cache for port ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.612 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.613 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.614 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.615 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.615 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.639 2 INFO nova.compute.manager [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Terminating instance
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.653 2 DEBUG nova.compute.manager [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:52:40 compute-1 kernel: tapae5d6d4d-3f (unregistering): left promiscuous mode
Sep 30 21:52:40 compute-1 NetworkManager[51724]: <info>  [1759269160.6802] device (tapae5d6d4d-3f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:40 compute-1 ovn_controller[94902]: 2025-09-30T21:52:40Z|00739|binding|INFO|Releasing lport ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e from this chassis (sb_readonly=0)
Sep 30 21:52:40 compute-1 ovn_controller[94902]: 2025-09-30T21:52:40Z|00740|binding|INFO|Setting lport ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e down in Southbound
Sep 30 21:52:40 compute-1 ovn_controller[94902]: 2025-09-30T21:52:40Z|00741|binding|INFO|Removing iface tapae5d6d4d-3f ovn-installed in OVS
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:40.709 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:67:d1 10.100.0.3'], port_security=['fa:16:3e:6d:67:d1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5e3582bd-9296-455b-8ecc-fd02bf833409', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ecdc4bda-6319-47ff-b06c-4aaecbba1494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76968323-ccfb-4fa7-a1e9-1797fe98366d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:40.710 103861 INFO neutron.agent.ovn.metadata.agent [-] Port ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e in datapath a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2 unbound from our chassis
Sep 30 21:52:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:40.713 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:52:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:40.715 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6c03f128-20db-4a23-b535-025f54aff5ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:40 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:40.716 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2 namespace which is not needed anymore
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:40 compute-1 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000af.scope: Deactivated successfully.
Sep 30 21:52:40 compute-1 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000af.scope: Consumed 13.639s CPU time.
Sep 30 21:52:40 compute-1 systemd-machined[152783]: Machine qemu-81-instance-000000af terminated.
Sep 30 21:52:40 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [NOTICE]   (250651) : haproxy version is 2.8.14-c23fe91
Sep 30 21:52:40 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [NOTICE]   (250651) : path to executable is /usr/sbin/haproxy
Sep 30 21:52:40 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [WARNING]  (250651) : Exiting Master process...
Sep 30 21:52:40 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [WARNING]  (250651) : Exiting Master process...
Sep 30 21:52:40 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [ALERT]    (250651) : Current worker (250653) exited with code 143 (Terminated)
Sep 30 21:52:40 compute-1 neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2[250647]: [WARNING]  (250651) : All workers exited. Exiting... (0)
Sep 30 21:52:40 compute-1 systemd[1]: libpod-92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0.scope: Deactivated successfully.
Sep 30 21:52:40 compute-1 podman[251091]: 2025-09-30 21:52:40.892174731 +0000 UTC m=+0.045777786 container died 92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:40 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0-userdata-shm.mount: Deactivated successfully.
Sep 30 21:52:40 compute-1 systemd[1]: var-lib-containers-storage-overlay-61c57b028eee152626e8354c31002b857ed95743cc675be12e9fd2e3f75c392f-merged.mount: Deactivated successfully.
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.936 2 INFO nova.virt.libvirt.driver [-] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Instance destroyed successfully.
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.937 2 DEBUG nova.objects.instance [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid 5e3582bd-9296-455b-8ecc-fd02bf833409 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:40 compute-1 podman[251091]: 2025-09-30 21:52:40.941846642 +0000 UTC m=+0.095449697 container cleanup 92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:52:40 compute-1 systemd[1]: libpod-conmon-92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0.scope: Deactivated successfully.
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.955 2 DEBUG nova.virt.libvirt.vif [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1538252087',display_name='tempest-TestNetworkAdvancedServerOps-server-1538252087',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1538252087',id=175,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK64bE32ze2bid9dTwEhXemRPMHSKtSVUl9lfzt4XAJoS5dSKgAmI3qPCfSTAYLYKMKvsuZ8HypOczDqbqsWuJ67Tb4znaS44MoU7MaaI4ew894i8cIyDlKPTNVSQKAcBg==',key_name='tempest-TestNetworkAdvancedServerOps-471135388',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:52:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-a5qtxdyq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:52:26Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=5e3582bd-9296-455b-8ecc-fd02bf833409,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.955 2 DEBUG nova.network.os_vif_util [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.956 2 DEBUG nova.network.os_vif_util [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:67:d1,bridge_name='br-int',has_traffic_filtering=True,id=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e,network=Network(a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae5d6d4d-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.956 2 DEBUG os_vif [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:67:d1,bridge_name='br-int',has_traffic_filtering=True,id=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e,network=Network(a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae5d6d4d-3f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.958 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae5d6d4d-3f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.967 2 INFO os_vif [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:67:d1,bridge_name='br-int',has_traffic_filtering=True,id=ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e,network=Network(a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae5d6d4d-3f')
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.968 2 INFO nova.virt.libvirt.driver [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Deleting instance files /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409_del
Sep 30 21:52:40 compute-1 nova_compute[192795]: 2025-09-30 21:52:40.976 2 INFO nova.virt.libvirt.driver [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Deletion of /var/lib/nova/instances/5e3582bd-9296-455b-8ecc-fd02bf833409_del complete
Sep 30 21:52:41 compute-1 sshd-session[251044]: Failed password for root from 8.210.178.40 port 48828 ssh2
Sep 30 21:52:41 compute-1 podman[251133]: 2025-09-30 21:52:41.02288376 +0000 UTC m=+0.048608623 container remove 92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.033 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[139fa98b-62bf-48a9-899e-bcc3a59a1854]: (4, ('Tue Sep 30 09:52:40 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2 (92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0)\n92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0\nTue Sep 30 09:52:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2 (92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0)\n92e4d5e4efda80e6daf887dd4b88714b51c1991aabe96cd033a9a1ddd8ac19c0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.036 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c0799bd0-f346-4425-866f-c52922d96209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.037 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1db7ad3-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:41 compute-1 nova_compute[192795]: 2025-09-30 21:52:41.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:41 compute-1 kernel: tapa1db7ad3-40: left promiscuous mode
Sep 30 21:52:41 compute-1 nova_compute[192795]: 2025-09-30 21:52:41.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.076 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a418066d-e6e6-4f9a-bea4-1a902c029f09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:41 compute-1 nova_compute[192795]: 2025-09-30 21:52:41.081 2 INFO nova.compute.manager [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:52:41 compute-1 nova_compute[192795]: 2025-09-30 21:52:41.081 2 DEBUG oslo.service.loopingcall [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:52:41 compute-1 nova_compute[192795]: 2025-09-30 21:52:41.081 2 DEBUG nova.compute.manager [-] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:52:41 compute-1 nova_compute[192795]: 2025-09-30 21:52:41.081 2 DEBUG nova.network.neutron [-] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.108 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[b82b1e0b-e37b-4a6b-9196-ef430a74182d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.111 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4c66c647-44f5-4022-87bb-ae88e1973119]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.130 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ca5a12-948c-4141-a3a1-a081ca9427af]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583640, 'reachable_time': 23198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251147, 'error': None, 'target': 'ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:41 compute-1 systemd[1]: run-netns-ovnmeta\x2da1db7ad3\x2d4f01\x2d4560\x2da5a0\x2dc12dd1e80fb2.mount: Deactivated successfully.
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.136 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:52:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:52:41.136 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab521ba-c5e5-452c-9af2-b41cf5da6ee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:42 compute-1 ovn_controller[94902]: 2025-09-30T21:52:42Z|00742|binding|INFO|Releasing lport f1ad069a-d0ee-45c6-8d17-4565ef1a2c70 from this chassis (sb_readonly=0)
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:42 compute-1 unix_chkpwd[251158]: password check failed for user (root)
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.546 2 DEBUG nova.network.neutron [-] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.572 2 INFO nova.compute.manager [-] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Took 1.49 seconds to deallocate network for instance.
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.618 2 DEBUG nova.compute.manager [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-unplugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.618 2 DEBUG oslo_concurrency.lockutils [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.619 2 DEBUG oslo_concurrency.lockutils [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.619 2 DEBUG oslo_concurrency.lockutils [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.620 2 DEBUG nova.compute.manager [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] No waiting events found dispatching network-vif-unplugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.620 2 DEBUG nova.compute.manager [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-unplugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.621 2 DEBUG nova.compute.manager [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.621 2 DEBUG oslo_concurrency.lockutils [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.621 2 DEBUG oslo_concurrency.lockutils [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.622 2 DEBUG oslo_concurrency.lockutils [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.622 2 DEBUG nova.compute.manager [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] No waiting events found dispatching network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.623 2 WARNING nova.compute.manager [req-4d295e9f-44d5-49cd-aaef-5c7c4fa66e69 req-e4bb0e7f-2093-446b-8aac-0d723684d67a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received unexpected event network-vif-plugged-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e for instance with vm_state active and task_state deleting.
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.679 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.679 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.685 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.726 2 INFO nova.scheduler.client.report [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocations for instance 5e3582bd-9296-455b-8ecc-fd02bf833409
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.802 2 DEBUG nova.network.neutron [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updated VIF entry in instance network info cache for port ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.803 2 DEBUG nova.network.neutron [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Updating instance_info_cache with network_info: [{"id": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "address": "fa:16:3e:6d:67:d1", "network": {"id": "a1db7ad3-4f01-4560-a5a0-c12dd1e80fb2", "bridge": "br-int", "label": "tempest-network-smoke--695170609", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae5d6d4d-3f", "ovs_interfaceid": "ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.804 2 DEBUG oslo_concurrency.lockutils [None req-959f8102-5098-45af-8908-76342ab504fa 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "5e3582bd-9296-455b-8ecc-fd02bf833409" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:42 compute-1 nova_compute[192795]: 2025-09-30 21:52:42.829 2 DEBUG oslo_concurrency.lockutils [req-dec74165-a959-451e-88e2-40402dc98461 req-4d10acc0-e7f5-475b-836d-01e74c8d361a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5e3582bd-9296-455b-8ecc-fd02bf833409" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.031 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'name': 'tempest-TestServerBasicOps-server-70187108', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b2', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'hostId': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.035 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 014e9b2f-936f-480e-a320-2d20f6fa98ce / tap8806e5a6-34 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.035 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97a23d3b-b99c-4810-8d63-3cc54d153858', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.032573', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb5f2280-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': 'ede923f451ed0dcc2acb6541225527a761c6425ddba92ca3567451e9af4d3f3a'}]}, 'timestamp': '2025-09-30 21:52:44.036407', '_unique_id': '36aaf77879ff4e63be2714f9c60bc56b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.039 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bd53682-a930-4dc8-94c8-9aa1a4de120b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.039035', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb5f9ac6-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': 'a676f8a990c9d6a69a740055a34884ddfa424437853168b5258b76bdb7ce7648'}]}, 'timestamp': '2025-09-30 21:52:44.039412', '_unique_id': 'd2147d66d431445f9722410435d8bf8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.040 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.058 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.write.bytes volume: 72773632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.059 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa29153a-bd52-41fd-a1aa-d6df9f780392', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72773632, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.041257', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb62b38c-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': 'e2c40fd4dd7060e80f0a03227d82ca4b75daffedb8b37242df2c31a7ad2ecdba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.041257', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb62c1a6-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '892deb08cd6bddc4f2d5d5136fa9f8591a4e6484f3a256db733c4a19fae03321'}]}, 'timestamp': '2025-09-30 21:52:44.060035', '_unique_id': '157674007426435dbb81a4f4ff289016'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.061 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.062 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.062 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>]
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.073 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.074 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55502d90-c028-4911-a9fa-47ac17b08c74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.063043', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb64dde2-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.795834948, 'message_signature': 'd30e31100e7d70dc2127275b443c429a8a14cb4e5b16de18bbae8c9b71773a4e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.063043', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb64f200-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.795834948, 'message_signature': '1cf67844acc7b61d88fdf44eb03b6b9f56a6f6d7a729030877a8621b328524a7'}]}, 'timestamp': '2025-09-30 21:52:44.074419', '_unique_id': '33d182b542d74a3b8151d6330da096c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.075 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9159b44b-61f8-4a94-ad63-1d3e28bbd633', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.077107', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb6568fc-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': '336ffc2998d602ae980b013868dfe18c4661c94c41bb2a8ff516346fcf7c9c3b'}]}, 'timestamp': '2025-09-30 21:52:44.077438', '_unique_id': '6586a8516dee4e75bfa487fa0b1dede3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.077 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.079 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.read.latency volume: 638804040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.079 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.read.latency volume: 64018710 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0258bed6-5aff-488a-a665-5061f538bace', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 638804040, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.079121', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb65b6ea-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '8126a9b2cde9d06b561a4a9957128c3499b0aaa7ea50f77a3c8eb6a503a6947f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 64018710, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.079121', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb65c1f8-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '424981d83badf9f7ed5a3e2d98bde095540dd17191e3397ec1ef34504afb8e6c'}]}, 'timestamp': '2025-09-30 21:52:44.079664', '_unique_id': '886387a3123c499d8ba92e5664fa31bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.080 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.081 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.read.bytes volume: 29399552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.081 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a87486a-8cb2-4781-901b-a412dcdd457b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29399552, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.081406', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb66102c-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '4d3ea93a3ee5d898e6ba31130f88d5787e39737e9e6eef4ea757771cf829776f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.081406', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb6619dc-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '5662de588b4db5a64475778622dc8511852f565f7bb0b1e14b1de432287620e7'}]}, 'timestamp': '2025-09-30 21:52:44.081912', '_unique_id': 'a896c5507db74097869fc6cc2c63f1b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.082 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.083 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.incoming.bytes volume: 1218 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '934e713c-d698-4e6c-a56f-95e1adf05fd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1218, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.083921', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb6677ce-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': 'c3a5be0cbe82180506a53e31f7847ad7b05169b4331e9668c1aa1898cc995b07'}]}, 'timestamp': '2025-09-30 21:52:44.084501', '_unique_id': '7225b4e8e6c8491ca7dec03b1743db18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.085 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.086 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.read.requests volume: 1063 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76167839-6fdb-4c89-ac6e-d21cba4f37f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1063, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.086881', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb66e4de-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': 'f60bf11f30fdc646e2411e062ac6d05f91f2b62789586c09c97108da781b6f22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.086881', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb66ed44-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '0253ea643eb066c399ebd7af15c151ef69c6343dd30024c6c3f4738d44bb43fd'}]}, 'timestamp': '2025-09-30 21:52:44.087341', '_unique_id': '0d71e6a6d6c147a989bfdbe0523b3ca6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.087 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.088 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.088 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>]
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.088 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53285d7a-08ec-44c1-9c88-739a4c99a770', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.088838', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb673100-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': 'ffadc48ad995cce7216e7f5fc394746309d843a36fd585ba0d9c0b1fe7f4e1dd'}]}, 'timestamp': '2025-09-30 21:52:44.089069', '_unique_id': 'fd1cf63dba6a49e7b14f77de05da0247'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.089 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5285e62-c71f-458f-bb27-4a6860e32262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.090211', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb676792-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': '8d2088ad528477df50a195918191eddb04e70ecd524791f5555d98172a8df57c'}]}, 'timestamp': '2025-09-30 21:52:44.090485', '_unique_id': '3bd1d9e7624a4ba681896dd666cf6227'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.090 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.091 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.106 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/cpu volume: 12120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7df73985-e5b2-4ab1-97d8-0a781ceeb4be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12120000000, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'timestamp': '2025-09-30T21:52:44.091665', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cb69f76e-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.839416515, 'message_signature': '3c99a378ea98978bba5dd7822b839c9709b1b12b8812f69041777577909c921d'}]}, 'timestamp': '2025-09-30 21:52:44.107370', '_unique_id': 'dc336054cee6450b817a17e35ca1ec8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.108 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.109 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.109 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>]
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.109 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.109 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb1f545f-678b-4df9-bfd2-f0cf058a7b57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.109521', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb6a5948-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.795834948, 'message_signature': '81807a0a4416af74ac91248ef0f361fe65cb6a65a8d94e4e8643b43131c9a04b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.109521', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb6a6186-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.795834948, 'message_signature': 'e5dbd0e8efead0216da6b1e3c6e717a93a478fe905d6fa940c70243ed9273edc'}]}, 'timestamp': '2025-09-30 21:52:44.109961', '_unique_id': 'd2e58b8669604864be576998f426ee7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24a879b4-046e-4484-a04b-bd5a187dfda7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.111075', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb6a9548-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.795834948, 'message_signature': '75b0851da70f2fbff76f739621f54b0bb6a8b346caa347af1791f2851693b4a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.111075', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb6a9da4-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.795834948, 'message_signature': 'a4b57e8232f39f8a7040e1e371126430933ffac9fea5800463ba085b838c67cc'}]}, 'timestamp': '2025-09-30 21:52:44.111494', '_unique_id': 'f41c2b2742eb41c18544b1ff71e3e2f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.111 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.112 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.112 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c117994-b4ed-4fe9-a9a4-548837e83f57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'timestamp': '2025-09-30T21:52:44.112576', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cb6acff4-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.839416515, 'message_signature': '241a6e44fe0d6f557b79c44ff3e92a6de2b5c3a6372f0ba883467bc40a5791f8'}]}, 'timestamp': '2025-09-30 21:52:44.112795', '_unique_id': 'a3f76384e0b241eebe574aca0ac6b580'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.113 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '822341ae-98ec-40a2-82f3-1ac98ddecc29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.113819', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb6b0096-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': 'e771e1e04100c8861b838bbc9c23cebf30f2cf0af41d9452e7f1425c6b0c8857'}]}, 'timestamp': '2025-09-30 21:52:44.114040', '_unique_id': 'f0e01dd35da94464a04b93b85d9d1c39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.114 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.115 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.115 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-70187108>]
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.115 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15638e93-9b08-4a74-b519-e8b4532f2186', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.115367', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb6b3ce6-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': '9259f506b6499931e111c893b67ec9924ce1484f57d842164f62faf606cce2c5'}]}, 'timestamp': '2025-09-30 21:52:44.115587', '_unique_id': '374cd84383794eb2a12eb894b8105fbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.116 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dad28cc4-fdc7-4ffb-859f-1534c90e8def', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.116676', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb6b701c-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': 'ad5dca8f14ce328d967ea68366c1ee6be5f6b65f2934cabcc50e3b02aea96e7e'}]}, 'timestamp': '2025-09-30 21:52:44.116901', '_unique_id': '0de9ce9975a04b26b2d30bf55f57e998'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.117 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.write.requests volume: 275 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f749ed5a-9c6f-4b39-ba54-283d70361594', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 275, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.117919', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb6ba096-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '4cb0a1931487c461a8baafdd4f782cd7b4e780c9d5562f9ffe51aec253a349ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.117919', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb6ba80c-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': '0ba5dee29d18d57940cc141c54a91dd2f97ecd7afd6030d0e5ed9f0e4223d26c'}]}, 'timestamp': '2025-09-30 21:52:44.118331', '_unique_id': '867d879a32304825898acc29195b58cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.118 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.119 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afaf31a4-8fe8-4b17-b97b-4a05620debb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': 'instance-000000b2-014e9b2f-936f-480e-a320-2d20f6fa98ce-tap8806e5a6-34', 'timestamp': '2025-09-30T21:52:44.119393', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'tap8806e5a6-34', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:7f:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8806e5a6-34'}, 'message_id': 'cb6bda48-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.765339395, 'message_signature': '15b2b458656b290235e416199538408fc5aa06ac50ae637bdfbe4d55c0a30e06'}]}, 'timestamp': '2025-09-30 21:52:44.119633', '_unique_id': '6f2b970e12e8405b8460fb97e3d65ce5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.write.latency volume: 3308455762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.120 12 DEBUG ceilometer.compute.pollsters [-] 014e9b2f-936f-480e-a320-2d20f6fa98ce/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1975b76b-4e9a-493f-bab4-ceeff9a75f0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3308455762, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-vda', 'timestamp': '2025-09-30T21:52:44.120707', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb6c0d7e-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': 'be22129b2f27ddbd7e716d180a44b39a3a47e676fb912dd8adbfa7ea237ebb97'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd5901e3d1f454c3ebc6d467bb263431f', 'user_name': None, 'project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'project_name': None, 
'resource_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce-sda', 'timestamp': '2025-09-30T21:52:44.120707', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-70187108', 'name': 'instance-000000b2', 'instance_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'instance_type': 'm1.nano', 'host': '151d305776cee2e8a1ed66f1af4e9c88f2844f36ef10c8ce64209ad1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb6c14fe-9e47-11f0-984a-fa163e8033fc', 'monotonic_time': 5859.774040701, 'message_signature': 'fdb7fba5c53822bded94f7109ac5049bf0051fc876911a0239451a810fbb90b6'}]}, 'timestamp': '2025-09-30 21:52:44.121104', '_unique_id': '45ca7f213e394efbbc1d69ecf81105cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:52:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:52:44.121 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:52:44 compute-1 nova_compute[192795]: 2025-09-30 21:52:44.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:44 compute-1 ovn_controller[94902]: 2025-09-30T21:52:44Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:7f:ec 10.100.0.12
Sep 30 21:52:44 compute-1 ovn_controller[94902]: 2025-09-30T21:52:44Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:7f:ec 10.100.0.12
Sep 30 21:52:44 compute-1 sshd-session[251044]: Failed password for root from 8.210.178.40 port 48828 ssh2
Sep 30 21:52:44 compute-1 nova_compute[192795]: 2025-09-30 21:52:44.740 2 DEBUG nova.compute.manager [req-4e0f8e58-c9d4-4d18-8685-2a48d0fad8d1 req-5efb7db9-7a3e-4b7b-87fd-eb2d017e5c7f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Received event network-vif-deleted-ae5d6d4d-3fd4-4d7d-88c5-70d37de5505e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:45 compute-1 nova_compute[192795]: 2025-09-30 21:52:45.873 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269150.8719833, 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:45 compute-1 nova_compute[192795]: 2025-09-30 21:52:45.874 2 INFO nova.compute.manager [-] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] VM Stopped (Lifecycle Event)
Sep 30 21:52:45 compute-1 nova_compute[192795]: 2025-09-30 21:52:45.891 2 DEBUG nova.compute.manager [None req-1dac2562-7d7c-4e84-88e0-826334abd975 - - - - - -] [instance: 3e21cf45-01fc-43e6-a4a9-1c6b6e0fcd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:45 compute-1 unix_chkpwd[251160]: password check failed for user (root)
Sep 30 21:52:46 compute-1 nova_compute[192795]: 2025-09-30 21:52:46.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:46 compute-1 podman[251163]: 2025-09-30 21:52:46.251195242 +0000 UTC m=+0.077211615 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:52:46 compute-1 podman[251161]: 2025-09-30 21:52:46.279767283 +0000 UTC m=+0.101601683 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:46 compute-1 podman[251162]: 2025-09-30 21:52:46.310642946 +0000 UTC m=+0.136478435 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:52:47 compute-1 ovn_controller[94902]: 2025-09-30T21:52:47Z|00743|binding|INFO|Releasing lport f1ad069a-d0ee-45c6-8d17-4565ef1a2c70 from this chassis (sb_readonly=0)
Sep 30 21:52:47 compute-1 nova_compute[192795]: 2025-09-30 21:52:47.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:48 compute-1 sshd-session[251044]: Failed password for root from 8.210.178.40 port 48828 ssh2
Sep 30 21:52:49 compute-1 nova_compute[192795]: 2025-09-30 21:52:49.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:49 compute-1 unix_chkpwd[251229]: password check failed for user (root)
Sep 30 21:52:51 compute-1 nova_compute[192795]: 2025-09-30 21:52:51.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:51 compute-1 sshd-session[251044]: Failed password for root from 8.210.178.40 port 48828 ssh2
Sep 30 21:52:52 compute-1 unix_chkpwd[251230]: password check failed for user (root)
Sep 30 21:52:54 compute-1 nova_compute[192795]: 2025-09-30 21:52:54.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:54 compute-1 nova_compute[192795]: 2025-09-30 21:52:54.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:54 compute-1 sshd-session[251044]: Failed password for root from 8.210.178.40 port 48828 ssh2
Sep 30 21:52:55 compute-1 podman[251231]: 2025-09-30 21:52:55.250920356 +0000 UTC m=+0.082699183 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Sep 30 21:52:55 compute-1 sshd-session[251044]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 48828 ssh2 [preauth]
Sep 30 21:52:55 compute-1 sshd-session[251044]: Disconnecting authenticating user root 8.210.178.40 port 48828: Too many authentication failures [preauth]
Sep 30 21:52:55 compute-1 sshd-session[251044]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:55 compute-1 sshd-session[251044]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:52:55 compute-1 nova_compute[192795]: 2025-09-30 21:52:55.929 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269160.9275303, 5e3582bd-9296-455b-8ecc-fd02bf833409 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:55 compute-1 nova_compute[192795]: 2025-09-30 21:52:55.930 2 INFO nova.compute.manager [-] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] VM Stopped (Lifecycle Event)
Sep 30 21:52:55 compute-1 nova_compute[192795]: 2025-09-30 21:52:55.957 2 DEBUG nova.compute.manager [None req-1e92cca2-d7e2-4180-901c-7440d5b2a5c4 - - - - - -] [instance: 5e3582bd-9296-455b-8ecc-fd02bf833409] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:56 compute-1 nova_compute[192795]: 2025-09-30 21:52:56.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:56 compute-1 nova_compute[192795]: 2025-09-30 21:52:56.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:58 compute-1 unix_chkpwd[251255]: password check failed for user (root)
Sep 30 21:52:58 compute-1 sshd-session[251253]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:59 compute-1 nova_compute[192795]: 2025-09-30 21:52:59.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:59 compute-1 nova_compute[192795]: 2025-09-30 21:52:59.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:00 compute-1 sshd-session[251253]: Failed password for root from 8.210.178.40 port 49464 ssh2
Sep 30 21:53:00 compute-1 podman[251258]: 2025-09-30 21:53:00.233254868 +0000 UTC m=+0.059975680 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:53:00 compute-1 podman[251257]: 2025-09-30 21:53:00.254583274 +0000 UTC m=+0.078491360 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:53:00 compute-1 podman[251256]: 2025-09-30 21:53:00.283611018 +0000 UTC m=+0.108887401 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:53:00 compute-1 nova_compute[192795]: 2025-09-30 21:53:00.711 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:01 compute-1 nova_compute[192795]: 2025-09-30 21:53:01.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-1 unix_chkpwd[251315]: password check failed for user (root)
Sep 30 21:53:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:01.929 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2 2001:db8::f816:3eff:fea2:758c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fea2:758c/64', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94a95e41-6166-48de-bdf2-67ffa578edb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7661d9ad-20f6-48e0-8cdf-e095331fbd29) old=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:01.931 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7661d9ad-20f6-48e0-8cdf-e095331fbd29 in datapath 59f11ff9-50c1-45e8-ac0d-a61faf820997 updated
Sep 30 21:53:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:01.934 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59f11ff9-50c1-45e8-ac0d-a61faf820997, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:53:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:01.936 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ee1f96-77a7-4795-bb1d-b6425bd72ca4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:03 compute-1 sshd-session[251253]: Failed password for root from 8.210.178.40 port 49464 ssh2
Sep 30 21:53:03 compute-1 nova_compute[192795]: 2025-09-30 21:53:03.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:03 compute-1 nova_compute[192795]: 2025-09-30 21:53:03.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:03 compute-1 nova_compute[192795]: 2025-09-30 21:53:03.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:53:04 compute-1 nova_compute[192795]: 2025-09-30 21:53:04.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:04 compute-1 unix_chkpwd[251317]: password check failed for user (root)
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.730 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.730 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.819 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.881 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.883 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:05 compute-1 nova_compute[192795]: 2025-09-30 21:53:05.951 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.141 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.143 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5483MB free_disk=73.26757049560547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.144 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.144 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.230 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 014e9b2f-936f-480e-a320-2d20f6fa98ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.231 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.231 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.281 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.297 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.320 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:53:06 compute-1 nova_compute[192795]: 2025-09-30 21:53:06.321 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:06 compute-1 sshd-session[251253]: Failed password for root from 8.210.178.40 port 49464 ssh2
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.321 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:08.343 103970 DEBUG eventlet.wsgi.server [-] (103970) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:08.344 103970 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: Accept: */*
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: Connection: close
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: Content-Type: text/plain
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: Host: 169.254.169.254
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: User-Agent: curl/7.84.0
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: X-Forwarded-For: 10.100.0.12
Sep 30 21:53:08 compute-1 ovn_metadata_agent[103856]: X-Ovn-Network-Id: 2ea0f3f8-3465-4da1-824e-159a41073230 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.373 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.374 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.394 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:53:08 compute-1 unix_chkpwd[251325]: password check failed for user (root)
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.524 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.525 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.537 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.538 2 INFO nova.compute.claims [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.737 2 DEBUG nova.compute.provider_tree [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.756 2 DEBUG nova.scheduler.client.report [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.792 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.794 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.862 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.863 2 DEBUG nova.network.neutron [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.887 2 INFO nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:53:08 compute-1 nova_compute[192795]: 2025-09-30 21:53:08.909 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.057 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.059 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.060 2 INFO nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Creating image(s)
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.061 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.062 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.063 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.091 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.155 103970 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.156 103970 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.8115959
Sep 30 21:53:09 compute-1 haproxy-metadata-proxy-2ea0f3f8-3465-4da1-824e-159a41073230[250870]: 10.100.0.12:37394 [30/Sep/2025:21:53:08.341] listener listener/metadata 0/0/0/814/814 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.160 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.162 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.163 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.187 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.243 103970 DEBUG eventlet.wsgi.server [-] (103970) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.244 103970 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: Accept: */*
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: Connection: close
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: Content-Length: 100
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: Content-Type: application/x-www-form-urlencoded
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: Host: 169.254.169.254
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: User-Agent: curl/7.84.0
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: X-Forwarded-For: 10.100.0.12
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: X-Ovn-Network-Id: 2ea0f3f8-3465-4da1-824e-159a41073230
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.259 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.261 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.323 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.324 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.325 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.359 103970 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.360 103970 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.1157961
Sep 30 21:53:09 compute-1 haproxy-metadata-proxy-2ea0f3f8-3465-4da1-824e-159a41073230[250870]: 10.100.0.12:37404 [30/Sep/2025:21:53:09.241] listener listener/metadata 0/0/0/118/118 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.390 2 DEBUG nova.policy [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.412 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2 2001:db8:0:1:f816:3eff:fea2:758c 2001:db8::f816:3eff:fea2:758c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fea2:758c/64 2001:db8::f816:3eff:fea2:758c/64', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94a95e41-6166-48de-bdf2-67ffa578edb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7661d9ad-20f6-48e0-8cdf-e095331fbd29) old=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2 2001:db8::f816:3eff:fea2:758c'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fea2:758c/64', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.415 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7661d9ad-20f6-48e0-8cdf-e095331fbd29 in datapath 59f11ff9-50c1-45e8-ac0d-a61faf820997 updated
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.416 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.417 2 DEBUG nova.virt.disk.api [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.417 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.419 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59f11ff9-50c1-45e8-ac0d-a61faf820997, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:53:09 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:09.420 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f5084c07-7e3b-4c03-8418-f35c12dc50ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.480 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.482 2 DEBUG nova.virt.disk.api [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.483 2 DEBUG nova.objects.instance [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.495 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.496 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Ensure instance console log exists: /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.497 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.497 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:09 compute-1 nova_compute[192795]: 2025-09-30 21:53:09.498 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:10 compute-1 sshd-session[251253]: Failed password for root from 8.210.178.40 port 49464 ssh2
Sep 30 21:53:10 compute-1 nova_compute[192795]: 2025-09-30 21:53:10.655 2 DEBUG nova.network.neutron [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Successfully created port: 2f45d1af-afdb-4679-8d2f-2d010229f326 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:53:10 compute-1 nova_compute[192795]: 2025-09-30 21:53:10.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 podman[251341]: 2025-09-30 21:53:11.234734342 +0000 UTC m=+0.074354949 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.357 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.357 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.357 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.358 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.358 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.385 2 INFO nova.compute.manager [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Terminating instance
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.399 2 DEBUG nova.compute.manager [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:53:11 compute-1 kernel: tap8806e5a6-34 (unregistering): left promiscuous mode
Sep 30 21:53:11 compute-1 NetworkManager[51724]: <info>  [1759269191.4245] device (tap8806e5a6-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:53:11 compute-1 ovn_controller[94902]: 2025-09-30T21:53:11Z|00744|binding|INFO|Releasing lport 8806e5a6-3463-43ee-aef5-01483480da59 from this chassis (sb_readonly=0)
Sep 30 21:53:11 compute-1 ovn_controller[94902]: 2025-09-30T21:53:11Z|00745|binding|INFO|Setting lport 8806e5a6-3463-43ee-aef5-01483480da59 down in Southbound
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 ovn_controller[94902]: 2025-09-30T21:53:11Z|00746|binding|INFO|Removing iface tap8806e5a6-34 ovn-installed in OVS
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.447 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:7f:ec 10.100.0.12'], port_security=['fa:16:3e:32:7f:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '014e9b2f-936f-480e-a320-2d20f6fa98ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ea0f3f8-3465-4da1-824e-159a41073230', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf9b66a62c37489792c3bdff7dfdb47f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5aaeaec0-d649-4040-b726-c1b3a3e877c9 6eb38129-ddeb-4fde-9f6f-50f70e37a068', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.244'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f762d1d8-ecd8-4f73-b47e-4f506fe1de65, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=8806e5a6-3463-43ee-aef5-01483480da59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.448 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 8806e5a6-3463-43ee-aef5-01483480da59 in datapath 2ea0f3f8-3465-4da1-824e-159a41073230 unbound from our chassis
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.451 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ea0f3f8-3465-4da1-824e-159a41073230, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.452 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[491bfdcb-63fc-4b09-9b77-73336beb5145]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.453 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230 namespace which is not needed anymore
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Sep 30 21:53:11 compute-1 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b2.scope: Consumed 15.238s CPU time.
Sep 30 21:53:11 compute-1 systemd-machined[152783]: Machine qemu-82-instance-000000b2 terminated.
Sep 30 21:53:11 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [NOTICE]   (250868) : haproxy version is 2.8.14-c23fe91
Sep 30 21:53:11 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [NOTICE]   (250868) : path to executable is /usr/sbin/haproxy
Sep 30 21:53:11 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [WARNING]  (250868) : Exiting Master process...
Sep 30 21:53:11 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [WARNING]  (250868) : Exiting Master process...
Sep 30 21:53:11 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [ALERT]    (250868) : Current worker (250870) exited with code 143 (Terminated)
Sep 30 21:53:11 compute-1 neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230[250864]: [WARNING]  (250868) : All workers exited. Exiting... (0)
Sep 30 21:53:11 compute-1 systemd[1]: libpod-93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e.scope: Deactivated successfully.
Sep 30 21:53:11 compute-1 podman[251386]: 2025-09-30 21:53:11.601206975 +0000 UTC m=+0.048517030 container died 93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:53:11 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e-userdata-shm.mount: Deactivated successfully.
Sep 30 21:53:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-b7fb2ee61a0b6e72faafb0f269f5f3787cccf2dd964ab23777f44061cfad8bd2-merged.mount: Deactivated successfully.
Sep 30 21:53:11 compute-1 podman[251386]: 2025-09-30 21:53:11.66211687 +0000 UTC m=+0.109426915 container cleanup 93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:53:11 compute-1 systemd[1]: libpod-conmon-93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e.scope: Deactivated successfully.
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.677 2 INFO nova.virt.libvirt.driver [-] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Instance destroyed successfully.
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.678 2 DEBUG nova.objects.instance [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lazy-loading 'resources' on Instance uuid 014e9b2f-936f-480e-a320-2d20f6fa98ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.696 2 DEBUG nova.virt.libvirt.vif [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:52:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-70187108',display_name='tempest-TestServerBasicOps-server-70187108',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-70187108',id=178,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIyzriIScYOmBfIK6W7fMuNgPa5v7NwPylJUkiBv2Gd8HXqCDL8rRINEcrCbJXn/NhtVhKHpMMvZmHxLI0nQZPUJoIp7f9iKXMPElhWdDUN1uySAtHG0Gra6rGJdPHWsuQ==',key_name='tempest-TestServerBasicOps-1989430457',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:52:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf9b66a62c37489792c3bdff7dfdb47f',ramdisk_id='',reservation_id='r-0pneisxq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1078278889',owner_user_name='tempest-TestServerBasicOps-1078278889-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:53:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d5901e3d1f454c3ebc6d467bb263431f',uuid=014e9b2f-936f-480e-a320-2d20f6fa98ce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.697 2 DEBUG nova.network.os_vif_util [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Converting VIF {"id": "8806e5a6-3463-43ee-aef5-01483480da59", "address": "fa:16:3e:32:7f:ec", "network": {"id": "2ea0f3f8-3465-4da1-824e-159a41073230", "bridge": "br-int", "label": "tempest-TestServerBasicOps-770580009-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf9b66a62c37489792c3bdff7dfdb47f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8806e5a6-34", "ovs_interfaceid": "8806e5a6-3463-43ee-aef5-01483480da59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.697 2 DEBUG nova.network.os_vif_util [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:7f:ec,bridge_name='br-int',has_traffic_filtering=True,id=8806e5a6-3463-43ee-aef5-01483480da59,network=Network(2ea0f3f8-3465-4da1-824e-159a41073230),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8806e5a6-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.698 2 DEBUG os_vif [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:7f:ec,bridge_name='br-int',has_traffic_filtering=True,id=8806e5a6-3463-43ee-aef5-01483480da59,network=Network(2ea0f3f8-3465-4da1-824e-159a41073230),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8806e5a6-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8806e5a6-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.707 2 INFO os_vif [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:7f:ec,bridge_name='br-int',has_traffic_filtering=True,id=8806e5a6-3463-43ee-aef5-01483480da59,network=Network(2ea0f3f8-3465-4da1-824e-159a41073230),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8806e5a6-34')
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.707 2 INFO nova.virt.libvirt.driver [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Deleting instance files /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce_del
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.708 2 INFO nova.virt.libvirt.driver [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Deletion of /var/lib/nova/instances/014e9b2f-936f-480e-a320-2d20f6fa98ce_del complete
Sep 30 21:53:11 compute-1 podman[251433]: 2025-09-30 21:53:11.733446805 +0000 UTC m=+0.044933873 container remove 93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.739 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[648e90ef-1a82-4e1f-8764-db89b49ba3ec]: (4, ('Tue Sep 30 09:53:11 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230 (93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e)\n93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e\nTue Sep 30 09:53:11 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230 (93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e)\n93ddd5ba1aa34a88392635f76c32c7329704a1b9cd625836f14fd4e89fa7a43e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.740 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7510d97a-6f24-414c-ada4-3c6cd0f29ac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.741 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ea0f3f8-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 kernel: tap2ea0f3f8-30: left promiscuous mode
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.760 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[ca73305b-21be-40e6-8f7e-c46ecd3ff1ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.782 2 INFO nova.compute.manager [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.782 2 DEBUG oslo.service.loopingcall [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.783 2 DEBUG nova.compute.manager [-] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.783 2 DEBUG nova.network.neutron [-] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.792 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bcdfc7f7-9abb-43a5-af4a-af05476a0844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.794 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d0749acc-2f9c-4c06-8713-e32f7d75b591]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.814 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[970aea13-635f-4c05-ad85-970ff05e3389]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584523, 'reachable_time': 34327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251444, 'error': None, 'target': 'ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 systemd[1]: run-netns-ovnmeta\x2d2ea0f3f8\x2d3465\x2d4da1\x2d824e\x2d159a41073230.mount: Deactivated successfully.
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.818 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2ea0f3f8-3465-4da1-824e-159a41073230 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:53:11 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:11.818 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[773b6b92-8aca-40d3-b7f4-69e0f1bec67c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.890 2 DEBUG nova.network.neutron [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Successfully updated port: 2f45d1af-afdb-4679-8d2f-2d010229f326 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.909 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.909 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:11 compute-1 nova_compute[192795]: 2025-09-30 21:53:11.910 2 DEBUG nova.network.neutron [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:53:12 compute-1 unix_chkpwd[251445]: password check failed for user (root)
Sep 30 21:53:12 compute-1 nova_compute[192795]: 2025-09-30 21:53:12.032 2 DEBUG nova.compute.manager [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-changed-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:12 compute-1 nova_compute[192795]: 2025-09-30 21:53:12.033 2 DEBUG nova.compute.manager [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Refreshing instance network info cache due to event network-changed-2f45d1af-afdb-4679-8d2f-2d010229f326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:53:12 compute-1 nova_compute[192795]: 2025-09-30 21:53:12.033 2 DEBUG oslo_concurrency.lockutils [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:12 compute-1 nova_compute[192795]: 2025-09-30 21:53:12.206 2 DEBUG nova.network.neutron [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:53:12 compute-1 nova_compute[192795]: 2025-09-30 21:53:12.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.097 2 DEBUG nova.network.neutron [-] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.129 2 INFO nova.compute.manager [-] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Took 1.35 seconds to deallocate network for instance.
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.215 2 DEBUG nova.compute.manager [req-cde86a8e-e83a-4303-a419-4304174ebf42 req-69e26fcf-710b-46cd-9e29-34f798fb3eae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received event network-vif-deleted-8806e5a6-3463-43ee-aef5-01483480da59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.217 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.218 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.308 2 DEBUG nova.compute.provider_tree [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.337 2 DEBUG nova.scheduler.client.report [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.360 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.400 2 INFO nova.scheduler.client.report [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Deleted allocations for instance 014e9b2f-936f-480e-a320-2d20f6fa98ce
Sep 30 21:53:13 compute-1 nova_compute[192795]: 2025-09-30 21:53:13.484 2 DEBUG oslo_concurrency.lockutils [None req-3e172730-cd35-4d57-83a0-cfe6301692cf d5901e3d1f454c3ebc6d467bb263431f bf9b66a62c37489792c3bdff7dfdb47f - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.017 2 DEBUG nova.network.neutron [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updating instance_info_cache with network_info: [{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.042 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.042 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance network_info: |[{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.043 2 DEBUG oslo_concurrency.lockutils [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.043 2 DEBUG nova.network.neutron [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Refreshing network info cache for port 2f45d1af-afdb-4679-8d2f-2d010229f326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.045 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Start _get_guest_xml network_info=[{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.049 2 WARNING nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.053 2 DEBUG nova.virt.libvirt.host [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.054 2 DEBUG nova.virt.libvirt.host [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.057 2 DEBUG nova.virt.libvirt.host [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.057 2 DEBUG nova.virt.libvirt.host [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.058 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.058 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.059 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.059 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.059 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.059 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.059 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.059 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.060 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.060 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.060 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.060 2 DEBUG nova.virt.hardware [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.063 2 DEBUG nova.virt.libvirt.vif [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-953330524',display_name='tempest-TestNetworkAdvancedServerOps-server-953330524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-953330524',id=179,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+SiIJBSDZoHEW7uZfXRJB02bHZHfLjpolqJKedjaF3ugClQhPjnag23izoQKtE0lfmXnRRE7o//b0Imm5kElfoQEjTyJ8qEZO5UlPQ7Ig/cgoJTHb0d7qBvQ0lj454eQ==',key_name='tempest-TestNetworkAdvancedServerOps-1800378349',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-kducfc9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:53:08Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.064 2 DEBUG nova.network.os_vif_util [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.064 2 DEBUG nova.network.os_vif_util [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.065 2 DEBUG nova.objects.instance [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.079 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <uuid>39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf</uuid>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <name>instance-000000b3</name>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-953330524</nova:name>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:53:14</nova:creationTime>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         <nova:port uuid="2f45d1af-afdb-4679-8d2f-2d010229f326">
Sep 30 21:53:14 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <system>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <entry name="serial">39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf</entry>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <entry name="uuid">39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf</entry>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </system>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <os>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   </os>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <features>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   </features>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.config"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:9d:16:ec"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <target dev="tap2f45d1af-af"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/console.log" append="off"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <video>
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </video>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:53:14 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:53:14 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:53:14 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:53:14 compute-1 nova_compute[192795]: </domain>
Sep 30 21:53:14 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.079 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Preparing to wait for external event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.080 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.080 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.080 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.081 2 DEBUG nova.virt.libvirt.vif [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-953330524',display_name='tempest-TestNetworkAdvancedServerOps-server-953330524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-953330524',id=179,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+SiIJBSDZoHEW7uZfXRJB02bHZHfLjpolqJKedjaF3ugClQhPjnag23izoQKtE0lfmXnRRE7o//b0Imm5kElfoQEjTyJ8qEZO5UlPQ7Ig/cgoJTHb0d7qBvQ0lj454eQ==',key_name='tempest-TestNetworkAdvancedServerOps-1800378349',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-kducfc9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:53:08Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.081 2 DEBUG nova.network.os_vif_util [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.081 2 DEBUG nova.network.os_vif_util [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.082 2 DEBUG os_vif [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.083 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f45d1af-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.086 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2f45d1af-af, col_values=(('external_ids', {'iface-id': '2f45d1af-afdb-4679-8d2f-2d010229f326', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:16:ec', 'vm-uuid': '39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:14 compute-1 NetworkManager[51724]: <info>  [1759269194.0886] manager: (tap2f45d1af-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.093 2 INFO os_vif [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af')
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.153 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.153 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.154 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:9d:16:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.154 2 INFO nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Using config drive
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.187 2 DEBUG nova.compute.manager [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received event network-vif-unplugged-8806e5a6-3463-43ee-aef5-01483480da59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.187 2 DEBUG oslo_concurrency.lockutils [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.187 2 DEBUG oslo_concurrency.lockutils [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.188 2 DEBUG oslo_concurrency.lockutils [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.188 2 DEBUG nova.compute.manager [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] No waiting events found dispatching network-vif-unplugged-8806e5a6-3463-43ee-aef5-01483480da59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.188 2 WARNING nova.compute.manager [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received unexpected event network-vif-unplugged-8806e5a6-3463-43ee-aef5-01483480da59 for instance with vm_state deleted and task_state None.
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.188 2 DEBUG nova.compute.manager [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received event network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.188 2 DEBUG oslo_concurrency.lockutils [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.189 2 DEBUG oslo_concurrency.lockutils [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.189 2 DEBUG oslo_concurrency.lockutils [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "014e9b2f-936f-480e-a320-2d20f6fa98ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.189 2 DEBUG nova.compute.manager [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] No waiting events found dispatching network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.189 2 WARNING nova.compute.manager [req-bfe2e9c9-7ee0-4fa2-9cd4-739f244c1749 req-3b9cb428-cd03-48b8-9a1d-92530e745e15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Received unexpected event network-vif-plugged-8806e5a6-3463-43ee-aef5-01483480da59 for instance with vm_state deleted and task_state None.
Sep 30 21:53:14 compute-1 nova_compute[192795]: 2025-09-30 21:53:14.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:14 compute-1 sshd-session[251253]: Failed password for root from 8.210.178.40 port 49464 ssh2
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.069 2 INFO nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Creating config drive at /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.config
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.075 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr0_f8cfz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.203 2 DEBUG oslo_concurrency.processutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr0_f8cfz" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:15 compute-1 kernel: tap2f45d1af-af: entered promiscuous mode
Sep 30 21:53:15 compute-1 NetworkManager[51724]: <info>  [1759269195.2833] manager: (tap2f45d1af-af): new Tun device (/org/freedesktop/NetworkManager/Devices/373)
Sep 30 21:53:15 compute-1 ovn_controller[94902]: 2025-09-30T21:53:15Z|00747|binding|INFO|Claiming lport 2f45d1af-afdb-4679-8d2f-2d010229f326 for this chassis.
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-1 ovn_controller[94902]: 2025-09-30T21:53:15Z|00748|binding|INFO|2f45d1af-afdb-4679-8d2f-2d010229f326: Claiming fa:16:3e:9d:16:ec 10.100.0.12
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.293 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:16:ec 10.100.0.12'], port_security=['fa:16:3e:9d:16:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ca777df-1571-4b79-a4b9-b7fe14fe03a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8fe4991b-2c30-4eef-a733-657321747bc9, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2f45d1af-afdb-4679-8d2f-2d010229f326) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.294 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2f45d1af-afdb-4679-8d2f-2d010229f326 in datapath 267ac3a9-931c-4394-b6ef-c2c8738400dd bound to our chassis
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.296 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267ac3a9-931c-4394-b6ef-c2c8738400dd
Sep 30 21:53:15 compute-1 ovn_controller[94902]: 2025-09-30T21:53:15Z|00749|binding|INFO|Setting lport 2f45d1af-afdb-4679-8d2f-2d010229f326 ovn-installed in OVS
Sep 30 21:53:15 compute-1 ovn_controller[94902]: 2025-09-30T21:53:15Z|00750|binding|INFO|Setting lport 2f45d1af-afdb-4679-8d2f-2d010229f326 up in Southbound
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.311 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[caf3ee41-6331-4473-9088-d85d178a4e1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.312 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267ac3a9-91 in ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.313 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267ac3a9-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.313 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[56aaf9ae-b71e-4120-a592-4f955d2a6d5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.314 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7f62018b-6ac9-4594-951c-30777dd7d4f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 systemd-udevd[251468]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.328 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[bdde6c1c-b092-496b-aebf-dce66f9be8e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 systemd-machined[152783]: New machine qemu-83-instance-000000b3.
Sep 30 21:53:15 compute-1 NetworkManager[51724]: <info>  [1759269195.3414] device (tap2f45d1af-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:53:15 compute-1 NetworkManager[51724]: <info>  [1759269195.3422] device (tap2f45d1af-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:53:15 compute-1 systemd[1]: Started Virtual Machine qemu-83-instance-000000b3.
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.363 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0abff340-3219-4065-a1f5-5a5a0e2ac550]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.411 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[654228cc-79f8-4cd2-bf58-4ee8643f41a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.418 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab57efe-5d38-452f-b295-5e4be9007eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 NetworkManager[51724]: <info>  [1759269195.4204] manager: (tap267ac3a9-90): new Veth device (/org/freedesktop/NetworkManager/Devices/374)
Sep 30 21:53:15 compute-1 unix_chkpwd[251481]: password check failed for user (root)
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.460 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7040169f-34b0-4f55-9fde-5df29e74a7ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.464 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[153e1ded-89e5-41df-a8d9-ceb092277ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 NetworkManager[51724]: <info>  [1759269195.4880] device (tap267ac3a9-90): carrier: link connected
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.493 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9437fe-b031-4913-8488-cc648676a528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.516 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4f92483d-85b4-442b-bb85-c3cd800e3f0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267ac3a9-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:cc:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589116, 'reachable_time': 25840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251501, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.536 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[55430a25-ce07-47b9-8f72-bc6e99610a82]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe86:cc88'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589116, 'tstamp': 589116}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251502, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.685 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[90680b4e-46b0-428d-a9a6-8ffd00bb5e84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267ac3a9-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:cc:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589116, 'reachable_time': 25840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251510, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.745 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8277493d-0d99-4ac4-87f7-fe6db8fb1b03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.850 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5a206c05-f935-4d73-b890-3da079e91f8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.853 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267ac3a9-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.853 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.854 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267ac3a9-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-1 kernel: tap267ac3a9-90: entered promiscuous mode
Sep 30 21:53:15 compute-1 NetworkManager[51724]: <info>  [1759269195.8582] manager: (tap267ac3a9-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.862 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267ac3a9-90, col_values=(('external_ids', {'iface-id': 'f0f9539b-917b-461d-b148-6fc2f2b1f6d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-1 ovn_controller[94902]: 2025-09-30T21:53:15Z|00751|binding|INFO|Releasing lport f0f9539b-917b-461d-b148-6fc2f2b1f6d2 from this chassis (sb_readonly=0)
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.866 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267ac3a9-931c-4394-b6ef-c2c8738400dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267ac3a9-931c-4394-b6ef-c2c8738400dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.867 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1bce3d64-f2d3-4785-9adf-bbd947b2c1e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.869 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-267ac3a9-931c-4394-b6ef-c2c8738400dd
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/267ac3a9-931c-4394-b6ef-c2c8738400dd.pid.haproxy
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 267ac3a9-931c-4394-b6ef-c2c8738400dd
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:53:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:15.870 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'env', 'PROCESS_TAG=haproxy-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267ac3a9-931c-4394-b6ef-c2c8738400dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:53:15 compute-1 nova_compute[192795]: 2025-09-30 21:53:15.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.065 2 DEBUG nova.compute.manager [req-52676fef-fa47-45bc-83b9-66467f9fac06 req-5216a050-258a-4850-b401-68754089ab5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.066 2 DEBUG oslo_concurrency.lockutils [req-52676fef-fa47-45bc-83b9-66467f9fac06 req-5216a050-258a-4850-b401-68754089ab5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.066 2 DEBUG oslo_concurrency.lockutils [req-52676fef-fa47-45bc-83b9-66467f9fac06 req-5216a050-258a-4850-b401-68754089ab5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.066 2 DEBUG oslo_concurrency.lockutils [req-52676fef-fa47-45bc-83b9-66467f9fac06 req-5216a050-258a-4850-b401-68754089ab5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.066 2 DEBUG nova.compute.manager [req-52676fef-fa47-45bc-83b9-66467f9fac06 req-5216a050-258a-4850-b401-68754089ab5f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Processing event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.187 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.188 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269196.1867046, 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.188 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] VM Started (Lifecycle Event)
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.192 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.197 2 INFO nova.virt.libvirt.driver [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance spawned successfully.
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.197 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.217 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.223 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.226 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.227 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.227 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.227 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.228 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.228 2 DEBUG nova.virt.libvirt.driver [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.231 2 DEBUG nova.network.neutron [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updated VIF entry in instance network info cache for port 2f45d1af-afdb-4679-8d2f-2d010229f326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.231 2 DEBUG nova.network.neutron [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updating instance_info_cache with network_info: [{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.256 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.256 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269196.1877425, 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.256 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] VM Paused (Lifecycle Event)
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.271 2 DEBUG oslo_concurrency.lockutils [req-0a0d5ce9-a0d4-4aef-8c51-3cb47706d9bb req-ba9eb777-a1b2-421e-93d8-f494b758a540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.278 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.283 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269196.193254, 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.283 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] VM Resumed (Lifecycle Event)
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.301 2 INFO nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Took 7.24 seconds to spawn the instance on the hypervisor.
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.302 2 DEBUG nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:16 compute-1 podman[251543]: 2025-09-30 21:53:16.312959572 +0000 UTC m=+0.068661305 container create b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.317 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.323 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:53:16 compute-1 podman[251543]: 2025-09-30 21:53:16.27287635 +0000 UTC m=+0.028578183 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:53:16 compute-1 systemd[1]: Started libpod-conmon-b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002.scope.
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.389 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:53:16 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:53:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b0eb0566acb0d1f239b56d6bf9b030c67a3cabb6184749ffbacd9cf194e4324/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:53:16 compute-1 podman[251556]: 2025-09-30 21:53:16.423079245 +0000 UTC m=+0.065403417 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:53:16 compute-1 podman[251543]: 2025-09-30 21:53:16.423894047 +0000 UTC m=+0.179595790 container init b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:53:16 compute-1 podman[251558]: 2025-09-30 21:53:16.42808906 +0000 UTC m=+0.066173108 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:53:16 compute-1 podman[251543]: 2025-09-30 21:53:16.431260326 +0000 UTC m=+0.186962059 container start b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.453 2 INFO nova.compute.manager [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Took 7.97 seconds to build instance.
Sep 30 21:53:16 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[251584]: [NOTICE]   (251622) : New worker (251630) forked
Sep 30 21:53:16 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[251584]: [NOTICE]   (251622) : Loading success.
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.475 2 DEBUG oslo_concurrency.lockutils [None req-6990f255-709b-4a94-854f-78089f1c7318 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:16 compute-1 podman[251557]: 2025-09-30 21:53:16.485201291 +0000 UTC m=+0.124564073 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:53:16 compute-1 nova_compute[192795]: 2025-09-30 21:53:16.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:16 compute-1 sshd-session[251253]: Failed password for root from 8.210.178.40 port 49464 ssh2
Sep 30 21:53:17 compute-1 sshd-session[251253]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 49464 ssh2 [preauth]
Sep 30 21:53:17 compute-1 sshd-session[251253]: Disconnecting authenticating user root 8.210.178.40 port 49464: Too many authentication failures [preauth]
Sep 30 21:53:17 compute-1 sshd-session[251253]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:17 compute-1 sshd-session[251253]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:53:18 compute-1 nova_compute[192795]: 2025-09-30 21:53:18.182 2 DEBUG nova.compute.manager [req-0e110eac-89e7-496d-95ac-3503394c3a29 req-7c340ea1-aa43-4842-8df3-750ea6d9e7ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:18 compute-1 nova_compute[192795]: 2025-09-30 21:53:18.183 2 DEBUG oslo_concurrency.lockutils [req-0e110eac-89e7-496d-95ac-3503394c3a29 req-7c340ea1-aa43-4842-8df3-750ea6d9e7ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:18 compute-1 nova_compute[192795]: 2025-09-30 21:53:18.183 2 DEBUG oslo_concurrency.lockutils [req-0e110eac-89e7-496d-95ac-3503394c3a29 req-7c340ea1-aa43-4842-8df3-750ea6d9e7ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:18 compute-1 nova_compute[192795]: 2025-09-30 21:53:18.183 2 DEBUG oslo_concurrency.lockutils [req-0e110eac-89e7-496d-95ac-3503394c3a29 req-7c340ea1-aa43-4842-8df3-750ea6d9e7ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:18 compute-1 nova_compute[192795]: 2025-09-30 21:53:18.183 2 DEBUG nova.compute.manager [req-0e110eac-89e7-496d-95ac-3503394c3a29 req-7c340ea1-aa43-4842-8df3-750ea6d9e7ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] No waiting events found dispatching network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:18 compute-1 nova_compute[192795]: 2025-09-30 21:53:18.184 2 WARNING nova.compute.manager [req-0e110eac-89e7-496d-95ac-3503394c3a29 req-7c340ea1-aa43-4842-8df3-750ea6d9e7ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received unexpected event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 for instance with vm_state active and task_state None.
Sep 30 21:53:18 compute-1 unix_chkpwd[251643]: password check failed for user (root)
Sep 30 21:53:18 compute-1 sshd-session[251641]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:19 compute-1 nova_compute[192795]: 2025-09-30 21:53:19.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:19 compute-1 nova_compute[192795]: 2025-09-30 21:53:19.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:19 compute-1 nova_compute[192795]: 2025-09-30 21:53:19.997 2 DEBUG nova.compute.manager [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-changed-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:19 compute-1 nova_compute[192795]: 2025-09-30 21:53:19.998 2 DEBUG nova.compute.manager [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Refreshing instance network info cache due to event network-changed-2f45d1af-afdb-4679-8d2f-2d010229f326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:53:19 compute-1 nova_compute[192795]: 2025-09-30 21:53:19.998 2 DEBUG oslo_concurrency.lockutils [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:19 compute-1 nova_compute[192795]: 2025-09-30 21:53:19.998 2 DEBUG oslo_concurrency.lockutils [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:19 compute-1 nova_compute[192795]: 2025-09-30 21:53:19.998 2 DEBUG nova.network.neutron [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Refreshing network info cache for port 2f45d1af-afdb-4679-8d2f-2d010229f326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:53:20 compute-1 sshd-session[251641]: Failed password for root from 8.210.178.40 port 50182 ssh2
Sep 30 21:53:20 compute-1 ovn_controller[94902]: 2025-09-30T21:53:20Z|00752|binding|INFO|Releasing lport f0f9539b-917b-461d-b148-6fc2f2b1f6d2 from this chassis (sb_readonly=0)
Sep 30 21:53:20 compute-1 nova_compute[192795]: 2025-09-30 21:53:20.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:21 compute-1 nova_compute[192795]: 2025-09-30 21:53:21.295 2 DEBUG nova.network.neutron [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updated VIF entry in instance network info cache for port 2f45d1af-afdb-4679-8d2f-2d010229f326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:53:21 compute-1 nova_compute[192795]: 2025-09-30 21:53:21.301 2 DEBUG nova.network.neutron [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updating instance_info_cache with network_info: [{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:21 compute-1 nova_compute[192795]: 2025-09-30 21:53:21.327 2 DEBUG oslo_concurrency.lockutils [req-7bb39278-198f-49eb-86a3-a6d42c4b9a01 req-77190cc6-935a-4bf4-8a62-d5c7258cb9e2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:21 compute-1 nova_compute[192795]: 2025-09-30 21:53:21.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:21 compute-1 nova_compute[192795]: 2025-09-30 21:53:21.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:53:21 compute-1 nova_compute[192795]: 2025-09-30 21:53:21.719 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:53:22 compute-1 unix_chkpwd[251644]: password check failed for user (root)
Sep 30 21:53:24 compute-1 nova_compute[192795]: 2025-09-30 21:53:24.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:24 compute-1 sshd-session[251641]: Failed password for root from 8.210.178.40 port 50182 ssh2
Sep 30 21:53:24 compute-1 nova_compute[192795]: 2025-09-30 21:53:24.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:24 compute-1 ovn_controller[94902]: 2025-09-30T21:53:24Z|00753|binding|INFO|Releasing lport f0f9539b-917b-461d-b148-6fc2f2b1f6d2 from this chassis (sb_readonly=0)
Sep 30 21:53:24 compute-1 nova_compute[192795]: 2025-09-30 21:53:24.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:25 compute-1 sshd-session[251641]: Disconnecting authenticating user root 8.210.178.40 port 50182: Change of username or service not allowed: (root,ssh-connection) -> (admin,ssh-connection) [preauth]
Sep 30 21:53:25 compute-1 sshd-session[251641]: PAM 1 more authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:26 compute-1 podman[251647]: 2025-09-30 21:53:26.248730216 +0000 UTC m=+0.087802052 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:53:26 compute-1 nova_compute[192795]: 2025-09-30 21:53:26.675 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269191.6741748, 014e9b2f-936f-480e-a320-2d20f6fa98ce => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:26 compute-1 nova_compute[192795]: 2025-09-30 21:53:26.676 2 INFO nova.compute.manager [-] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] VM Stopped (Lifecycle Event)
Sep 30 21:53:26 compute-1 nova_compute[192795]: 2025-09-30 21:53:26.708 2 DEBUG nova.compute.manager [None req-9827aa30-4305-4de2-9b72-ec5e054a5ada - - - - - -] [instance: 014e9b2f-936f-480e-a320-2d20f6fa98ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:27 compute-1 sshd-session[251645]: Invalid user admin from 8.210.178.40 port 50478
Sep 30 21:53:27 compute-1 sshd-session[251645]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:27 compute-1 sshd-session[251645]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:53:28 compute-1 ovn_controller[94902]: 2025-09-30T21:53:28Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9d:16:ec 10.100.0.12
Sep 30 21:53:28 compute-1 ovn_controller[94902]: 2025-09-30T21:53:28Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:16:ec 10.100.0.12
Sep 30 21:53:29 compute-1 nova_compute[192795]: 2025-09-30 21:53:29.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:29 compute-1 nova_compute[192795]: 2025-09-30 21:53:29.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:29 compute-1 sshd-session[251645]: Failed password for invalid user admin from 8.210.178.40 port 50478 ssh2
Sep 30 21:53:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:30.091 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:30 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:30.092 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:53:30 compute-1 nova_compute[192795]: 2025-09-30 21:53:30.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:30 compute-1 sshd-session[251645]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:30 compute-1 podman[251682]: 2025-09-30 21:53:30.525613002 +0000 UTC m=+0.070857523 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:53:30 compute-1 podman[251683]: 2025-09-30 21:53:30.536817815 +0000 UTC m=+0.068817359 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:53:30 compute-1 podman[251688]: 2025-09-30 21:53:30.556124566 +0000 UTC m=+0.084690806 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:53:32 compute-1 sshd-session[251645]: Failed password for invalid user admin from 8.210.178.40 port 50478 ssh2
Sep 30 21:53:33 compute-1 nova_compute[192795]: 2025-09-30 21:53:33.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:34 compute-1 nova_compute[192795]: 2025-09-30 21:53:34.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:34 compute-1 nova_compute[192795]: 2025-09-30 21:53:34.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:35 compute-1 sshd-session[251645]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:35 compute-1 nova_compute[192795]: 2025-09-30 21:53:35.899 2 INFO nova.compute.manager [None req-7dd040e7-0ae1-4b32-a223-195860314c9a 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Get console output
Sep 30 21:53:35 compute-1 nova_compute[192795]: 2025-09-30 21:53:35.905 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:53:36 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:36.095 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:36 compute-1 nova_compute[192795]: 2025-09-30 21:53:36.521 2 DEBUG oslo_concurrency.lockutils [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:36 compute-1 nova_compute[192795]: 2025-09-30 21:53:36.522 2 DEBUG oslo_concurrency.lockutils [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:36 compute-1 nova_compute[192795]: 2025-09-30 21:53:36.522 2 DEBUG nova.compute.manager [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:36 compute-1 nova_compute[192795]: 2025-09-30 21:53:36.527 2 DEBUG nova.compute.manager [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:53:36 compute-1 nova_compute[192795]: 2025-09-30 21:53:36.527 2 DEBUG nova.objects.instance [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'flavor' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:36 compute-1 nova_compute[192795]: 2025-09-30 21:53:36.631 2 DEBUG nova.objects.instance [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'info_cache' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:36 compute-1 nova_compute[192795]: 2025-09-30 21:53:36.812 2 DEBUG nova.virt.libvirt.driver [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:53:37 compute-1 sshd-session[251645]: Failed password for invalid user admin from 8.210.178.40 port 50478 ssh2
Sep 30 21:53:38 compute-1 nova_compute[192795]: 2025-09-30 21:53:38.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:38.713 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:38.714 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:38.715 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:38 compute-1 kernel: tap2f45d1af-af (unregistering): left promiscuous mode
Sep 30 21:53:38 compute-1 NetworkManager[51724]: <info>  [1759269218.9912] device (tap2f45d1af-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:53:39 compute-1 ovn_controller[94902]: 2025-09-30T21:53:39Z|00754|binding|INFO|Releasing lport 2f45d1af-afdb-4679-8d2f-2d010229f326 from this chassis (sb_readonly=0)
Sep 30 21:53:39 compute-1 ovn_controller[94902]: 2025-09-30T21:53:39Z|00755|binding|INFO|Setting lport 2f45d1af-afdb-4679-8d2f-2d010229f326 down in Southbound
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:39 compute-1 ovn_controller[94902]: 2025-09-30T21:53:39Z|00756|binding|INFO|Removing iface tap2f45d1af-af ovn-installed in OVS
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.056 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:16:ec 10.100.0.12'], port_security=['fa:16:3e:9d:16:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ca777df-1571-4b79-a4b9-b7fe14fe03a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8fe4991b-2c30-4eef-a733-657321747bc9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2f45d1af-afdb-4679-8d2f-2d010229f326) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.057 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2f45d1af-afdb-4679-8d2f-2d010229f326 in datapath 267ac3a9-931c-4394-b6ef-c2c8738400dd unbound from our chassis
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.059 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267ac3a9-931c-4394-b6ef-c2c8738400dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.061 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3f21b569-eb45-4b63-9516-bcf640f68024]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.061 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd namespace which is not needed anymore
Sep 30 21:53:39 compute-1 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Sep 30 21:53:39 compute-1 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000b3.scope: Consumed 13.800s CPU time.
Sep 30 21:53:39 compute-1 systemd-machined[152783]: Machine qemu-83-instance-000000b3 terminated.
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:39 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[251584]: [NOTICE]   (251622) : haproxy version is 2.8.14-c23fe91
Sep 30 21:53:39 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[251584]: [NOTICE]   (251622) : path to executable is /usr/sbin/haproxy
Sep 30 21:53:39 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[251584]: [WARNING]  (251622) : Exiting Master process...
Sep 30 21:53:39 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[251584]: [ALERT]    (251622) : Current worker (251630) exited with code 143 (Terminated)
Sep 30 21:53:39 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[251584]: [WARNING]  (251622) : All workers exited. Exiting... (0)
Sep 30 21:53:39 compute-1 systemd[1]: libpod-b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002.scope: Deactivated successfully.
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:39 compute-1 podman[251771]: 2025-09-30 21:53:39.239952143 +0000 UTC m=+0.062341253 container died b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:53:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-1b0eb0566acb0d1f239b56d6bf9b030c67a3cabb6184749ffbacd9cf194e4324-merged.mount: Deactivated successfully.
Sep 30 21:53:39 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002-userdata-shm.mount: Deactivated successfully.
Sep 30 21:53:39 compute-1 podman[251771]: 2025-09-30 21:53:39.292729628 +0000 UTC m=+0.115118778 container cleanup b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:53:39 compute-1 systemd[1]: libpod-conmon-b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002.scope: Deactivated successfully.
Sep 30 21:53:39 compute-1 podman[251814]: 2025-09-30 21:53:39.387819715 +0000 UTC m=+0.062785016 container remove b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.395 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[41f4f693-f52e-4a0c-826e-800b93c8409a]: (4, ('Tue Sep 30 09:53:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd (b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002)\nb16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002\nTue Sep 30 09:53:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd (b16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002)\nb16257e67370390929f9e2e9c1e38593de5d9a8f13de93132848f9f2bcaaf002\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.398 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[23961218-bea7-44c5-bf7c-db13ab5e4010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.399 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267ac3a9-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:39 compute-1 kernel: tap267ac3a9-90: left promiscuous mode
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.427 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[20701389-62d0-4600-aafe-66d1cc2d157e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.458 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[75046cad-4714-46ac-a66d-20120dfe4174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.461 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[067848f6-f9fe-4766-8e54-2bc990af30f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.479 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[739aa8c2-8d7b-46bb-849b-76f08a96ee4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589107, 'reachable_time': 17892, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251836, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.482 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:53:39 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:39.482 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[9f669c9f-3c9d-4871-ad9b-813057574ca6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:39 compute-1 systemd[1]: run-netns-ovnmeta\x2d267ac3a9\x2d931c\x2d4394\x2db6ef\x2dc2c8738400dd.mount: Deactivated successfully.
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.831 2 INFO nova.virt.libvirt.driver [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance shutdown successfully after 3 seconds.
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.837 2 INFO nova.virt.libvirt.driver [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance destroyed successfully.
Sep 30 21:53:39 compute-1 nova_compute[192795]: 2025-09-30 21:53:39.837 2 DEBUG nova.objects.instance [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:39 compute-1 sshd-session[251645]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:40 compute-1 nova_compute[192795]: 2025-09-30 21:53:40.131 2 DEBUG nova.compute.manager [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:40 compute-1 nova_compute[192795]: 2025-09-30 21:53:40.557 2 DEBUG oslo_concurrency.lockutils [None req-6ef51705-c71d-4c60-b4a3-a627ac674636 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 4.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:41 compute-1 sshd-session[251645]: Failed password for invalid user admin from 8.210.178.40 port 50478 ssh2
Sep 30 21:53:42 compute-1 podman[251837]: 2025-09-30 21:53:42.229273161 +0000 UTC m=+0.073409022 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 21:53:42 compute-1 sshd-session[251645]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:43 compute-1 nova_compute[192795]: 2025-09-30 21:53:43.522 2 DEBUG nova.compute.manager [req-f6917599-e7e9-4ecc-972a-913d2a07632e req-259da296-8bb8-4043-aebc-e8c3d6f1aa86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-unplugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:43 compute-1 nova_compute[192795]: 2025-09-30 21:53:43.522 2 DEBUG oslo_concurrency.lockutils [req-f6917599-e7e9-4ecc-972a-913d2a07632e req-259da296-8bb8-4043-aebc-e8c3d6f1aa86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:43 compute-1 nova_compute[192795]: 2025-09-30 21:53:43.523 2 DEBUG oslo_concurrency.lockutils [req-f6917599-e7e9-4ecc-972a-913d2a07632e req-259da296-8bb8-4043-aebc-e8c3d6f1aa86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:43 compute-1 nova_compute[192795]: 2025-09-30 21:53:43.523 2 DEBUG oslo_concurrency.lockutils [req-f6917599-e7e9-4ecc-972a-913d2a07632e req-259da296-8bb8-4043-aebc-e8c3d6f1aa86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:43 compute-1 nova_compute[192795]: 2025-09-30 21:53:43.523 2 DEBUG nova.compute.manager [req-f6917599-e7e9-4ecc-972a-913d2a07632e req-259da296-8bb8-4043-aebc-e8c3d6f1aa86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] No waiting events found dispatching network-vif-unplugged-2f45d1af-afdb-4679-8d2f-2d010229f326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:43 compute-1 nova_compute[192795]: 2025-09-30 21:53:43.523 2 WARNING nova.compute.manager [req-f6917599-e7e9-4ecc-972a-913d2a07632e req-259da296-8bb8-4043-aebc-e8c3d6f1aa86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received unexpected event network-vif-unplugged-2f45d1af-afdb-4679-8d2f-2d010229f326 for instance with vm_state stopped and task_state None.
Sep 30 21:53:44 compute-1 nova_compute[192795]: 2025-09-30 21:53:44.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:44 compute-1 nova_compute[192795]: 2025-09-30 21:53:44.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:44 compute-1 nova_compute[192795]: 2025-09-30 21:53:44.266 2 INFO nova.compute.manager [None req-14c76264-2a93-4b1e-9a45-493e8aeb7512 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Get console output
Sep 30 21:53:44 compute-1 sshd-session[251645]: Failed password for invalid user admin from 8.210.178.40 port 50478 ssh2
Sep 30 21:53:44 compute-1 nova_compute[192795]: 2025-09-30 21:53:44.921 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'flavor' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:44 compute-1 sshd-session[251645]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:44 compute-1 nova_compute[192795]: 2025-09-30 21:53:44.982 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'info_cache' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.134 2 DEBUG oslo_concurrency.lockutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.134 2 DEBUG oslo_concurrency.lockutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.135 2 DEBUG nova.network.neutron [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.660 2 DEBUG nova.compute.manager [req-2017d382-1b55-476b-a573-338991c2bb69 req-df762c1c-033b-4396-b873-fc78718842d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.661 2 DEBUG oslo_concurrency.lockutils [req-2017d382-1b55-476b-a573-338991c2bb69 req-df762c1c-033b-4396-b873-fc78718842d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.661 2 DEBUG oslo_concurrency.lockutils [req-2017d382-1b55-476b-a573-338991c2bb69 req-df762c1c-033b-4396-b873-fc78718842d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.662 2 DEBUG oslo_concurrency.lockutils [req-2017d382-1b55-476b-a573-338991c2bb69 req-df762c1c-033b-4396-b873-fc78718842d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.662 2 DEBUG nova.compute.manager [req-2017d382-1b55-476b-a573-338991c2bb69 req-df762c1c-033b-4396-b873-fc78718842d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] No waiting events found dispatching network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:45 compute-1 nova_compute[192795]: 2025-09-30 21:53:45.662 2 WARNING nova.compute.manager [req-2017d382-1b55-476b-a573-338991c2bb69 req-df762c1c-033b-4396-b873-fc78718842d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received unexpected event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 for instance with vm_state stopped and task_state powering-on.
Sep 30 21:53:47 compute-1 sshd-session[251645]: Failed password for invalid user admin from 8.210.178.40 port 50478 ssh2
Sep 30 21:53:47 compute-1 sshd-session[251645]: error: maximum authentication attempts exceeded for invalid user admin from 8.210.178.40 port 50478 ssh2 [preauth]
Sep 30 21:53:47 compute-1 sshd-session[251645]: Disconnecting invalid user admin 8.210.178.40 port 50478: Too many authentication failures [preauth]
Sep 30 21:53:47 compute-1 podman[251861]: 2025-09-30 21:53:47.233224117 +0000 UTC m=+0.056858516 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:53:47 compute-1 sshd-session[251645]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:53:47 compute-1 sshd-session[251645]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:53:47 compute-1 podman[251859]: 2025-09-30 21:53:47.242056165 +0000 UTC m=+0.074992015 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:53:47 compute-1 podman[251860]: 2025-09-30 21:53:47.297910833 +0000 UTC m=+0.119440105 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Sep 30 21:53:47 compute-1 nova_compute[192795]: 2025-09-30 21:53:47.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:47 compute-1 nova_compute[192795]: 2025-09-30 21:53:47.965 2 DEBUG nova.network.neutron [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updating instance_info_cache with network_info: [{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:47 compute-1 nova_compute[192795]: 2025-09-30 21:53:47.990 2 DEBUG oslo_concurrency.lockutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.029 2 INFO nova.virt.libvirt.driver [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance destroyed successfully.
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.029 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.047 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.058 2 DEBUG nova.virt.libvirt.vif [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-953330524',display_name='tempest-TestNetworkAdvancedServerOps-server-953330524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-953330524',id=179,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+SiIJBSDZoHEW7uZfXRJB02bHZHfLjpolqJKedjaF3ugClQhPjnag23izoQKtE0lfmXnRRE7o//b0Imm5kElfoQEjTyJ8qEZO5UlPQ7Ig/cgoJTHb0d7qBvQ0lj454eQ==',key_name='tempest-TestNetworkAdvancedServerOps-1800378349',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:53:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-kducfc9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:53:40Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.059 2 DEBUG nova.network.os_vif_util [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.060 2 DEBUG nova.network.os_vif_util [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.060 2 DEBUG os_vif [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f45d1af-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.154 2 INFO os_vif [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af')
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.165 2 DEBUG nova.virt.libvirt.driver [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Start _get_guest_xml network_info=[{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.172 2 WARNING nova.virt.libvirt.driver [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.178 2 DEBUG nova.virt.libvirt.host [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.179 2 DEBUG nova.virt.libvirt.host [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.184 2 DEBUG nova.virt.libvirt.host [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.184 2 DEBUG nova.virt.libvirt.host [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.186 2 DEBUG nova.virt.libvirt.driver [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.186 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.186 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.187 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.187 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.187 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.187 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.188 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.188 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.188 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.189 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.189 2 DEBUG nova.virt.hardware [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.189 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.208 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.289 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.config --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.290 2 DEBUG oslo_concurrency.lockutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.290 2 DEBUG oslo_concurrency.lockutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.292 2 DEBUG oslo_concurrency.lockutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.293 2 DEBUG nova.virt.libvirt.vif [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-953330524',display_name='tempest-TestNetworkAdvancedServerOps-server-953330524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-953330524',id=179,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+SiIJBSDZoHEW7uZfXRJB02bHZHfLjpolqJKedjaF3ugClQhPjnag23izoQKtE0lfmXnRRE7o//b0Imm5kElfoQEjTyJ8qEZO5UlPQ7Ig/cgoJTHb0d7qBvQ0lj454eQ==',key_name='tempest-TestNetworkAdvancedServerOps-1800378349',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:53:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-kducfc9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:53:40Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.293 2 DEBUG nova.network.os_vif_util [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.294 2 DEBUG nova.network.os_vif_util [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.296 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.314 2 DEBUG nova.virt.libvirt.driver [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <uuid>39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf</uuid>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <name>instance-000000b3</name>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-953330524</nova:name>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:53:48</nova:creationTime>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         <nova:port uuid="2f45d1af-afdb-4679-8d2f-2d010229f326">
Sep 30 21:53:48 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <system>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <entry name="serial">39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf</entry>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <entry name="uuid">39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf</entry>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </system>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <os>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   </os>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <features>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   </features>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk.config"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:9d:16:ec"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <target dev="tap2f45d1af-af"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/console.log" append="off"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <video>
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </video>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <input type="keyboard" bus="usb"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:53:48 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:53:48 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:53:48 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:53:48 compute-1 nova_compute[192795]: </domain>
Sep 30 21:53:48 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.316 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.390 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.391 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.464 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.466 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.486 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.548 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.549 2 DEBUG nova.virt.disk.api [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.549 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.616 2 DEBUG oslo_concurrency.processutils [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.617 2 DEBUG nova.virt.disk.api [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.617 2 DEBUG nova.objects.instance [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.632 2 DEBUG nova.virt.libvirt.vif [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-953330524',display_name='tempest-TestNetworkAdvancedServerOps-server-953330524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-953330524',id=179,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+SiIJBSDZoHEW7uZfXRJB02bHZHfLjpolqJKedjaF3ugClQhPjnag23izoQKtE0lfmXnRRE7o//b0Imm5kElfoQEjTyJ8qEZO5UlPQ7Ig/cgoJTHb0d7qBvQ0lj454eQ==',key_name='tempest-TestNetworkAdvancedServerOps-1800378349',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:53:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-kducfc9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:53:40Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.632 2 DEBUG nova.network.os_vif_util [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.634 2 DEBUG nova.network.os_vif_util [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.635 2 DEBUG os_vif [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f45d1af-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2f45d1af-af, col_values=(('external_ids', {'iface-id': '2f45d1af-afdb-4679-8d2f-2d010229f326', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:16:ec', 'vm-uuid': '39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 NetworkManager[51724]: <info>  [1759269228.6438] manager: (tap2f45d1af-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.650 2 INFO os_vif [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af')
Sep 30 21:53:48 compute-1 kernel: tap2f45d1af-af: entered promiscuous mode
Sep 30 21:53:48 compute-1 NetworkManager[51724]: <info>  [1759269228.7540] manager: (tap2f45d1af-af): new Tun device (/org/freedesktop/NetworkManager/Devices/377)
Sep 30 21:53:48 compute-1 ovn_controller[94902]: 2025-09-30T21:53:48Z|00757|binding|INFO|Claiming lport 2f45d1af-afdb-4679-8d2f-2d010229f326 for this chassis.
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 ovn_controller[94902]: 2025-09-30T21:53:48Z|00758|binding|INFO|2f45d1af-afdb-4679-8d2f-2d010229f326: Claiming fa:16:3e:9d:16:ec 10.100.0.12
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 NetworkManager[51724]: <info>  [1759269228.7716] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 NetworkManager[51724]: <info>  [1759269228.7726] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.781 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:16:ec 10.100.0.12'], port_security=['fa:16:3e:9d:16:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9ca777df-1571-4b79-a4b9-b7fe14fe03a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8fe4991b-2c30-4eef-a733-657321747bc9, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2f45d1af-afdb-4679-8d2f-2d010229f326) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.782 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2f45d1af-afdb-4679-8d2f-2d010229f326 in datapath 267ac3a9-931c-4394-b6ef-c2c8738400dd bound to our chassis
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.784 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267ac3a9-931c-4394-b6ef-c2c8738400dd
Sep 30 21:53:48 compute-1 systemd-udevd[251961]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:53:48 compute-1 NetworkManager[51724]: <info>  [1759269228.8018] device (tap2f45d1af-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:53:48 compute-1 NetworkManager[51724]: <info>  [1759269228.8025] device (tap2f45d1af-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.803 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[281fe1cf-a1fb-4b70-a2ec-6b8a75cec248]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.804 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267ac3a9-91 in ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:53:48 compute-1 systemd-machined[152783]: New machine qemu-84-instance-000000b3.
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.806 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267ac3a9-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.806 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7f65aa68-6a76-4216-8ac8-d4f2d4eeb442]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.807 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fd406c0e-4c47-4e65-8790-232ad2894c77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.825 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[aa058ed9-dae1-4ee8-bc3a-eedb855f914f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.862 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5704a7d5-67f8-4410-989e-77123572ac97]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.899 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0a8ad6-b725-42ce-bbcb-545be5da2583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.908 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[10c3ce2e-60cd-4809-a34a-48b11d610700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 NetworkManager[51724]: <info>  [1759269228.9087] manager: (tap267ac3a9-90): new Veth device (/org/freedesktop/NetworkManager/Devices/380)
Sep 30 21:53:48 compute-1 systemd[1]: Started Virtual Machine qemu-84-instance-000000b3.
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 sshd-session[251927]: Invalid user admin from 8.210.178.40 port 51190
Sep 30 21:53:48 compute-1 sshd-session[251927]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:48 compute-1 sshd-session[251927]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 ovn_controller[94902]: 2025-09-30T21:53:48Z|00759|binding|INFO|Setting lport 2f45d1af-afdb-4679-8d2f-2d010229f326 ovn-installed in OVS
Sep 30 21:53:48 compute-1 ovn_controller[94902]: 2025-09-30T21:53:48Z|00760|binding|INFO|Setting lport 2f45d1af-afdb-4679-8d2f-2d010229f326 up in Southbound
Sep 30 21:53:48 compute-1 nova_compute[192795]: 2025-09-30 21:53:48.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.959 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[d3521255-1eec-4f67-8fea-7fec594347cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:48 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:48.963 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb3038e-6856-48ad-994e-dc3b3e7d8358]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 NetworkManager[51724]: <info>  [1759269229.0013] device (tap267ac3a9-90): carrier: link connected
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.009 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[c30ff107-ac40-4008-a903-fc6f4a21a5c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.036 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e440bd46-d9a1-415d-b7b5-ce4425cba73c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267ac3a9-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:cc:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592467, 'reachable_time': 37672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251995, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.064 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e763639c-f022-44e0-977c-5f3e35a64d49]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe86:cc88'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592467, 'tstamp': 592467}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251996, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.090 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d24e9c-a9b6-410e-adf6-74bffe942695]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267ac3a9-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:cc:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592467, 'reachable_time': 37672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251997, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.145 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4b0cba-ee6d-42da-b505-bc8800f539af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.236 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e846bca4-59a9-4102-bf76-2d1d9070fb85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.238 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267ac3a9-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.240 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.241 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267ac3a9-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:49 compute-1 NetworkManager[51724]: <info>  [1759269229.2440] manager: (tap267ac3a9-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/381)
Sep 30 21:53:49 compute-1 kernel: tap267ac3a9-90: entered promiscuous mode
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.248 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267ac3a9-90, col_values=(('external_ids', {'iface-id': 'f0f9539b-917b-461d-b148-6fc2f2b1f6d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:49 compute-1 ovn_controller[94902]: 2025-09-30T21:53:49Z|00761|binding|INFO|Releasing lport f0f9539b-917b-461d-b148-6fc2f2b1f6d2 from this chassis (sb_readonly=1)
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.252 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267ac3a9-931c-4394-b6ef-c2c8738400dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267ac3a9-931c-4394-b6ef-c2c8738400dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.254 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[59b8a61d-ed8d-45eb-b94e-57645185d44f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.254 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-267ac3a9-931c-4394-b6ef-c2c8738400dd
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/267ac3a9-931c-4394-b6ef-c2c8738400dd.pid.haproxy
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 267ac3a9-931c-4394-b6ef-c2c8738400dd
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:53:49 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:53:49.255 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'env', 'PROCESS_TAG=haproxy-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267ac3a9-931c-4394-b6ef-c2c8738400dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:49 compute-1 podman[252036]: 2025-09-30 21:53:49.680452452 +0000 UTC m=+0.073020702 container create 0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:53:49 compute-1 systemd[1]: Started libpod-conmon-0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583.scope.
Sep 30 21:53:49 compute-1 podman[252036]: 2025-09-30 21:53:49.641365497 +0000 UTC m=+0.033933827 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:53:49 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:53:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33276118c9198e78b4d5fb7e43847ee04cf2494ffb27fe41d3bfde2f206012a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.757 2 DEBUG nova.virt.libvirt.host [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Removed pending event for 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.758 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269229.7572753, 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.758 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] VM Resumed (Lifecycle Event)
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.760 2 DEBUG nova.compute.manager [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.764 2 INFO nova.virt.libvirt.driver [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance rebooted successfully.
Sep 30 21:53:49 compute-1 nova_compute[192795]: 2025-09-30 21:53:49.764 2 DEBUG nova.compute.manager [None req-ab00447f-9de2-4922-81bd-52c0cc6a6a3e 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:49 compute-1 podman[252036]: 2025-09-30 21:53:49.775076506 +0000 UTC m=+0.167644786 container init 0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:53:49 compute-1 podman[252036]: 2025-09-30 21:53:49.780678527 +0000 UTC m=+0.173246777 container start 0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:53:49 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[252052]: [NOTICE]   (252056) : New worker (252058) forked
Sep 30 21:53:49 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[252052]: [NOTICE]   (252056) : Loading success.
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.404 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.410 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.434 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] During sync_power_state the instance has a pending task (powering-on). Skip.
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.435 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269229.7584102, 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.435 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] VM Started (Lifecycle Event)
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.467 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.473 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.679 2 DEBUG nova.compute.manager [req-5697a72e-7321-458d-b76d-15394b63c279 req-bde51dbb-7e54-45d4-9f78-02c9cbb3b2f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.682 2 DEBUG oslo_concurrency.lockutils [req-5697a72e-7321-458d-b76d-15394b63c279 req-bde51dbb-7e54-45d4-9f78-02c9cbb3b2f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.682 2 DEBUG oslo_concurrency.lockutils [req-5697a72e-7321-458d-b76d-15394b63c279 req-bde51dbb-7e54-45d4-9f78-02c9cbb3b2f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.682 2 DEBUG oslo_concurrency.lockutils [req-5697a72e-7321-458d-b76d-15394b63c279 req-bde51dbb-7e54-45d4-9f78-02c9cbb3b2f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.683 2 DEBUG nova.compute.manager [req-5697a72e-7321-458d-b76d-15394b63c279 req-bde51dbb-7e54-45d4-9f78-02c9cbb3b2f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] No waiting events found dispatching network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:50 compute-1 nova_compute[192795]: 2025-09-30 21:53:50.683 2 WARNING nova.compute.manager [req-5697a72e-7321-458d-b76d-15394b63c279 req-bde51dbb-7e54-45d4-9f78-02c9cbb3b2f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received unexpected event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 for instance with vm_state active and task_state None.
Sep 30 21:53:50 compute-1 sshd-session[251927]: Failed password for invalid user admin from 8.210.178.40 port 51190 ssh2
Sep 30 21:53:51 compute-1 sshd-session[251927]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:52 compute-1 nova_compute[192795]: 2025-09-30 21:53:52.906 2 DEBUG nova.compute.manager [req-7993a8df-f3c6-4441-99c1-dc65143f0a47 req-7e2cc9c2-5ece-43ae-b5b5-2967da42cfe6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:52 compute-1 nova_compute[192795]: 2025-09-30 21:53:52.908 2 DEBUG oslo_concurrency.lockutils [req-7993a8df-f3c6-4441-99c1-dc65143f0a47 req-7e2cc9c2-5ece-43ae-b5b5-2967da42cfe6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:52 compute-1 nova_compute[192795]: 2025-09-30 21:53:52.908 2 DEBUG oslo_concurrency.lockutils [req-7993a8df-f3c6-4441-99c1-dc65143f0a47 req-7e2cc9c2-5ece-43ae-b5b5-2967da42cfe6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:52 compute-1 nova_compute[192795]: 2025-09-30 21:53:52.908 2 DEBUG oslo_concurrency.lockutils [req-7993a8df-f3c6-4441-99c1-dc65143f0a47 req-7e2cc9c2-5ece-43ae-b5b5-2967da42cfe6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:52 compute-1 nova_compute[192795]: 2025-09-30 21:53:52.908 2 DEBUG nova.compute.manager [req-7993a8df-f3c6-4441-99c1-dc65143f0a47 req-7e2cc9c2-5ece-43ae-b5b5-2967da42cfe6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] No waiting events found dispatching network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:52 compute-1 nova_compute[192795]: 2025-09-30 21:53:52.909 2 WARNING nova.compute.manager [req-7993a8df-f3c6-4441-99c1-dc65143f0a47 req-7e2cc9c2-5ece-43ae-b5b5-2967da42cfe6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received unexpected event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 for instance with vm_state active and task_state None.
Sep 30 21:53:53 compute-1 sshd-session[251927]: Failed password for invalid user admin from 8.210.178.40 port 51190 ssh2
Sep 30 21:53:53 compute-1 nova_compute[192795]: 2025-09-30 21:53:53.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:53 compute-1 sshd-session[251927]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:54 compute-1 nova_compute[192795]: 2025-09-30 21:53:54.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:55 compute-1 ovn_controller[94902]: 2025-09-30T21:53:55Z|00762|binding|INFO|Releasing lport f0f9539b-917b-461d-b148-6fc2f2b1f6d2 from this chassis (sb_readonly=0)
Sep 30 21:53:55 compute-1 nova_compute[192795]: 2025-09-30 21:53:55.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:55 compute-1 sshd-session[251927]: Failed password for invalid user admin from 8.210.178.40 port 51190 ssh2
Sep 30 21:53:56 compute-1 sshd-session[251927]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:56 compute-1 podman[252068]: 2025-09-30 21:53:56.468971162 +0000 UTC m=+0.086649681 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, 
org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:53:58 compute-1 sshd-session[251927]: Failed password for invalid user admin from 8.210.178.40 port 51190 ssh2
Sep 30 21:53:58 compute-1 nova_compute[192795]: 2025-09-30 21:53:58.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:58 compute-1 sshd-session[251927]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:59 compute-1 nova_compute[192795]: 2025-09-30 21:53:59.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:00 compute-1 sshd-session[251927]: Failed password for invalid user admin from 8.210.178.40 port 51190 ssh2
Sep 30 21:54:01 compute-1 podman[252091]: 2025-09-30 21:54:01.231326246 +0000 UTC m=+0.056570199 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:54:01 compute-1 podman[252090]: 2025-09-30 21:54:01.234131421 +0000 UTC m=+0.064021859 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:54:01 compute-1 podman[252089]: 2025-09-30 21:54:01.281372626 +0000 UTC m=+0.102862908 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64)
Sep 30 21:54:01 compute-1 sshd-session[251927]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:02 compute-1 sshd-session[251927]: Failed password for invalid user admin from 8.210.178.40 port 51190 ssh2
Sep 30 21:54:02 compute-1 nova_compute[192795]: 2025-09-30 21:54:02.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:03 compute-1 ovn_controller[94902]: 2025-09-30T21:54:03Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:16:ec 10.100.0.12
Sep 30 21:54:03 compute-1 sshd-session[251927]: error: maximum authentication attempts exceeded for invalid user admin from 8.210.178.40 port 51190 ssh2 [preauth]
Sep 30 21:54:03 compute-1 sshd-session[251927]: Disconnecting invalid user admin 8.210.178.40 port 51190: Too many authentication failures [preauth]
Sep 30 21:54:03 compute-1 sshd-session[251927]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:03 compute-1 sshd-session[251927]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:54:03 compute-1 nova_compute[192795]: 2025-09-30 21:54:03.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:04 compute-1 nova_compute[192795]: 2025-09-30 21:54:04.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:04 compute-1 nova_compute[192795]: 2025-09-30 21:54:04.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:04 compute-1 nova_compute[192795]: 2025-09-30 21:54:04.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:54:05 compute-1 sshd-session[252164]: Invalid user admin from 8.210.178.40 port 51746
Sep 30 21:54:05 compute-1 sshd-session[252164]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:05 compute-1 sshd-session[252164]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:07 compute-1 sshd-session[252164]: Failed password for invalid user admin from 8.210.178.40 port 51746 ssh2
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.740 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.741 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.741 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.741 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:54:07 compute-1 sshd-session[252164]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.823 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.906 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.907 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:07 compute-1 nova_compute[192795]: 2025-09-30 21:54:07.967 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.130 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.131 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5491MB free_disk=73.2672233581543GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.131 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.132 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.257 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.258 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.258 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.324 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.367 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.412 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.413 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:08 compute-1 nova_compute[192795]: 2025-09-30 21:54:08.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:09 compute-1 nova_compute[192795]: 2025-09-30 21:54:09.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:09 compute-1 sshd-session[252164]: Failed password for invalid user admin from 8.210.178.40 port 51746 ssh2
Sep 30 21:54:10 compute-1 sshd-session[252164]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:10 compute-1 nova_compute[192795]: 2025-09-30 21:54:10.503 2 INFO nova.compute.manager [None req-9b3bb21d-74d9-4f8f-a77c-927b36fad0ca 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Get console output
Sep 30 21:54:10 compute-1 nova_compute[192795]: 2025-09-30 21:54:10.512 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:54:12 compute-1 sshd-session[252164]: Failed password for invalid user admin from 8.210.178.40 port 51746 ssh2
Sep 30 21:54:12 compute-1 sshd-session[252164]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:12 compute-1 podman[252173]: 2025-09-30 21:54:12.932088657 +0000 UTC m=+0.088448970 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:54:13 compute-1 nova_compute[192795]: 2025-09-30 21:54:13.414 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:13 compute-1 nova_compute[192795]: 2025-09-30 21:54:13.415 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:13 compute-1 nova_compute[192795]: 2025-09-30 21:54:13.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:13 compute-1 nova_compute[192795]: 2025-09-30 21:54:13.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:14 compute-1 sshd-session[252164]: Failed password for invalid user admin from 8.210.178.40 port 51746 ssh2
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.847 2 DEBUG nova.compute.manager [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-changed-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.848 2 DEBUG nova.compute.manager [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Refreshing instance network info cache due to event network-changed-2f45d1af-afdb-4679-8d2f-2d010229f326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.848 2 DEBUG oslo_concurrency.lockutils [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.848 2 DEBUG oslo_concurrency.lockutils [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.848 2 DEBUG nova.network.neutron [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Refreshing network info cache for port 2f45d1af-afdb-4679-8d2f-2d010229f326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.947 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.948 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.948 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.948 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.948 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.963 2 INFO nova.compute.manager [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Terminating instance
Sep 30 21:54:14 compute-1 nova_compute[192795]: 2025-09-30 21:54:14.978 2 DEBUG nova.compute.manager [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:54:15 compute-1 kernel: tap2f45d1af-af (unregistering): left promiscuous mode
Sep 30 21:54:15 compute-1 NetworkManager[51724]: <info>  [1759269255.0175] device (tap2f45d1af-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:54:15 compute-1 ovn_controller[94902]: 2025-09-30T21:54:15Z|00763|binding|INFO|Releasing lport 2f45d1af-afdb-4679-8d2f-2d010229f326 from this chassis (sb_readonly=0)
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 ovn_controller[94902]: 2025-09-30T21:54:15Z|00764|binding|INFO|Setting lport 2f45d1af-afdb-4679-8d2f-2d010229f326 down in Southbound
Sep 30 21:54:15 compute-1 ovn_controller[94902]: 2025-09-30T21:54:15Z|00765|binding|INFO|Removing iface tap2f45d1af-af ovn-installed in OVS
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.041 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:16:ec 10.100.0.12'], port_security=['fa:16:3e:9d:16:ec 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9ca777df-1571-4b79-a4b9-b7fe14fe03a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8fe4991b-2c30-4eef-a733-657321747bc9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=2f45d1af-afdb-4679-8d2f-2d010229f326) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.043 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 2f45d1af-afdb-4679-8d2f-2d010229f326 in datapath 267ac3a9-931c-4394-b6ef-c2c8738400dd unbound from our chassis
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.044 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267ac3a9-931c-4394-b6ef-c2c8738400dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.046 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7f049765-015b-4b47-b4ec-040bb2a79e4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.046 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd namespace which is not needed anymore
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Sep 30 21:54:15 compute-1 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b3.scope: Consumed 14.319s CPU time.
Sep 30 21:54:15 compute-1 systemd-machined[152783]: Machine qemu-84-instance-000000b3 terminated.
Sep 30 21:54:15 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[252052]: [NOTICE]   (252056) : haproxy version is 2.8.14-c23fe91
Sep 30 21:54:15 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[252052]: [NOTICE]   (252056) : path to executable is /usr/sbin/haproxy
Sep 30 21:54:15 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[252052]: [WARNING]  (252056) : Exiting Master process...
Sep 30 21:54:15 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[252052]: [ALERT]    (252056) : Current worker (252058) exited with code 143 (Terminated)
Sep 30 21:54:15 compute-1 neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd[252052]: [WARNING]  (252056) : All workers exited. Exiting... (0)
Sep 30 21:54:15 compute-1 systemd[1]: libpod-0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583.scope: Deactivated successfully.
Sep 30 21:54:15 compute-1 podman[252220]: 2025-09-30 21:54:15.207078691 +0000 UTC m=+0.053099785 container died 0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:54:15 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583-userdata-shm.mount: Deactivated successfully.
Sep 30 21:54:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-33276118c9198e78b4d5fb7e43847ee04cf2494ffb27fe41d3bfde2f206012a2-merged.mount: Deactivated successfully.
Sep 30 21:54:15 compute-1 podman[252220]: 2025-09-30 21:54:15.258428337 +0000 UTC m=+0.104449401 container cleanup 0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.272 2 INFO nova.virt.libvirt.driver [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Instance destroyed successfully.
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.273 2 DEBUG nova.objects.instance [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:54:15 compute-1 systemd[1]: libpod-conmon-0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583.scope: Deactivated successfully.
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.294 2 DEBUG nova.virt.libvirt.vif [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-953330524',display_name='tempest-TestNetworkAdvancedServerOps-server-953330524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-953330524',id=179,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF+SiIJBSDZoHEW7uZfXRJB02bHZHfLjpolqJKedjaF3ugClQhPjnag23izoQKtE0lfmXnRRE7o//b0Imm5kElfoQEjTyJ8qEZO5UlPQ7Ig/cgoJTHb0d7qBvQ0lj454eQ==',key_name='tempest-TestNetworkAdvancedServerOps-1800378349',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:53:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-kducfc9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:53:50Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.295 2 DEBUG nova.network.os_vif_util [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.296 2 DEBUG nova.network.os_vif_util [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.296 2 DEBUG os_vif [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.299 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f45d1af-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:54:15 compute-1 sshd-session[252164]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.307 2 INFO os_vif [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:16:ec,bridge_name='br-int',has_traffic_filtering=True,id=2f45d1af-afdb-4679-8d2f-2d010229f326,network=Network(267ac3a9-931c-4394-b6ef-c2c8738400dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f45d1af-af')
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.308 2 INFO nova.virt.libvirt.driver [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Deleting instance files /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf_del
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.308 2 INFO nova.virt.libvirt.driver [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Deletion of /var/lib/nova/instances/39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf_del complete
Sep 30 21:54:15 compute-1 podman[252266]: 2025-09-30 21:54:15.33815744 +0000 UTC m=+0.054280847 container remove 0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.347 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[db7ff5b6-5bba-4c03-9881-eb87e04fde0f]: (4, ('Tue Sep 30 09:54:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd (0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583)\n0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583\nTue Sep 30 09:54:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd (0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583)\n0a27be9eb123d5ff2c15d5b8d3e809b124e4f7c9245c574a15a04246762f2583\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.349 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd0fb88-8440-4056-87b6-21ec5a1b1546]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.350 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267ac3a9-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 kernel: tap267ac3a9-90: left promiscuous mode
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.369 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4b03c480-a4a7-40b9-bc6a-00375a5598cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.403 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[03aae200-52cb-4777-89b5-acc798fa7370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.405 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fd03811b-dab6-4ab2-9f83-57fc428d7829]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.429 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb1cbec-a45a-4e93-918a-cc3c9459ca77]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592456, 'reachable_time': 38729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252283, 'error': None, 'target': 'ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.430 2 DEBUG nova.compute.manager [req-c81b1f0f-be63-4e0d-9e87-bb4cd3c5af43 req-870c2484-cf84-4698-8348-702331cee71d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-unplugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.431 2 DEBUG oslo_concurrency.lockutils [req-c81b1f0f-be63-4e0d-9e87-bb4cd3c5af43 req-870c2484-cf84-4698-8348-702331cee71d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.432 2 DEBUG oslo_concurrency.lockutils [req-c81b1f0f-be63-4e0d-9e87-bb4cd3c5af43 req-870c2484-cf84-4698-8348-702331cee71d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.432 2 DEBUG oslo_concurrency.lockutils [req-c81b1f0f-be63-4e0d-9e87-bb4cd3c5af43 req-870c2484-cf84-4698-8348-702331cee71d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.433 2 DEBUG nova.compute.manager [req-c81b1f0f-be63-4e0d-9e87-bb4cd3c5af43 req-870c2484-cf84-4698-8348-702331cee71d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] No waiting events found dispatching network-vif-unplugged-2f45d1af-afdb-4679-8d2f-2d010229f326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.434 2 DEBUG nova.compute.manager [req-c81b1f0f-be63-4e0d-9e87-bb4cd3c5af43 req-870c2484-cf84-4698-8348-702331cee71d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-unplugged-2f45d1af-afdb-4679-8d2f-2d010229f326 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.434 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267ac3a9-931c-4394-b6ef-c2c8738400dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:54:15 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:15.435 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[33b6d154-f7ad-42c4-b823-e73d47ffb75f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:15 compute-1 systemd[1]: run-netns-ovnmeta\x2d267ac3a9\x2d931c\x2d4394\x2db6ef\x2dc2c8738400dd.mount: Deactivated successfully.
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.473 2 INFO nova.compute.manager [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Took 0.49 seconds to destroy the instance on the hypervisor.
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.474 2 DEBUG oslo.service.loopingcall [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.474 2 DEBUG nova.compute.manager [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:54:15 compute-1 nova_compute[192795]: 2025-09-30 21:54:15.474 2 DEBUG nova.network.neutron [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.621 2 DEBUG nova.network.neutron [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:16.625 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:16 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:16.626 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.645 2 INFO nova.compute.manager [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Took 1.17 seconds to deallocate network for instance.
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.764 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.765 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.800 2 DEBUG nova.compute.manager [req-d04caef1-f51a-4ab6-963f-6ae103917f49 req-99421a79-14b6-4a1b-b45f-b48d4b0c4e0d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-deleted-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.835 2 DEBUG nova.compute.provider_tree [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.854 2 DEBUG nova.scheduler.client.report [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.882 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:16 compute-1 nova_compute[192795]: 2025-09-30 21:54:16.937 2 INFO nova.scheduler.client.report [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocations for instance 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.044 2 DEBUG oslo_concurrency.lockutils [None req-1429528e-9ab3-48b5-ac21-f7bc92561f68 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:17 compute-1 sshd-session[252164]: Failed password for invalid user admin from 8.210.178.40 port 51746 ssh2
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.280 2 DEBUG nova.network.neutron [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updated VIF entry in instance network info cache for port 2f45d1af-afdb-4679-8d2f-2d010229f326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.281 2 DEBUG nova.network.neutron [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Updating instance_info_cache with network_info: [{"id": "2f45d1af-afdb-4679-8d2f-2d010229f326", "address": "fa:16:3e:9d:16:ec", "network": {"id": "267ac3a9-931c-4394-b6ef-c2c8738400dd", "bridge": "br-int", "label": "tempest-network-smoke--2000714178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f45d1af-af", "ovs_interfaceid": "2f45d1af-afdb-4679-8d2f-2d010229f326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.315 2 DEBUG oslo_concurrency.lockutils [req-59fc5deb-9e7a-4522-b147-641777d8d697 req-e3f3bf50-f98e-4015-abc2-b6fb16171139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.536 2 DEBUG nova.compute.manager [req-82cabc7c-1890-4e72-b19f-bf9dc044409b req-d0072716-98c4-4637-a70a-3f1810a2db20 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.537 2 DEBUG oslo_concurrency.lockutils [req-82cabc7c-1890-4e72-b19f-bf9dc044409b req-d0072716-98c4-4637-a70a-3f1810a2db20 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.537 2 DEBUG oslo_concurrency.lockutils [req-82cabc7c-1890-4e72-b19f-bf9dc044409b req-d0072716-98c4-4637-a70a-3f1810a2db20 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.537 2 DEBUG oslo_concurrency.lockutils [req-82cabc7c-1890-4e72-b19f-bf9dc044409b req-d0072716-98c4-4637-a70a-3f1810a2db20 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.537 2 DEBUG nova.compute.manager [req-82cabc7c-1890-4e72-b19f-bf9dc044409b req-d0072716-98c4-4637-a70a-3f1810a2db20 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] No waiting events found dispatching network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:54:17 compute-1 nova_compute[192795]: 2025-09-30 21:54:17.537 2 WARNING nova.compute.manager [req-82cabc7c-1890-4e72-b19f-bf9dc044409b req-d0072716-98c4-4637-a70a-3f1810a2db20 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Received unexpected event network-vif-plugged-2f45d1af-afdb-4679-8d2f-2d010229f326 for instance with vm_state deleted and task_state None.
Sep 30 21:54:17 compute-1 sshd-session[252164]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:17 compute-1 podman[252286]: 2025-09-30 21:54:17.905983259 +0000 UTC m=+0.058553822 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:54:17 compute-1 podman[252284]: 2025-09-30 21:54:17.914480388 +0000 UTC m=+0.074007020 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 21:54:17 compute-1 podman[252285]: 2025-09-30 21:54:17.941005685 +0000 UTC m=+0.100528995 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:54:19 compute-1 nova_compute[192795]: 2025-09-30 21:54:19.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:19 compute-1 sshd-session[252164]: Failed password for invalid user admin from 8.210.178.40 port 51746 ssh2
Sep 30 21:54:20 compute-1 sshd-session[252164]: error: maximum authentication attempts exceeded for invalid user admin from 8.210.178.40 port 51746 ssh2 [preauth]
Sep 30 21:54:20 compute-1 sshd-session[252164]: Disconnecting invalid user admin 8.210.178.40 port 51746: Too many authentication failures [preauth]
Sep 30 21:54:20 compute-1 sshd-session[252164]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:20 compute-1 sshd-session[252164]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:54:20 compute-1 nova_compute[192795]: 2025-09-30 21:54:20.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:20 compute-1 nova_compute[192795]: 2025-09-30 21:54:20.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:20 compute-1 nova_compute[192795]: 2025-09-30 21:54:20.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:21.629 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:21 compute-1 sshd-session[252347]: Invalid user ubuntu from 8.210.178.40 port 52290
Sep 30 21:54:21 compute-1 sshd-session[252347]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:21 compute-1 sshd-session[252347]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:22 compute-1 nova_compute[192795]: 2025-09-30 21:54:22.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:22 compute-1 nova_compute[192795]: 2025-09-30 21:54:22.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:54:22 compute-1 nova_compute[192795]: 2025-09-30 21:54:22.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:54:23 compute-1 nova_compute[192795]: 2025-09-30 21:54:23.376 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:54:23 compute-1 sshd-session[252347]: Failed password for invalid user ubuntu from 8.210.178.40 port 52290 ssh2
Sep 30 21:54:24 compute-1 nova_compute[192795]: 2025-09-30 21:54:24.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:24 compute-1 sshd-session[252347]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:25 compute-1 nova_compute[192795]: 2025-09-30 21:54:25.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:26 compute-1 nova_compute[192795]: 2025-09-30 21:54:26.372 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:27 compute-1 sshd-session[252347]: Failed password for invalid user ubuntu from 8.210.178.40 port 52290 ssh2
Sep 30 21:54:27 compute-1 podman[252350]: 2025-09-30 21:54:27.231564219 +0000 UTC m=+0.067955175 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Sep 30 21:54:27 compute-1 sshd-session[252347]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:29 compute-1 sshd-session[252347]: Failed password for invalid user ubuntu from 8.210.178.40 port 52290 ssh2
Sep 30 21:54:29 compute-1 nova_compute[192795]: 2025-09-30 21:54:29.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:30 compute-1 nova_compute[192795]: 2025-09-30 21:54:30.266 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269255.2596095, 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:30 compute-1 nova_compute[192795]: 2025-09-30 21:54:30.266 2 INFO nova.compute.manager [-] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] VM Stopped (Lifecycle Event)
Sep 30 21:54:30 compute-1 nova_compute[192795]: 2025-09-30 21:54:30.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:30 compute-1 sshd-session[252347]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:31 compute-1 nova_compute[192795]: 2025-09-30 21:54:31.386 2 DEBUG nova.compute.manager [None req-8973bb58-afb7-4804-9627-dfde1da4cb28 - - - - - -] [instance: 39ac9e8a-e1f4-4ac6-a171-dc4a49bc7bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:32 compute-1 podman[252373]: 2025-09-30 21:54:32.235615448 +0000 UTC m=+0.065432087 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:54:32 compute-1 podman[252372]: 2025-09-30 21:54:32.237259382 +0000 UTC m=+0.075876669 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container)
Sep 30 21:54:32 compute-1 podman[252379]: 2025-09-30 21:54:32.244264441 +0000 UTC m=+0.065866619 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 21:54:32 compute-1 sshd-session[252347]: Failed password for invalid user ubuntu from 8.210.178.40 port 52290 ssh2
Sep 30 21:54:33 compute-1 sshd-session[252347]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:34 compute-1 nova_compute[192795]: 2025-09-30 21:54:34.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:35 compute-1 nova_compute[192795]: 2025-09-30 21:54:35.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:35 compute-1 sshd-session[252347]: Failed password for invalid user ubuntu from 8.210.178.40 port 52290 ssh2
Sep 30 21:54:36 compute-1 sshd-session[252347]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:38.715 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:38.716 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:38.716 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:38 compute-1 sshd-session[252347]: Failed password for invalid user ubuntu from 8.210.178.40 port 52290 ssh2
Sep 30 21:54:39 compute-1 nova_compute[192795]: 2025-09-30 21:54:39.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:39 compute-1 sshd-session[252347]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 52290 ssh2 [preauth]
Sep 30 21:54:39 compute-1 sshd-session[252347]: Disconnecting invalid user ubuntu 8.210.178.40 port 52290: Too many authentication failures [preauth]
Sep 30 21:54:39 compute-1 sshd-session[252347]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:39 compute-1 sshd-session[252347]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:54:40 compute-1 nova_compute[192795]: 2025-09-30 21:54:40.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:40 compute-1 sshd-session[252434]: Invalid user ubuntu from 8.210.178.40 port 52942
Sep 30 21:54:40 compute-1 sshd-session[252434]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:40 compute-1 sshd-session[252434]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:42 compute-1 sshd-session[252434]: Failed password for invalid user ubuntu from 8.210.178.40 port 52942 ssh2
Sep 30 21:54:43 compute-1 podman[252436]: 2025-09-30 21:54:43.24892114 +0000 UTC m=+0.082463767 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:54:43 compute-1 sshd-session[252434]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.030 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.030 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.030 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:54:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-1 nova_compute[192795]: 2025-09-30 21:54:44.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:45 compute-1 nova_compute[192795]: 2025-09-30 21:54:45.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:45 compute-1 sshd-session[252434]: Failed password for invalid user ubuntu from 8.210.178.40 port 52942 ssh2
Sep 30 21:54:46 compute-1 sshd-session[252434]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:48 compute-1 podman[252458]: 2025-09-30 21:54:48.228810116 +0000 UTC m=+0.063180106 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:54:48 compute-1 podman[252456]: 2025-09-30 21:54:48.231046026 +0000 UTC m=+0.070601756 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible)
Sep 30 21:54:48 compute-1 podman[252457]: 2025-09-30 21:54:48.281268512 +0000 UTC m=+0.116688561 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.298 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.299 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.332 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.519 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.520 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.532 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.533 2 INFO nova.compute.claims [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:54:48 compute-1 sshd-session[252434]: Failed password for invalid user ubuntu from 8.210.178.40 port 52942 ssh2
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.804 2 DEBUG nova.compute.provider_tree [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.831 2 DEBUG nova.scheduler.client.report [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.870 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:48 compute-1 nova_compute[192795]: 2025-09-30 21:54:48.870 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.132 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.133 2 DEBUG nova.network.neutron [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.176 2 INFO nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.224 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.485 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.487 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.487 2 INFO nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Creating image(s)
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.488 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "/var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.489 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "/var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.490 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "/var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.502 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.531 2 DEBUG nova.policy [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd6d7afba807d47549781e37178a01774', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c2e514e7322435988a7f3bf398623e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.572 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.573 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.574 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.584 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.646 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.648 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.702 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.703 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.704 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.794 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.796 2 DEBUG nova.virt.disk.api [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Checking if we can resize image /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.797 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.877 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.878 2 DEBUG nova.virt.disk.api [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Cannot resize image /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.878 2 DEBUG nova.objects.instance [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lazy-loading 'migration_context' on Instance uuid 70bc4ef2-d80e-49af-b20a-6240f762b81e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:54:49 compute-1 sshd-session[252434]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.896 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.896 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Ensure instance console log exists: /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.896 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.897 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:49 compute-1 nova_compute[192795]: 2025-09-30 21:54:49.897 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:50 compute-1 nova_compute[192795]: 2025-09-30 21:54:50.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:50 compute-1 nova_compute[192795]: 2025-09-30 21:54:50.936 2 DEBUG nova.network.neutron [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Successfully created port: 0b25a929-986f-4efa-bd3d-b1e23e16cc0b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:54:51 compute-1 sshd-session[252434]: Failed password for invalid user ubuntu from 8.210.178.40 port 52942 ssh2
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.179 2 DEBUG nova.network.neutron [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Successfully updated port: 0b25a929-986f-4efa-bd3d-b1e23e16cc0b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.206 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.206 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquired lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.206 2 DEBUG nova.network.neutron [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.375 2 DEBUG nova.compute.manager [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-changed-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.375 2 DEBUG nova.compute.manager [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Refreshing instance network info cache due to event network-changed-0b25a929-986f-4efa-bd3d-b1e23e16cc0b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.375 2 DEBUG oslo_concurrency.lockutils [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:52 compute-1 nova_compute[192795]: 2025-09-30 21:54:52.495 2 DEBUG nova.network.neutron [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:54:52 compute-1 sshd-session[252434]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.180 2 DEBUG nova.network.neutron [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updating instance_info_cache with network_info: [{"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.218 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Releasing lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.219 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Instance network_info: |[{"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.220 2 DEBUG oslo_concurrency.lockutils [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.221 2 DEBUG nova.network.neutron [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Refreshing network info cache for port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.224 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Start _get_guest_xml network_info=[{"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.230 2 WARNING nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.236 2 DEBUG nova.virt.libvirt.host [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.237 2 DEBUG nova.virt.libvirt.host [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.243 2 DEBUG nova.virt.libvirt.host [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.244 2 DEBUG nova.virt.libvirt.host [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.245 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.245 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.246 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.247 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.247 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.247 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.248 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.248 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.248 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.249 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.249 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.249 2 DEBUG nova.virt.hardware [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.255 2 DEBUG nova.virt.libvirt.vif [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1785764051',display_name='tempest-TestSnapshotPattern-server-1785764051',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1785764051',id=182,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKL8WwnYuJEP3XpY/ju5SXa+fZd+0s4ElE/Ammc3JO3wP15Y53TJ4QGSyyMbttI4T5Fjj/YGgDR1amj6cHQX5O4wQ/GeWnvDWjS/d7Zz3S4MDwj0ljVzMOx5HSsjDMRAA==',key_name='tempest-TestSnapshotPattern-42320000',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c2e514e7322435988a7f3bf398623e4',ramdisk_id='',reservation_id='r-cxjkvhqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1968938915',owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:54:49Z,user_data=None,user_id='d6d7afba807d47549781e37178a01774',uuid=70bc4ef2-d80e-49af-b20a-6240f762b81e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.255 2 DEBUG nova.network.os_vif_util [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converting VIF {"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.256 2 DEBUG nova.network.os_vif_util [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:33:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b25a929-986f-4efa-bd3d-b1e23e16cc0b,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b25a929-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.258 2 DEBUG nova.objects.instance [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70bc4ef2-d80e-49af-b20a-6240f762b81e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.277 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <uuid>70bc4ef2-d80e-49af-b20a-6240f762b81e</uuid>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <name>instance-000000b6</name>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <nova:name>tempest-TestSnapshotPattern-server-1785764051</nova:name>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:54:54</nova:creationTime>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:user uuid="d6d7afba807d47549781e37178a01774">tempest-TestSnapshotPattern-1968938915-project-member</nova:user>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:project uuid="2c2e514e7322435988a7f3bf398623e4">tempest-TestSnapshotPattern-1968938915</nova:project>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         <nova:port uuid="0b25a929-986f-4efa-bd3d-b1e23e16cc0b">
Sep 30 21:54:54 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <system>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <entry name="serial">70bc4ef2-d80e-49af-b20a-6240f762b81e</entry>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <entry name="uuid">70bc4ef2-d80e-49af-b20a-6240f762b81e</entry>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </system>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <os>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   </os>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <features>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   </features>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk.config"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:4c:33:fa"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <target dev="tap0b25a929-98"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/console.log" append="off"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <video>
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </video>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:54:54 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:54:54 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:54:54 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:54:54 compute-1 nova_compute[192795]: </domain>
Sep 30 21:54:54 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.278 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Preparing to wait for external event network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.278 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.279 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.279 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.280 2 DEBUG nova.virt.libvirt.vif [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1785764051',display_name='tempest-TestSnapshotPattern-server-1785764051',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1785764051',id=182,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKL8WwnYuJEP3XpY/ju5SXa+fZd+0s4ElE/Ammc3JO3wP15Y53TJ4QGSyyMbttI4T5Fjj/YGgDR1amj6cHQX5O4wQ/GeWnvDWjS/d7Zz3S4MDwj0ljVzMOx5HSsjDMRAA==',key_name='tempest-TestSnapshotPattern-42320000',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c2e514e7322435988a7f3bf398623e4',ramdisk_id='',reservation_id='r-cxjkvhqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1968938915',owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:54:49Z,user_data=None,user_id='d6d7afba807d47549781e37178a01774',uuid=70bc4ef2-d80e-49af-b20a-6240f762b81e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.280 2 DEBUG nova.network.os_vif_util [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converting VIF {"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.281 2 DEBUG nova.network.os_vif_util [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:33:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b25a929-986f-4efa-bd3d-b1e23e16cc0b,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b25a929-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.282 2 DEBUG os_vif [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:33:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b25a929-986f-4efa-bd3d-b1e23e16cc0b,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b25a929-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.284 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.285 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b25a929-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.290 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b25a929-98, col_values=(('external_ids', {'iface-id': '0b25a929-986f-4efa-bd3d-b1e23e16cc0b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4c:33:fa', 'vm-uuid': '70bc4ef2-d80e-49af-b20a-6240f762b81e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:54 compute-1 NetworkManager[51724]: <info>  [1759269294.2930] manager: (tap0b25a929-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/382)
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.302 2 INFO os_vif [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:33:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b25a929-986f-4efa-bd3d-b1e23e16cc0b,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b25a929-98')
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.390 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.390 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.391 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] No VIF found with MAC fa:16:3e:4c:33:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:54:54 compute-1 nova_compute[192795]: 2025-09-30 21:54:54.391 2 INFO nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Using config drive
Sep 30 21:54:54 compute-1 sshd-session[252434]: Failed password for invalid user ubuntu from 8.210.178.40 port 52942 ssh2
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.517 2 INFO nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Creating config drive at /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk.config
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.524 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpug60quc6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.662 2 DEBUG oslo_concurrency.processutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpug60quc6" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:55 compute-1 kernel: tap0b25a929-98: entered promiscuous mode
Sep 30 21:54:55 compute-1 NetworkManager[51724]: <info>  [1759269295.7485] manager: (tap0b25a929-98): new Tun device (/org/freedesktop/NetworkManager/Devices/383)
Sep 30 21:54:55 compute-1 ovn_controller[94902]: 2025-09-30T21:54:55Z|00766|binding|INFO|Claiming lport 0b25a929-986f-4efa-bd3d-b1e23e16cc0b for this chassis.
Sep 30 21:54:55 compute-1 ovn_controller[94902]: 2025-09-30T21:54:55Z|00767|binding|INFO|0b25a929-986f-4efa-bd3d-b1e23e16cc0b: Claiming fa:16:3e:4c:33:fa 10.100.0.5
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.764 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:33:fa 10.100.0.5'], port_security=['fa:16:3e:4c:33:fa 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '70bc4ef2-d80e-49af-b20a-6240f762b81e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36195885-e54a-4c05-b721-98be98333841', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c2e514e7322435988a7f3bf398623e4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '98fe3b07-2473-4235-becf-2f443e01bab9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d62c416-32f7-4ff1-bc52-93428ed2b707, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=0b25a929-986f-4efa-bd3d-b1e23e16cc0b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.765 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b in datapath 36195885-e54a-4c05-b721-98be98333841 bound to our chassis
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.767 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 36195885-e54a-4c05-b721-98be98333841
Sep 30 21:54:55 compute-1 systemd-udevd[252559]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.786 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fb151cdd-0b98-4b47-987a-c73497bc1712]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.788 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap36195885-e1 in ovnmeta-36195885-e54a-4c05-b721-98be98333841 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.790 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap36195885-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.791 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0d11560d-dde3-4ff5-9ff6-a2b85c712d73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.792 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3983d929-1076-4695-b754-71de95cac2f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 systemd-machined[152783]: New machine qemu-85-instance-000000b6.
Sep 30 21:54:55 compute-1 NetworkManager[51724]: <info>  [1759269295.8021] device (tap0b25a929-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:54:55 compute-1 NetworkManager[51724]: <info>  [1759269295.8034] device (tap0b25a929-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.806 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[f3cbdeff-dcd1-4a84-a165-268e301f78ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.826 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4ebf60c6-f379-43dd-8d97-3e97c6ed5027]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-1 ovn_controller[94902]: 2025-09-30T21:54:55Z|00768|binding|INFO|Setting lport 0b25a929-986f-4efa-bd3d-b1e23e16cc0b ovn-installed in OVS
Sep 30 21:54:55 compute-1 ovn_controller[94902]: 2025-09-30T21:54:55Z|00769|binding|INFO|Setting lport 0b25a929-986f-4efa-bd3d-b1e23e16cc0b up in Southbound
Sep 30 21:54:55 compute-1 systemd[1]: Started Virtual Machine qemu-85-instance-000000b6.
Sep 30 21:54:55 compute-1 nova_compute[192795]: 2025-09-30 21:54:55.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-1 sshd-session[252434]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.871 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[bdcb0d50-5107-46ee-8e72-a550309784fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.877 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb25220-2c91-4272-ace9-672d8894914c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 NetworkManager[51724]: <info>  [1759269295.8793] manager: (tap36195885-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/384)
Sep 30 21:54:55 compute-1 systemd-udevd[252563]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.908 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9d88b3-83fc-4a1e-b907-f1a818dd92ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.911 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb6a6a8-0fc5-41bf-8f83-02885ea454af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 NetworkManager[51724]: <info>  [1759269295.9370] device (tap36195885-e0): carrier: link connected
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.944 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[cb91c759-9ebb-4128-8d63-b6261b170d09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.969 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[352204b9-d2cd-486b-9568-2d2fb865417e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap36195885-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:d5:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599161, 'reachable_time': 38210, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252592, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:55 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:55.991 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[3db37915-b371-4c7c-b4e0-50d3e59a88be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:d5fa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 599161, 'tstamp': 599161}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252593, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.022 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a82b7ccb-6c4c-4004-b678-9f9ffd2847e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap36195885-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:d5:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599161, 'reachable_time': 38210, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252594, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.079 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[f29f1867-4ee8-4ef9-bae2-7ea3a6e31768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.160 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[640b8ed6-20e7-4073-9342-30ff1dd8f58b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.162 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36195885-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.162 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.163 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36195885-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-1 kernel: tap36195885-e0: entered promiscuous mode
Sep 30 21:54:56 compute-1 NetworkManager[51724]: <info>  [1759269296.1665] manager: (tap36195885-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.169 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap36195885-e0, col_values=(('external_ids', {'iface-id': '4398875c-5162-48bf-962e-225588b9b8b0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-1 ovn_controller[94902]: 2025-09-30T21:54:56Z|00770|binding|INFO|Releasing lport 4398875c-5162-48bf-962e-225588b9b8b0 from this chassis (sb_readonly=0)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.196 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/36195885-e54a-4c05-b721-98be98333841.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/36195885-e54a-4c05-b721-98be98333841.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.197 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8035faaf-6017-485f-85cc-455a04df894d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.198 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-36195885-e54a-4c05-b721-98be98333841
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/36195885-e54a-4c05-b721-98be98333841.pid.haproxy
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 36195885-e54a-4c05-b721-98be98333841
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.199 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'env', 'PROCESS_TAG=haproxy-36195885-e54a-4c05-b721-98be98333841', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/36195885-e54a-4c05-b721-98be98333841.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.362 2 DEBUG nova.compute.manager [req-27952faa-9c76-4eec-81e5-bcbe7c9c92f4 req-5b47b7c0-2f08-4252-9c90-3dd08959268c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.363 2 DEBUG oslo_concurrency.lockutils [req-27952faa-9c76-4eec-81e5-bcbe7c9c92f4 req-5b47b7c0-2f08-4252-9c90-3dd08959268c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.364 2 DEBUG oslo_concurrency.lockutils [req-27952faa-9c76-4eec-81e5-bcbe7c9c92f4 req-5b47b7c0-2f08-4252-9c90-3dd08959268c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.365 2 DEBUG oslo_concurrency.lockutils [req-27952faa-9c76-4eec-81e5-bcbe7c9c92f4 req-5b47b7c0-2f08-4252-9c90-3dd08959268c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.365 2 DEBUG nova.compute.manager [req-27952faa-9c76-4eec-81e5-bcbe7c9c92f4 req-5b47b7c0-2f08-4252-9c90-3dd08959268c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Processing event network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.544 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-1 podman[252633]: 2025-09-30 21:54:56.58188333 +0000 UTC m=+0.055582068 container create fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:54:56 compute-1 systemd[1]: Started libpod-conmon-fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c.scope.
Sep 30 21:54:56 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.644 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269296.6432817, 70bc4ef2-d80e-49af-b20a-6240f762b81e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.644 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] VM Started (Lifecycle Event)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.647 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:54:56 compute-1 podman[252633]: 2025-09-30 21:54:56.556085699 +0000 UTC m=+0.029784457 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:54:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fd30de33dcafdcdbfb5002d54f05464ce3e8ac3ecd354d42e3be2ba5148ca72/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.652 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.667 2 INFO nova.virt.libvirt.driver [-] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Instance spawned successfully.
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.668 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:54:56 compute-1 podman[252633]: 2025-09-30 21:54:56.674083235 +0000 UTC m=+0.147781993 container init fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.674 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.679 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:54:56 compute-1 podman[252633]: 2025-09-30 21:54:56.683692528 +0000 UTC m=+0.157391266 container start fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.698 2 DEBUG nova.network.neutron [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updated VIF entry in instance network info cache for port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.699 2 DEBUG nova.network.neutron [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updating instance_info_cache with network_info: [{"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:56 compute-1 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[252649]: [NOTICE]   (252653) : New worker (252655) forked
Sep 30 21:54:56 compute-1 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[252649]: [NOTICE]   (252653) : Loading success.
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.707 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.708 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269296.6435204, 70bc4ef2-d80e-49af-b20a-6240f762b81e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.708 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] VM Paused (Lifecycle Event)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.715 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.715 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.716 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.716 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.716 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.717 2 DEBUG nova.virt.libvirt.driver [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.720 2 DEBUG oslo_concurrency.lockutils [req-fbcfb342-6abe-46f4-a80e-eab22b987cec req-022ba826-1740-4036-b02b-a50c61c3340f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.727 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.731 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269296.6509817, 70bc4ef2-d80e-49af-b20a-6240f762b81e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.731 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] VM Resumed (Lifecycle Event)
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.748 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.752 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:54:56 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:54:56.769 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.777 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.807 2 INFO nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Took 7.32 seconds to spawn the instance on the hypervisor.
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.808 2 DEBUG nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.909 2 INFO nova.compute.manager [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Took 8.44 seconds to build instance.
Sep 30 21:54:56 compute-1 nova_compute[192795]: 2025-09-30 21:54:56.927 2 DEBUG oslo_concurrency.lockutils [None req-29fa29f1-a72c-4a09-958f-601951b5d2a6 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:57 compute-1 sshd-session[252434]: Failed password for invalid user ubuntu from 8.210.178.40 port 52942 ssh2
Sep 30 21:54:58 compute-1 podman[252664]: 2025-09-30 21:54:58.228781576 +0000 UTC m=+0.068070069 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:54:58 compute-1 nova_compute[192795]: 2025-09-30 21:54:58.497 2 DEBUG nova.compute.manager [req-a89572f3-d5c9-497d-a1e9-ba6f3268304b req-e98853a9-1a34-404b-a2ea-9baf4235a4b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:58 compute-1 nova_compute[192795]: 2025-09-30 21:54:58.498 2 DEBUG oslo_concurrency.lockutils [req-a89572f3-d5c9-497d-a1e9-ba6f3268304b req-e98853a9-1a34-404b-a2ea-9baf4235a4b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:58 compute-1 nova_compute[192795]: 2025-09-30 21:54:58.498 2 DEBUG oslo_concurrency.lockutils [req-a89572f3-d5c9-497d-a1e9-ba6f3268304b req-e98853a9-1a34-404b-a2ea-9baf4235a4b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:58 compute-1 nova_compute[192795]: 2025-09-30 21:54:58.498 2 DEBUG oslo_concurrency.lockutils [req-a89572f3-d5c9-497d-a1e9-ba6f3268304b req-e98853a9-1a34-404b-a2ea-9baf4235a4b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:58 compute-1 nova_compute[192795]: 2025-09-30 21:54:58.498 2 DEBUG nova.compute.manager [req-a89572f3-d5c9-497d-a1e9-ba6f3268304b req-e98853a9-1a34-404b-a2ea-9baf4235a4b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] No waiting events found dispatching network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:54:58 compute-1 nova_compute[192795]: 2025-09-30 21:54:58.499 2 WARNING nova.compute.manager [req-a89572f3-d5c9-497d-a1e9-ba6f3268304b req-e98853a9-1a34-404b-a2ea-9baf4235a4b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received unexpected event network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b for instance with vm_state active and task_state None.
Sep 30 21:54:58 compute-1 sshd-session[252434]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 52942 ssh2 [preauth]
Sep 30 21:54:58 compute-1 sshd-session[252434]: Disconnecting invalid user ubuntu 8.210.178.40 port 52942: Too many authentication failures [preauth]
Sep 30 21:54:58 compute-1 sshd-session[252434]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:58 compute-1 sshd-session[252434]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:54:59 compute-1 nova_compute[192795]: 2025-09-30 21:54:59.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:59 compute-1 nova_compute[192795]: 2025-09-30 21:54:59.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:59 compute-1 nova_compute[192795]: 2025-09-30 21:54:59.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:59 compute-1 NetworkManager[51724]: <info>  [1759269299.8204] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Sep 30 21:54:59 compute-1 NetworkManager[51724]: <info>  [1759269299.8217] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/387)
Sep 30 21:54:59 compute-1 nova_compute[192795]: 2025-09-30 21:54:59.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:59 compute-1 ovn_controller[94902]: 2025-09-30T21:54:59Z|00771|binding|INFO|Releasing lport 4398875c-5162-48bf-962e-225588b9b8b0 from this chassis (sb_readonly=0)
Sep 30 21:54:59 compute-1 nova_compute[192795]: 2025-09-30 21:54:59.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:00 compute-1 sshd-session[252684]: Invalid user ubuntu from 8.210.178.40 port 53540
Sep 30 21:55:00 compute-1 sshd-session[252684]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:00 compute-1 sshd-session[252684]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:00 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:00.772 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:01 compute-1 nova_compute[192795]: 2025-09-30 21:55:01.077 2 DEBUG nova.compute.manager [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-changed-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:01 compute-1 nova_compute[192795]: 2025-09-30 21:55:01.078 2 DEBUG nova.compute.manager [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Refreshing instance network info cache due to event network-changed-0b25a929-986f-4efa-bd3d-b1e23e16cc0b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:55:01 compute-1 nova_compute[192795]: 2025-09-30 21:55:01.078 2 DEBUG oslo_concurrency.lockutils [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:01 compute-1 nova_compute[192795]: 2025-09-30 21:55:01.078 2 DEBUG oslo_concurrency.lockutils [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:01 compute-1 nova_compute[192795]: 2025-09-30 21:55:01.079 2 DEBUG nova.network.neutron [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Refreshing network info cache for port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:55:01 compute-1 nova_compute[192795]: 2025-09-30 21:55:01.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:02 compute-1 sshd-session[252684]: Failed password for invalid user ubuntu from 8.210.178.40 port 53540 ssh2
Sep 30 21:55:02 compute-1 nova_compute[192795]: 2025-09-30 21:55:02.629 2 DEBUG nova.network.neutron [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updated VIF entry in instance network info cache for port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:55:02 compute-1 nova_compute[192795]: 2025-09-30 21:55:02.629 2 DEBUG nova.network.neutron [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updating instance_info_cache with network_info: [{"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:02 compute-1 nova_compute[192795]: 2025-09-30 21:55:02.652 2 DEBUG oslo_concurrency.lockutils [req-2586f957-3e7a-4781-9f28-1d0780982e20 req-d46fca0b-7da0-4ab1-b8ca-ad5396a43b6f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:02 compute-1 nova_compute[192795]: 2025-09-30 21:55:02.719 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:03 compute-1 sshd-session[252684]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:03 compute-1 podman[252688]: 2025-09-30 21:55:03.240772386 +0000 UTC m=+0.067894544 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible)
Sep 30 21:55:03 compute-1 podman[252689]: 2025-09-30 21:55:03.241584437 +0000 UTC m=+0.065842400 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:55:03 compute-1 podman[252690]: 2025-09-30 21:55:03.258671728 +0000 UTC m=+0.080118106 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 21:55:03 compute-1 nova_compute[192795]: 2025-09-30 21:55:03.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:04 compute-1 nova_compute[192795]: 2025-09-30 21:55:04.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:04 compute-1 nova_compute[192795]: 2025-09-30 21:55:04.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:04 compute-1 sshd-session[252684]: Failed password for invalid user ubuntu from 8.210.178.40 port 53540 ssh2
Sep 30 21:55:04 compute-1 nova_compute[192795]: 2025-09-30 21:55:04.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:04 compute-1 nova_compute[192795]: 2025-09-30 21:55:04.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:55:05 compute-1 nova_compute[192795]: 2025-09-30 21:55:05.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:05 compute-1 nova_compute[192795]: 2025-09-30 21:55:05.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:55:06 compute-1 sshd-session[252684]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:07 compute-1 nova_compute[192795]: 2025-09-30 21:55:07.743 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:08 compute-1 sshd-session[252684]: Failed password for invalid user ubuntu from 8.210.178.40 port 53540 ssh2
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.729 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.729 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.807 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.882 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.883 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:08 compute-1 nova_compute[192795]: 2025-09-30 21:55:08.943 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:09 compute-1 sshd-session[252684]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.121 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.123 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5509MB free_disk=73.269775390625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.123 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.124 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.253 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 70bc4ef2-d80e-49af-b20a-6240f762b81e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.254 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.254 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:09 compute-1 ovn_controller[94902]: 2025-09-30T21:55:09Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4c:33:fa 10.100.0.5
Sep 30 21:55:09 compute-1 ovn_controller[94902]: 2025-09-30T21:55:09Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4c:33:fa 10.100.0.5
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.468 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.480 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.497 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:55:09 compute-1 nova_compute[192795]: 2025-09-30 21:55:09.498 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:10 compute-1 sshd-session[252684]: Failed password for invalid user ubuntu from 8.210.178.40 port 53540 ssh2
Sep 30 21:55:11 compute-1 sshd-session[252684]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:12.027 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2 2001:db8::f816:3eff:fec0:3830'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec0:3830/64', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f488970f-e2de-4ce9-9091-bbb5c56f0cb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=43244ebf-4de3-435e-a1f4-6cebae949e6e) old=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:12.029 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 43244ebf-4de3-435e-a1f4-6cebae949e6e in datapath 412a19ab-c94f-46ed-9c4e-c69fc7962be3 updated
Sep 30 21:55:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:12.030 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 412a19ab-c94f-46ed-9c4e-c69fc7962be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:12.031 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[a1418865-abc5-4f63-ba84-2c05410c8b2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:13 compute-1 nova_compute[192795]: 2025-09-30 21:55:13.499 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:14 compute-1 sshd-session[252684]: Failed password for invalid user ubuntu from 8.210.178.40 port 53540 ssh2
Sep 30 21:55:14 compute-1 podman[252771]: 2025-09-30 21:55:14.222217705 +0000 UTC m=+0.061715380 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:14 compute-1 nova_compute[192795]: 2025-09-30 21:55:14.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:14 compute-1 sshd-session[252684]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:16 compute-1 sshd-session[252684]: Failed password for invalid user ubuntu from 8.210.178.40 port 53540 ssh2
Sep 30 21:55:16 compute-1 nova_compute[192795]: 2025-09-30 21:55:16.311 2 DEBUG nova.compute.manager [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:16 compute-1 sshd-session[252684]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 53540 ssh2 [preauth]
Sep 30 21:55:16 compute-1 sshd-session[252684]: Disconnecting invalid user ubuntu 8.210.178.40 port 53540: Too many authentication failures [preauth]
Sep 30 21:55:16 compute-1 sshd-session[252684]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:16 compute-1 sshd-session[252684]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:55:16 compute-1 nova_compute[192795]: 2025-09-30 21:55:16.377 2 INFO nova.compute.manager [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] instance snapshotting
Sep 30 21:55:16 compute-1 nova_compute[192795]: 2025-09-30 21:55:16.687 2 INFO nova.virt.libvirt.driver [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Beginning live snapshot process
Sep 30 21:55:16 compute-1 virtqemud[192217]: invalid argument: disk vda does not have an active block job
Sep 30 21:55:16 compute-1 nova_compute[192795]: 2025-09-30 21:55:16.920 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.017 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json -f qcow2" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.018 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.080 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.094 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.158 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.159 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpmbfue8th/99dd3307027b4b64b0ac72c9c25f4fcb.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.261 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpmbfue8th/99dd3307027b4b64b0ac72c9c25f4fcb.delta 1073741824" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.263 2 INFO nova.virt.libvirt.driver [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.334 2 DEBUG nova.virt.libvirt.guest [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:55:17 compute-1 nova_compute[192795]: 2025-09-30 21:55:17.838 2 DEBUG nova.virt.libvirt.guest [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] COPY block job progress, current cursor: 74514432 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:55:18 compute-1 sshd-session[252791]: Invalid user ubuntu from 8.210.178.40 port 54130
Sep 30 21:55:18 compute-1 sshd-session[252791]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:18 compute-1 sshd-session[252791]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:18.297 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2 2001:db8:0:1:f816:3eff:fec0:3830 2001:db8::f816:3eff:fec0:3830'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fec0:3830/64 2001:db8::f816:3eff:fec0:3830/64', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f488970f-e2de-4ce9-9091-bbb5c56f0cb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=43244ebf-4de3-435e-a1f4-6cebae949e6e) old=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2 2001:db8::f816:3eff:fec0:3830'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec0:3830/64', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:18.299 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 43244ebf-4de3-435e-a1f4-6cebae949e6e in datapath 412a19ab-c94f-46ed-9c4e-c69fc7962be3 updated
Sep 30 21:55:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:18.301 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 412a19ab-c94f-46ed-9c4e-c69fc7962be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:18 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:18.302 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[178b475d-5ffa-474b-bc56-f8fdbcb2f001]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-1 nova_compute[192795]: 2025-09-30 21:55:18.341 2 DEBUG nova.virt.libvirt.guest [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:55:18 compute-1 nova_compute[192795]: 2025-09-30 21:55:18.345 2 INFO nova.virt.libvirt.driver [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:55:18 compute-1 nova_compute[192795]: 2025-09-30 21:55:18.382 2 DEBUG nova.privsep.utils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:55:18 compute-1 nova_compute[192795]: 2025-09-30 21:55:18.383 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpmbfue8th/99dd3307027b4b64b0ac72c9c25f4fcb.delta /var/lib/nova/instances/snapshots/tmpmbfue8th/99dd3307027b4b64b0ac72c9c25f4fcb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:18 compute-1 nova_compute[192795]: 2025-09-30 21:55:18.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:18 compute-1 nova_compute[192795]: 2025-09-30 21:55:18.729 2 DEBUG oslo_concurrency.processutils [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpmbfue8th/99dd3307027b4b64b0ac72c9c25f4fcb.delta /var/lib/nova/instances/snapshots/tmpmbfue8th/99dd3307027b4b64b0ac72c9c25f4fcb" returned: 0 in 0.346s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:18 compute-1 nova_compute[192795]: 2025-09-30 21:55:18.735 2 INFO nova.virt.libvirt.driver [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Snapshot extracted, beginning image upload
Sep 30 21:55:19 compute-1 podman[252824]: 2025-09-30 21:55:19.233432827 +0000 UTC m=+0.066145439 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:55:19 compute-1 podman[252823]: 2025-09-30 21:55:19.26653073 +0000 UTC m=+0.104666084 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:55:19 compute-1 podman[252822]: 2025-09-30 21:55:19.26655377 +0000 UTC m=+0.109141433 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:55:19 compute-1 nova_compute[192795]: 2025-09-30 21:55:19.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:19 compute-1 nova_compute[192795]: 2025-09-30 21:55:19.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:20 compute-1 sshd-session[252791]: Failed password for invalid user ubuntu from 8.210.178.40 port 54130 ssh2
Sep 30 21:55:20 compute-1 sshd-session[252791]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:21 compute-1 nova_compute[192795]: 2025-09-30 21:55:21.237 2 INFO nova.virt.libvirt.driver [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Snapshot image upload complete
Sep 30 21:55:21 compute-1 nova_compute[192795]: 2025-09-30 21:55:21.238 2 INFO nova.compute.manager [None req-8e2052bc-c34e-4017-8bd6-d67162d044ad d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Took 4.85 seconds to snapshot the instance on the hypervisor.
Sep 30 21:55:22 compute-1 sshd-session[252791]: Failed password for invalid user ubuntu from 8.210.178.40 port 54130 ssh2
Sep 30 21:55:23 compute-1 sshd-session[252791]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.879 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.880 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.881 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:55:24 compute-1 nova_compute[192795]: 2025-09-30 21:55:24.881 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 70bc4ef2-d80e-49af-b20a-6240f762b81e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:26 compute-1 sshd-session[252791]: Failed password for invalid user ubuntu from 8.210.178.40 port 54130 ssh2
Sep 30 21:55:26 compute-1 sshd-session[252791]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:27 compute-1 nova_compute[192795]: 2025-09-30 21:55:27.000 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updating instance_info_cache with network_info: [{"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:27 compute-1 nova_compute[192795]: 2025-09-30 21:55:27.022 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:27 compute-1 nova_compute[192795]: 2025-09-30 21:55:27.023 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:55:27 compute-1 nova_compute[192795]: 2025-09-30 21:55:27.024 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:27 compute-1 nova_compute[192795]: 2025-09-30 21:55:27.024 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:55:27 compute-1 nova_compute[192795]: 2025-09-30 21:55:27.042 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:55:29 compute-1 sshd-session[252791]: Failed password for invalid user ubuntu from 8.210.178.40 port 54130 ssh2
Sep 30 21:55:29 compute-1 podman[252891]: 2025-09-30 21:55:29.243578329 +0000 UTC m=+0.077073646 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:55:29 compute-1 nova_compute[192795]: 2025-09-30 21:55:29.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:29 compute-1 nova_compute[192795]: 2025-09-30 21:55:29.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:29 compute-1 sshd-session[252791]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:31 compute-1 sshd-session[252791]: Failed password for invalid user ubuntu from 8.210.178.40 port 54130 ssh2
Sep 30 21:55:32 compute-1 sshd-session[252791]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:34 compute-1 podman[252911]: 2025-09-30 21:55:34.247244459 +0000 UTC m=+0.081264936 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:55:34 compute-1 podman[252913]: 2025-09-30 21:55:34.249497538 +0000 UTC m=+0.073591603 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:55:34 compute-1 podman[252912]: 2025-09-30 21:55:34.259446881 +0000 UTC m=+0.094967608 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:55:34 compute-1 nova_compute[192795]: 2025-09-30 21:55:34.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:34 compute-1 sshd-session[252791]: Failed password for invalid user ubuntu from 8.210.178.40 port 54130 ssh2
Sep 30 21:55:35 compute-1 sshd-session[252791]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 54130 ssh2 [preauth]
Sep 30 21:55:35 compute-1 sshd-session[252791]: Disconnecting invalid user ubuntu 8.210.178.40 port 54130: Too many authentication failures [preauth]
Sep 30 21:55:35 compute-1 sshd-session[252791]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:35 compute-1 sshd-session[252791]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:55:37 compute-1 sshd-session[252975]: Invalid user ubuntu from 8.210.178.40 port 54712
Sep 30 21:55:37 compute-1 sshd-session[252975]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:37 compute-1 sshd-session[252975]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:37.335 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:37.336 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:55:37 compute-1 nova_compute[192795]: 2025-09-30 21:55:37.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:37 compute-1 ovn_controller[94902]: 2025-09-30T21:55:37Z|00772|binding|INFO|Releasing lport 4398875c-5162-48bf-962e-225588b9b8b0 from this chassis (sb_readonly=0)
Sep 30 21:55:37 compute-1 nova_compute[192795]: 2025-09-30 21:55:37.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:38.716 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:38.717 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:38.718 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:39 compute-1 nova_compute[192795]: 2025-09-30 21:55:39.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:39 compute-1 nova_compute[192795]: 2025-09-30 21:55:39.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:39 compute-1 nova_compute[192795]: 2025-09-30 21:55:39.697 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:39 compute-1 sshd-session[252975]: Failed password for invalid user ubuntu from 8.210.178.40 port 54712 ssh2
Sep 30 21:55:40 compute-1 sshd-session[252975]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:41 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:55:41.339 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:41 compute-1 sshd-session[252975]: Failed password for invalid user ubuntu from 8.210.178.40 port 54712 ssh2
Sep 30 21:55:43 compute-1 sshd-session[252975]: Disconnecting invalid user ubuntu 8.210.178.40 port 54712: Change of username or service not allowed: (ubuntu,ssh-connection) -> (dev,ssh-connection) [preauth]
Sep 30 21:55:43 compute-1 sshd-session[252975]: PAM 1 more authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:44 compute-1 nova_compute[192795]: 2025-09-30 21:55:44.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:44 compute-1 sshd-session[252977]: Invalid user dev from 8.210.178.40 port 54948
Sep 30 21:55:44 compute-1 sshd-session[252977]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:44 compute-1 sshd-session[252977]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:44 compute-1 podman[252979]: 2025-09-30 21:55:44.893683576 +0000 UTC m=+0.058939997 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid)
Sep 30 21:55:46 compute-1 sshd-session[252977]: Failed password for invalid user dev from 8.210.178.40 port 54948 ssh2
Sep 30 21:55:47 compute-1 sshd-session[252977]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:49 compute-1 nova_compute[192795]: 2025-09-30 21:55:49.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:49 compute-1 sshd-session[252977]: Failed password for invalid user dev from 8.210.178.40 port 54948 ssh2
Sep 30 21:55:50 compute-1 podman[253001]: 2025-09-30 21:55:50.251511248 +0000 UTC m=+0.092256987 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd)
Sep 30 21:55:50 compute-1 podman[253003]: 2025-09-30 21:55:50.288774512 +0000 UTC m=+0.114812693 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:55:50 compute-1 podman[253002]: 2025-09-30 21:55:50.301011545 +0000 UTC m=+0.127071676 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:55:50 compute-1 sshd-session[252977]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:52 compute-1 sshd-session[252977]: Failed password for invalid user dev from 8.210.178.40 port 54948 ssh2
Sep 30 21:55:54 compute-1 sshd-session[252977]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:54 compute-1 nova_compute[192795]: 2025-09-30 21:55:54.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:56 compute-1 sshd-session[252977]: Failed password for invalid user dev from 8.210.178.40 port 54948 ssh2
Sep 30 21:55:57 compute-1 sshd-session[252977]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:58 compute-1 sshd-session[252977]: Failed password for invalid user dev from 8.210.178.40 port 54948 ssh2
Sep 30 21:55:58 compute-1 sshd-session[252977]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.946 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.946 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:59 compute-1 nova_compute[192795]: 2025-09-30 21:55:59.968 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.140 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.141 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.153 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.153 2 INFO nova.compute.claims [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:56:00 compute-1 podman[253072]: 2025-09-30 21:56:00.216393485 +0000 UTC m=+0.062175342 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute)
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.527 2 DEBUG nova.compute.provider_tree [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.549 2 DEBUG nova.scheduler.client.report [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.578 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.579 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.636 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.636 2 DEBUG nova.network.neutron [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.666 2 INFO nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.690 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:56:00 compute-1 sshd-session[252977]: Failed password for invalid user dev from 8.210.178.40 port 54948 ssh2
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.829 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.831 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.831 2 INFO nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Creating image(s)
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.832 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.832 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.833 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.851 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.928 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.929 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.930 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:00 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.945 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:00.998 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.000 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.036 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.037 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.038 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.100 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.101 2 DEBUG nova.virt.disk.api [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.101 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.123 2 DEBUG nova.policy [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.164 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.164 2 DEBUG nova.virt.disk.api [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.165 2 DEBUG nova.objects.instance [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.183 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.183 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Ensure instance console log exists: /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.184 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.184 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.184 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:01 compute-1 sshd-session[252977]: error: maximum authentication attempts exceeded for invalid user dev from 8.210.178.40 port 54948 ssh2 [preauth]
Sep 30 21:56:01 compute-1 sshd-session[252977]: Disconnecting invalid user dev 8.210.178.40 port 54948: Too many authentication failures [preauth]
Sep 30 21:56:01 compute-1 sshd-session[252977]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:56:01 compute-1 sshd-session[252977]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:56:01 compute-1 nova_compute[192795]: 2025-09-30 21:56:01.847 2 DEBUG nova.network.neutron [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Successfully created port: ebdbea5f-7191-4743-b858-8baa349733aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.720 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.781 2 DEBUG nova.network.neutron [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Successfully updated port: ebdbea5f-7191-4743-b858-8baa349733aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.798 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.798 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.799 2 DEBUG nova.network.neutron [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.907 2 DEBUG nova.compute.manager [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-changed-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.908 2 DEBUG nova.compute.manager [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Refreshing instance network info cache due to event network-changed-ebdbea5f-7191-4743-b858-8baa349733aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:56:02 compute-1 nova_compute[192795]: 2025-09-30 21:56:02.908 2 DEBUG oslo_concurrency.lockutils [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:03 compute-1 nova_compute[192795]: 2025-09-30 21:56:03.090 2 DEBUG nova.network.neutron [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:56:03 compute-1 sshd-session[253108]: Invalid user dev from 8.210.178.40 port 55554
Sep 30 21:56:03 compute-1 sshd-session[253108]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:56:03 compute-1 sshd-session[253108]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.788 2 DEBUG nova.network.neutron [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updating instance_info_cache with network_info: [{"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.808 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.808 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Instance network_info: |[{"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.808 2 DEBUG oslo_concurrency.lockutils [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.809 2 DEBUG nova.network.neutron [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Refreshing network info cache for port ebdbea5f-7191-4743-b858-8baa349733aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.811 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Start _get_guest_xml network_info=[{"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.818 2 WARNING nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.824 2 DEBUG nova.virt.libvirt.host [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.825 2 DEBUG nova.virt.libvirt.host [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.830 2 DEBUG nova.virt.libvirt.host [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.831 2 DEBUG nova.virt.libvirt.host [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.832 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.832 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.833 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.833 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.833 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.834 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.834 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.834 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.834 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.834 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.835 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.835 2 DEBUG nova.virt.hardware [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.841 2 DEBUG nova.virt.libvirt.vif [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:55:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1107065721',display_name='tempest-TestGettingAddress-server-1107065721',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1107065721',id=186,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOeOS3mBAobKMFVZmSXwhD5pR+hgUSxOEvW8JkTGv9SYHzB8LEBLx4hPSb24o9wi6QTC8NprCVLaWawzMJ4uLeG8TVdIhpgisBZvVv4nk53bbVF0NImbuMXtGSJ50iCNCQ==',key_name='tempest-TestGettingAddress-416719936',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-p8eiyq0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:56:00Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.841 2 DEBUG nova.network.os_vif_util [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.842 2 DEBUG nova.network.os_vif_util [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:bc:03,bridge_name='br-int',has_traffic_filtering=True,id=ebdbea5f-7191-4743-b858-8baa349733aa,network=Network(412a19ab-c94f-46ed-9c4e-c69fc7962be3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebdbea5f-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.843 2 DEBUG nova.objects.instance [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.856 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <uuid>a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7</uuid>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <name>instance-000000ba</name>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <nova:name>tempest-TestGettingAddress-server-1107065721</nova:name>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:56:04</nova:creationTime>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         <nova:port uuid="ebdbea5f-7191-4743-b858-8baa349733aa">
Sep 30 21:56:04 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fef9:bc03" ipVersion="6"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fef9:bc03" ipVersion="6"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <system>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <entry name="serial">a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7</entry>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <entry name="uuid">a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7</entry>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </system>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <os>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   </os>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <features>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   </features>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk.config"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:f9:bc:03"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <target dev="tapebdbea5f-71"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/console.log" append="off"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <video>
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </video>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:56:04 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:56:04 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:56:04 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:56:04 compute-1 nova_compute[192795]: </domain>
Sep 30 21:56:04 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.857 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Preparing to wait for external event network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.857 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.857 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.857 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.858 2 DEBUG nova.virt.libvirt.vif [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:55:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1107065721',display_name='tempest-TestGettingAddress-server-1107065721',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1107065721',id=186,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOeOS3mBAobKMFVZmSXwhD5pR+hgUSxOEvW8JkTGv9SYHzB8LEBLx4hPSb24o9wi6QTC8NprCVLaWawzMJ4uLeG8TVdIhpgisBZvVv4nk53bbVF0NImbuMXtGSJ50iCNCQ==',key_name='tempest-TestGettingAddress-416719936',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-p8eiyq0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:56:00Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.858 2 DEBUG nova.network.os_vif_util [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.859 2 DEBUG nova.network.os_vif_util [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:bc:03,bridge_name='br-int',has_traffic_filtering=True,id=ebdbea5f-7191-4743-b858-8baa349733aa,network=Network(412a19ab-c94f-46ed-9c4e-c69fc7962be3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebdbea5f-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.859 2 DEBUG os_vif [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:bc:03,bridge_name='br-int',has_traffic_filtering=True,id=ebdbea5f-7191-4743-b858-8baa349733aa,network=Network(412a19ab-c94f-46ed-9c4e-c69fc7962be3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebdbea5f-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.861 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.870 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebdbea5f-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.870 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebdbea5f-71, col_values=(('external_ids', {'iface-id': 'ebdbea5f-7191-4743-b858-8baa349733aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:bc:03', 'vm-uuid': 'a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:04 compute-1 NetworkManager[51724]: <info>  [1759269364.8734] manager: (tapebdbea5f-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.882 2 INFO os_vif [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:bc:03,bridge_name='br-int',has_traffic_filtering=True,id=ebdbea5f-7191-4743-b858-8baa349733aa,network=Network(412a19ab-c94f-46ed-9c4e-c69fc7962be3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebdbea5f-71')
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.933 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.933 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.933 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:f9:bc:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:56:04 compute-1 nova_compute[192795]: 2025-09-30 21:56:04.934 2 INFO nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Using config drive
Sep 30 21:56:05 compute-1 podman[253112]: 2025-09-30 21:56:05.275355925 +0000 UTC m=+0.077320831 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc.)
Sep 30 21:56:05 compute-1 podman[253114]: 2025-09-30 21:56:05.279709391 +0000 UTC m=+0.066195900 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.282 2 INFO nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Creating config drive at /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk.config
Sep 30 21:56:05 compute-1 podman[253113]: 2025-09-30 21:56:05.290419833 +0000 UTC m=+0.079506800 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.293 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4_pze46i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:05 compute-1 sshd-session[253108]: Failed password for invalid user dev from 8.210.178.40 port 55554 ssh2
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.431 2 DEBUG oslo_concurrency.processutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4_pze46i" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:05 compute-1 kernel: tapebdbea5f-71: entered promiscuous mode
Sep 30 21:56:05 compute-1 NetworkManager[51724]: <info>  [1759269365.5300] manager: (tapebdbea5f-71): new Tun device (/org/freedesktop/NetworkManager/Devices/389)
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:05 compute-1 ovn_controller[94902]: 2025-09-30T21:56:05Z|00773|binding|INFO|Claiming lport ebdbea5f-7191-4743-b858-8baa349733aa for this chassis.
Sep 30 21:56:05 compute-1 ovn_controller[94902]: 2025-09-30T21:56:05Z|00774|binding|INFO|ebdbea5f-7191-4743-b858-8baa349733aa: Claiming fa:16:3e:f9:bc:03 10.100.0.9 2001:db8:0:1:f816:3eff:fef9:bc03 2001:db8::f816:3eff:fef9:bc03
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.545 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:bc:03 10.100.0.9 2001:db8:0:1:f816:3eff:fef9:bc03 2001:db8::f816:3eff:fef9:bc03'], port_security=['fa:16:3e:f9:bc:03 10.100.0.9 2001:db8:0:1:f816:3eff:fef9:bc03 2001:db8::f816:3eff:fef9:bc03'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fef9:bc03/64 2001:db8::f816:3eff:fef9:bc03/64', 'neutron:device_id': 'a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fa85a388-97da-413c-abac-c6b8864a03b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f488970f-e2de-4ce9-9091-bbb5c56f0cb2, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=ebdbea5f-7191-4743-b858-8baa349733aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.547 103861 INFO neutron.agent.ovn.metadata.agent [-] Port ebdbea5f-7191-4743-b858-8baa349733aa in datapath 412a19ab-c94f-46ed-9c4e-c69fc7962be3 bound to our chassis
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.551 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 412a19ab-c94f-46ed-9c4e-c69fc7962be3
Sep 30 21:56:05 compute-1 ovn_controller[94902]: 2025-09-30T21:56:05Z|00775|binding|INFO|Setting lport ebdbea5f-7191-4743-b858-8baa349733aa ovn-installed in OVS
Sep 30 21:56:05 compute-1 ovn_controller[94902]: 2025-09-30T21:56:05Z|00776|binding|INFO|Setting lport ebdbea5f-7191-4743-b858-8baa349733aa up in Southbound
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.574 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbeaa00-4cd6-49c4-a233-bd2d051bbfa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.575 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap412a19ab-c1 in ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.579 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap412a19ab-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.579 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[05a55785-1eb9-4a66-a157-814eaa1dd57c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.580 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c2896665-4551-4cce-b426-86a7e91b1eb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 systemd-udevd[253192]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:56:05 compute-1 systemd-machined[152783]: New machine qemu-86-instance-000000ba.
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.607 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[98ca0511-5667-4a09-82e4-6480a1b27b18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 NetworkManager[51724]: <info>  [1759269365.6120] device (tapebdbea5f-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:56:05 compute-1 NetworkManager[51724]: <info>  [1759269365.6129] device (tapebdbea5f-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:56:05 compute-1 systemd[1]: Started Virtual Machine qemu-86-instance-000000ba.
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.635 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[491a0de1-16f4-4af7-a129-c7d84eb3bb3d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.700 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[7c75314c-ea91-4613-82da-68038c4a8d17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 NetworkManager[51724]: <info>  [1759269365.7127] manager: (tap412a19ab-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/390)
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.711 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0f994ba3-4d0e-4b1e-b56b-1d930131fbc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.750 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[15ffbab1-d0ea-45fa-8375-8cb138fe3ccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.753 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[0e9f244b-32c8-498c-bbad-16a0b212a0ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 NetworkManager[51724]: <info>  [1759269365.7831] device (tap412a19ab-c0): carrier: link connected
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.793 2 DEBUG nova.compute.manager [req-d4b9ebb3-08f9-4123-af15-457eae3e2412 req-fe4bad82-4f60-4abe-ac56-6fe55fc66a0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.794 2 DEBUG oslo_concurrency.lockutils [req-d4b9ebb3-08f9-4123-af15-457eae3e2412 req-fe4bad82-4f60-4abe-ac56-6fe55fc66a0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.792 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[b7be72e9-c20c-4f98-82d2-af98cb9b47d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.794 2 DEBUG oslo_concurrency.lockutils [req-d4b9ebb3-08f9-4123-af15-457eae3e2412 req-fe4bad82-4f60-4abe-ac56-6fe55fc66a0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.795 2 DEBUG oslo_concurrency.lockutils [req-d4b9ebb3-08f9-4123-af15-457eae3e2412 req-fe4bad82-4f60-4abe-ac56-6fe55fc66a0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:05 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.795 2 DEBUG nova.compute.manager [req-d4b9ebb3-08f9-4123-af15-457eae3e2412 req-fe4bad82-4f60-4abe-ac56-6fe55fc66a0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Processing event network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.813 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[104e87f6-57eb-4f4b-8d35-01e5134d5fc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap412a19ab-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:38:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 243], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606145, 'reachable_time': 43392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253225, 'error': None, 'target': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.838 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0f59aead-d3a1-4e20-a4bb-16ee1ece798c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:3830'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 606145, 'tstamp': 606145}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253226, 'error': None, 'target': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.865 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[54fb190c-0bec-44f2-8704-46bdcb38de11]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap412a19ab-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:38:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 243], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606145, 'reachable_time': 43392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253227, 'error': None, 'target': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.913 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[632863e7-7ceb-46a7-9638-982301f0d4b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.990 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[6eeadbe7-3b5a-4997-81fd-47b9e92387f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.993 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap412a19ab-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.993 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:56:05 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:05.994 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap412a19ab-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:05 compute-1 NetworkManager[51724]: <info>  [1759269365.9982] manager: (tap412a19ab-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Sep 30 21:56:06 compute-1 kernel: tap412a19ab-c0: entered promiscuous mode
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:05.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:06.003 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap412a19ab-c0, col_values=(('external_ids', {'iface-id': '43244ebf-4de3-435e-a1f4-6cebae949e6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:06 compute-1 ovn_controller[94902]: 2025-09-30T21:56:06Z|00777|binding|INFO|Releasing lport 43244ebf-4de3-435e-a1f4-6cebae949e6e from this chassis (sb_readonly=0)
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:06.009 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/412a19ab-c94f-46ed-9c4e-c69fc7962be3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/412a19ab-c94f-46ed-9c4e-c69fc7962be3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:06.023 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[34491718-ca22-4d2f-af4a-1282b9f453de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:06.025 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-412a19ab-c94f-46ed-9c4e-c69fc7962be3
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/412a19ab-c94f-46ed-9c4e-c69fc7962be3.pid.haproxy
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID 412a19ab-c94f-46ed-9c4e-c69fc7962be3
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:56:06 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:06.027 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'env', 'PROCESS_TAG=haproxy-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/412a19ab-c94f-46ed-9c4e-c69fc7962be3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:06 compute-1 sshd-session[253108]: Connection closed by invalid user dev 8.210.178.40 port 55554 [preauth]
Sep 30 21:56:06 compute-1 podman[253265]: 2025-09-30 21:56:06.502689553 +0000 UTC m=+0.079348777 container create 09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:56:06 compute-1 podman[253265]: 2025-09-30 21:56:06.459345988 +0000 UTC m=+0.036005242 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:56:06 compute-1 systemd[1]: Started libpod-conmon-09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440.scope.
Sep 30 21:56:06 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:56:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41e6f4d5650ebd259a5be64b854a79e5a4d801a2d59a06c0fc443a0c3fd4b204/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:56:06 compute-1 podman[253265]: 2025-09-30 21:56:06.626668447 +0000 UTC m=+0.203327711 container init 09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:56:06 compute-1 podman[253265]: 2025-09-30 21:56:06.637356688 +0000 UTC m=+0.214015922 container start 09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:56:06 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [NOTICE]   (253285) : New worker (253287) forked
Sep 30 21:56:06 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [NOTICE]   (253285) : Loading success.
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.943 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269366.9427357, a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.944 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] VM Started (Lifecycle Event)
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.946 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.950 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.954 2 INFO nova.virt.libvirt.driver [-] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Instance spawned successfully.
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.955 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:56:06 compute-1 nova_compute[192795]: 2025-09-30 21:56:06.989 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.001 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.009 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.010 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.010 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.011 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.011 2 DEBUG nova.virt.libvirt.driver [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.019 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.058 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.058 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269366.9431508, a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.059 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] VM Paused (Lifecycle Event)
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.091 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.094 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269366.9493613, a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.095 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] VM Resumed (Lifecycle Event)
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.131 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.135 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.140 2 INFO nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Took 6.31 seconds to spawn the instance on the hypervisor.
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.140 2 DEBUG nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.176 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.248 2 INFO nova.compute.manager [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Took 7.17 seconds to build instance.
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.264 2 DEBUG oslo_concurrency.lockutils [None req-89cff3e0-7ff5-4acc-b266-80405a9d4807 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.896 2 DEBUG nova.compute.manager [req-616b4cd6-36ea-4d97-b3e8-7405f7a343f9 req-43aadc15-0957-4ef7-a98c-be5731fdbd19 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.896 2 DEBUG oslo_concurrency.lockutils [req-616b4cd6-36ea-4d97-b3e8-7405f7a343f9 req-43aadc15-0957-4ef7-a98c-be5731fdbd19 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.897 2 DEBUG oslo_concurrency.lockutils [req-616b4cd6-36ea-4d97-b3e8-7405f7a343f9 req-43aadc15-0957-4ef7-a98c-be5731fdbd19 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.897 2 DEBUG oslo_concurrency.lockutils [req-616b4cd6-36ea-4d97-b3e8-7405f7a343f9 req-43aadc15-0957-4ef7-a98c-be5731fdbd19 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.897 2 DEBUG nova.compute.manager [req-616b4cd6-36ea-4d97-b3e8-7405f7a343f9 req-43aadc15-0957-4ef7-a98c-be5731fdbd19 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] No waiting events found dispatching network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:56:07 compute-1 nova_compute[192795]: 2025-09-30 21:56:07.897 2 WARNING nova.compute.manager [req-616b4cd6-36ea-4d97-b3e8-7405f7a343f9 req-43aadc15-0957-4ef7-a98c-be5731fdbd19 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received unexpected event network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa for instance with vm_state active and task_state None.
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.128 2 DEBUG nova.network.neutron [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updated VIF entry in instance network info cache for port ebdbea5f-7191-4743-b858-8baa349733aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.129 2 DEBUG nova.network.neutron [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updating instance_info_cache with network_info: [{"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.165 2 DEBUG oslo_concurrency.lockutils [req-ce158789-6163-4f23-9495-5ab0da63e774 req-29ea4dd1-498a-40f1-b601-baeb4ffeaad0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.727 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.728 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.729 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.815 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.918 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.919 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.983 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:09 compute-1 nova_compute[192795]: 2025-09-30 21:56:09.989 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.089 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.091 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.168 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.355 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.356 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5341MB free_disk=73.26700973510742GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.356 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.357 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.439 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 70bc4ef2-d80e-49af-b20a-6240f762b81e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.440 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.440 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.440 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.492 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.508 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.535 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:56:10 compute-1 nova_compute[192795]: 2025-09-30 21:56:10.536 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:13 compute-1 nova_compute[192795]: 2025-09-30 21:56:13.407 2 DEBUG nova.compute.manager [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-changed-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:13 compute-1 nova_compute[192795]: 2025-09-30 21:56:13.407 2 DEBUG nova.compute.manager [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Refreshing instance network info cache due to event network-changed-ebdbea5f-7191-4743-b858-8baa349733aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:56:13 compute-1 nova_compute[192795]: 2025-09-30 21:56:13.407 2 DEBUG oslo_concurrency.lockutils [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:13 compute-1 nova_compute[192795]: 2025-09-30 21:56:13.408 2 DEBUG oslo_concurrency.lockutils [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:13 compute-1 nova_compute[192795]: 2025-09-30 21:56:13.408 2 DEBUG nova.network.neutron [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Refreshing network info cache for port ebdbea5f-7191-4743-b858-8baa349733aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:56:14 compute-1 nova_compute[192795]: 2025-09-30 21:56:14.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:14 compute-1 nova_compute[192795]: 2025-09-30 21:56:14.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:15 compute-1 podman[253309]: 2025-09-30 21:56:15.282203452 +0000 UTC m=+0.099417146 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid)
Sep 30 21:56:15 compute-1 nova_compute[192795]: 2025-09-30 21:56:15.536 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:15 compute-1 nova_compute[192795]: 2025-09-30 21:56:15.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:15 compute-1 nova_compute[192795]: 2025-09-30 21:56:15.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:16 compute-1 nova_compute[192795]: 2025-09-30 21:56:16.179 2 DEBUG nova.network.neutron [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updated VIF entry in instance network info cache for port ebdbea5f-7191-4743-b858-8baa349733aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:56:16 compute-1 nova_compute[192795]: 2025-09-30 21:56:16.181 2 DEBUG nova.network.neutron [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updating instance_info_cache with network_info: [{"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:16 compute-1 nova_compute[192795]: 2025-09-30 21:56:16.202 2 DEBUG oslo_concurrency.lockutils [req-4ddd5a20-45d3-41da-8435-b814d7d833f5 req-0e9baff8-46fd-4991-82f3-e52b404e3546 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:19.322 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:56:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:19.325 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:56:19 compute-1 nova_compute[192795]: 2025-09-30 21:56:19.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:19.327 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:19 compute-1 nova_compute[192795]: 2025-09-30 21:56:19.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-1 nova_compute[192795]: 2025-09-30 21:56:19.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:20 compute-1 ovn_controller[94902]: 2025-09-30T21:56:20Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:bc:03 10.100.0.9
Sep 30 21:56:20 compute-1 ovn_controller[94902]: 2025-09-30T21:56:20Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:bc:03 10.100.0.9
Sep 30 21:56:20 compute-1 nova_compute[192795]: 2025-09-30 21:56:20.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:21 compute-1 podman[253345]: 2025-09-30 21:56:21.226765265 +0000 UTC m=+0.070094482 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Sep 30 21:56:21 compute-1 podman[253347]: 2025-09-30 21:56:21.229260471 +0000 UTC m=+0.060315753 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:56:21 compute-1 podman[253346]: 2025-09-30 21:56:21.250225445 +0000 UTC m=+0.089139395 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:56:24 compute-1 nova_compute[192795]: 2025-09-30 21:56:24.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:24 compute-1 nova_compute[192795]: 2025-09-30 21:56:24.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.401 2 DEBUG nova.compute.manager [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-changed-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.401 2 DEBUG nova.compute.manager [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Refreshing instance network info cache due to event network-changed-0b25a929-986f-4efa-bd3d-b1e23e16cc0b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.402 2 DEBUG oslo_concurrency.lockutils [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.402 2 DEBUG oslo_concurrency.lockutils [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.402 2 DEBUG nova.network.neutron [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Refreshing network info cache for port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.513 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.513 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.514 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.514 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.514 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.528 2 INFO nova.compute.manager [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Terminating instance
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.541 2 DEBUG nova.compute.manager [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:56:25 compute-1 kernel: tap0b25a929-98 (unregistering): left promiscuous mode
Sep 30 21:56:25 compute-1 NetworkManager[51724]: <info>  [1759269385.5767] device (tap0b25a929-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:56:25 compute-1 ovn_controller[94902]: 2025-09-30T21:56:25Z|00778|binding|INFO|Releasing lport 0b25a929-986f-4efa-bd3d-b1e23e16cc0b from this chassis (sb_readonly=0)
Sep 30 21:56:25 compute-1 ovn_controller[94902]: 2025-09-30T21:56:25Z|00779|binding|INFO|Setting lport 0b25a929-986f-4efa-bd3d-b1e23e16cc0b down in Southbound
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 ovn_controller[94902]: 2025-09-30T21:56:25Z|00780|binding|INFO|Removing iface tap0b25a929-98 ovn-installed in OVS
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.597 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:33:fa 10.100.0.5'], port_security=['fa:16:3e:4c:33:fa 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '70bc4ef2-d80e-49af-b20a-6240f762b81e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36195885-e54a-4c05-b721-98be98333841', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c2e514e7322435988a7f3bf398623e4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '98fe3b07-2473-4235-becf-2f443e01bab9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d62c416-32f7-4ff1-bc52-93428ed2b707, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=0b25a929-986f-4efa-bd3d-b1e23e16cc0b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.598 103861 INFO neutron.agent.ovn.metadata.agent [-] Port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b in datapath 36195885-e54a-4c05-b721-98be98333841 unbound from our chassis
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.600 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36195885-e54a-4c05-b721-98be98333841, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.603 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[33e610e9-b39e-4ad8-b509-f3a22e96aeb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.604 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-36195885-e54a-4c05-b721-98be98333841 namespace which is not needed anymore
Sep 30 21:56:25 compute-1 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b6.scope: Deactivated successfully.
Sep 30 21:56:25 compute-1 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b6.scope: Consumed 16.880s CPU time.
Sep 30 21:56:25 compute-1 systemd-machined[152783]: Machine qemu-85-instance-000000b6 terminated.
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.734 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Sep 30 21:56:25 compute-1 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[252649]: [NOTICE]   (252653) : haproxy version is 2.8.14-c23fe91
Sep 30 21:56:25 compute-1 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[252649]: [NOTICE]   (252653) : path to executable is /usr/sbin/haproxy
Sep 30 21:56:25 compute-1 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[252649]: [WARNING]  (252653) : Exiting Master process...
Sep 30 21:56:25 compute-1 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[252649]: [ALERT]    (252653) : Current worker (252655) exited with code 143 (Terminated)
Sep 30 21:56:25 compute-1 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[252649]: [WARNING]  (252653) : All workers exited. Exiting... (0)
Sep 30 21:56:25 compute-1 systemd[1]: libpod-fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c.scope: Deactivated successfully.
Sep 30 21:56:25 compute-1 podman[253435]: 2025-09-30 21:56:25.755231877 +0000 UTC m=+0.050214506 container died fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:56:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-1fd30de33dcafdcdbfb5002d54f05464ce3e8ac3ecd354d42e3be2ba5148ca72-merged.mount: Deactivated successfully.
Sep 30 21:56:25 compute-1 podman[253435]: 2025-09-30 21:56:25.808046432 +0000 UTC m=+0.103029061 container cleanup fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:56:25 compute-1 systemd[1]: libpod-conmon-fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c.scope: Deactivated successfully.
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.823 2 INFO nova.virt.libvirt.driver [-] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Instance destroyed successfully.
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.824 2 DEBUG nova.objects.instance [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lazy-loading 'resources' on Instance uuid 70bc4ef2-d80e-49af-b20a-6240f762b81e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.840 2 DEBUG nova.compute.manager [req-b3bb6091-69a1-428c-ac3b-565380cbea61 req-156131ea-ea74-4c64-bb20-859e8708e279 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-vif-unplugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.840 2 DEBUG oslo_concurrency.lockutils [req-b3bb6091-69a1-428c-ac3b-565380cbea61 req-156131ea-ea74-4c64-bb20-859e8708e279 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.841 2 DEBUG oslo_concurrency.lockutils [req-b3bb6091-69a1-428c-ac3b-565380cbea61 req-156131ea-ea74-4c64-bb20-859e8708e279 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.841 2 DEBUG oslo_concurrency.lockutils [req-b3bb6091-69a1-428c-ac3b-565380cbea61 req-156131ea-ea74-4c64-bb20-859e8708e279 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.841 2 DEBUG nova.compute.manager [req-b3bb6091-69a1-428c-ac3b-565380cbea61 req-156131ea-ea74-4c64-bb20-859e8708e279 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] No waiting events found dispatching network-vif-unplugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.841 2 DEBUG nova.compute.manager [req-b3bb6091-69a1-428c-ac3b-565380cbea61 req-156131ea-ea74-4c64-bb20-859e8708e279 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-vif-unplugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.843 2 DEBUG nova.virt.libvirt.vif [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1785764051',display_name='tempest-TestSnapshotPattern-server-1785764051',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1785764051',id=182,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKL8WwnYuJEP3XpY/ju5SXa+fZd+0s4ElE/Ammc3JO3wP15Y53TJ4QGSyyMbttI4T5Fjj/YGgDR1amj6cHQX5O4wQ/GeWnvDWjS/d7Zz3S4MDwj0ljVzMOx5HSsjDMRAA==',key_name='tempest-TestSnapshotPattern-42320000',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:54:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c2e514e7322435988a7f3bf398623e4',ramdisk_id='',reservation_id='r-cxjkvhqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-1968938915',owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:55:21Z,user_data=None,user_id='d6d7afba807d47549781e37178a01774',uuid=70bc4ef2-d80e-49af-b20a-6240f762b81e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.843 2 DEBUG nova.network.os_vif_util [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converting VIF {"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.845 2 DEBUG nova.network.os_vif_util [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4c:33:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b25a929-986f-4efa-bd3d-b1e23e16cc0b,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b25a929-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.845 2 DEBUG os_vif [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4c:33:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b25a929-986f-4efa-bd3d-b1e23e16cc0b,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b25a929-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.848 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b25a929-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.855 2 INFO os_vif [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4c:33:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b25a929-986f-4efa-bd3d-b1e23e16cc0b,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b25a929-98')
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.856 2 INFO nova.virt.libvirt.driver [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Deleting instance files /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e_del
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.857 2 INFO nova.virt.libvirt.driver [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Deletion of /var/lib/nova/instances/70bc4ef2-d80e-49af-b20a-6240f762b81e_del complete
Sep 30 21:56:25 compute-1 podman[253480]: 2025-09-30 21:56:25.895632145 +0000 UTC m=+0.057205192 container remove fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.903 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[12037361-0bc1-44de-bf76-3a2adefb940e]: (4, ('Tue Sep 30 09:56:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841 (fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c)\nfa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c\nTue Sep 30 09:56:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841 (fa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c)\nfa7d33080364633ff48b27495292eaf1ad8405f58484f96cd1fc81b35230d14c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.904 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[4be4cc52-ff76-44ca-8659-7f3ea8679c44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.907 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36195885-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 kernel: tap36195885-e0: left promiscuous mode
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.916 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.916 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.917 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.917 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.919 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1e95893f-8e4f-45b9-b813-86d626840b29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.937 2 INFO nova.compute.manager [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.938 2 DEBUG oslo.service.loopingcall [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.939 2 DEBUG nova.compute.manager [-] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:56:25 compute-1 nova_compute[192795]: 2025-09-30 21:56:25.939 2 DEBUG nova.network.neutron [-] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.952 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[aec8a376-1dfc-4266-93d5-a6415a8d6d57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.955 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3a2fba-e192-4893-ac7c-28136ab8afbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.975 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9e719da8-0df3-41e8-8d9a-15ea22c70fdd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599153, 'reachable_time': 33570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253495, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:25 compute-1 systemd[1]: run-netns-ovnmeta\x2d36195885\x2de54a\x2d4c05\x2db721\x2d98be98333841.mount: Deactivated successfully.
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.979 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-36195885-e54a-4c05-b721-98be98333841 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:56:25 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:25.980 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[5731294d-c6eb-4f95-ba23-8e5a48e39c3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.507 2 DEBUG nova.network.neutron [-] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.534 2 INFO nova.compute.manager [-] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Took 1.60 seconds to deallocate network for instance.
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.584 2 DEBUG nova.compute.manager [req-adbcf523-0ba1-4a1d-b50f-c33e3f5d8698 req-6b281529-59cf-4c8b-b1b7-0bcdd27f92dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-vif-deleted-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.604 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.605 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.684 2 DEBUG nova.compute.provider_tree [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.699 2 DEBUG nova.scheduler.client.report [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.721 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.745 2 INFO nova.scheduler.client.report [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Deleted allocations for instance 70bc4ef2-d80e-49af-b20a-6240f762b81e
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.832 2 DEBUG nova.network.neutron [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updated VIF entry in instance network info cache for port 0b25a929-986f-4efa-bd3d-b1e23e16cc0b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.833 2 DEBUG nova.network.neutron [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Updating instance_info_cache with network_info: [{"id": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "address": "fa:16:3e:4c:33:fa", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b25a929-98", "ovs_interfaceid": "0b25a929-986f-4efa-bd3d-b1e23e16cc0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.855 2 DEBUG oslo_concurrency.lockutils [req-3568a4ac-817e-4bab-9ac3-d864f671a688 req-80432359-3b4a-42ac-a468-288b71e6ff02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-70bc4ef2-d80e-49af-b20a-6240f762b81e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.857 2 DEBUG oslo_concurrency.lockutils [None req-3926b306-8853-460f-a6c4-b856ed256c3d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.936 2 DEBUG nova.compute.manager [req-a7045628-85e8-4759-a6fa-4e81b68f4bd6 req-73aaf422-04f2-454a-bab6-32f29fd286fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received event network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.936 2 DEBUG oslo_concurrency.lockutils [req-a7045628-85e8-4759-a6fa-4e81b68f4bd6 req-73aaf422-04f2-454a-bab6-32f29fd286fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.936 2 DEBUG oslo_concurrency.lockutils [req-a7045628-85e8-4759-a6fa-4e81b68f4bd6 req-73aaf422-04f2-454a-bab6-32f29fd286fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.937 2 DEBUG oslo_concurrency.lockutils [req-a7045628-85e8-4759-a6fa-4e81b68f4bd6 req-73aaf422-04f2-454a-bab6-32f29fd286fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "70bc4ef2-d80e-49af-b20a-6240f762b81e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.937 2 DEBUG nova.compute.manager [req-a7045628-85e8-4759-a6fa-4e81b68f4bd6 req-73aaf422-04f2-454a-bab6-32f29fd286fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] No waiting events found dispatching network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:56:27 compute-1 nova_compute[192795]: 2025-09-30 21:56:27.937 2 WARNING nova.compute.manager [req-a7045628-85e8-4759-a6fa-4e81b68f4bd6 req-73aaf422-04f2-454a-bab6-32f29fd286fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Received unexpected event network-vif-plugged-0b25a929-986f-4efa-bd3d-b1e23e16cc0b for instance with vm_state deleted and task_state None.
Sep 30 21:56:29 compute-1 nova_compute[192795]: 2025-09-30 21:56:29.120 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updating instance_info_cache with network_info: [{"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:29 compute-1 nova_compute[192795]: 2025-09-30 21:56:29.150 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:29 compute-1 nova_compute[192795]: 2025-09-30 21:56:29.150 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:56:29 compute-1 nova_compute[192795]: 2025-09-30 21:56:29.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:30 compute-1 nova_compute[192795]: 2025-09-30 21:56:30.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.145 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:31 compute-1 podman[253496]: 2025-09-30 21:56:31.227892859 +0000 UTC m=+0.065288395 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.761 2 DEBUG nova.compute.manager [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-changed-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.762 2 DEBUG nova.compute.manager [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Refreshing instance network info cache due to event network-changed-ebdbea5f-7191-4743-b858-8baa349733aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.762 2 DEBUG oslo_concurrency.lockutils [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.762 2 DEBUG oslo_concurrency.lockutils [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.762 2 DEBUG nova.network.neutron [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Refreshing network info cache for port ebdbea5f-7191-4743-b858-8baa349733aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.825 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.825 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.826 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.826 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.826 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.836 2 INFO nova.compute.manager [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Terminating instance
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.850 2 DEBUG nova.compute.manager [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:56:31 compute-1 kernel: tapebdbea5f-71 (unregistering): left promiscuous mode
Sep 30 21:56:31 compute-1 NetworkManager[51724]: <info>  [1759269391.8763] device (tapebdbea5f-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:56:31 compute-1 ovn_controller[94902]: 2025-09-30T21:56:31Z|00781|binding|INFO|Releasing lport ebdbea5f-7191-4743-b858-8baa349733aa from this chassis (sb_readonly=0)
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:31 compute-1 ovn_controller[94902]: 2025-09-30T21:56:31Z|00782|binding|INFO|Setting lport ebdbea5f-7191-4743-b858-8baa349733aa down in Southbound
Sep 30 21:56:31 compute-1 ovn_controller[94902]: 2025-09-30T21:56:31Z|00783|binding|INFO|Removing iface tapebdbea5f-71 ovn-installed in OVS
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:31.926 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:bc:03 10.100.0.9 2001:db8:0:1:f816:3eff:fef9:bc03 2001:db8::f816:3eff:fef9:bc03'], port_security=['fa:16:3e:f9:bc:03 10.100.0.9 2001:db8:0:1:f816:3eff:fef9:bc03 2001:db8::f816:3eff:fef9:bc03'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fef9:bc03/64 2001:db8::f816:3eff:fef9:bc03/64', 'neutron:device_id': 'a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fa85a388-97da-413c-abac-c6b8864a03b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f488970f-e2de-4ce9-9091-bbb5c56f0cb2, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=ebdbea5f-7191-4743-b858-8baa349733aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:56:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:31.927 103861 INFO neutron.agent.ovn.metadata.agent [-] Port ebdbea5f-7191-4743-b858-8baa349733aa in datapath 412a19ab-c94f-46ed-9c4e-c69fc7962be3 unbound from our chassis
Sep 30 21:56:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:31.929 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 412a19ab-c94f-46ed-9c4e-c69fc7962be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:56:31 compute-1 nova_compute[192795]: 2025-09-30 21:56:31.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:31.930 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5dcd7241-0260-4abe-b940-af86195f3966]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:31 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:31.931 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3 namespace which is not needed anymore
Sep 30 21:56:31 compute-1 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Sep 30 21:56:31 compute-1 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000ba.scope: Consumed 14.764s CPU time.
Sep 30 21:56:31 compute-1 systemd-machined[152783]: Machine qemu-86-instance-000000ba terminated.
Sep 30 21:56:32 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [NOTICE]   (253285) : haproxy version is 2.8.14-c23fe91
Sep 30 21:56:32 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [NOTICE]   (253285) : path to executable is /usr/sbin/haproxy
Sep 30 21:56:32 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [WARNING]  (253285) : Exiting Master process...
Sep 30 21:56:32 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [WARNING]  (253285) : Exiting Master process...
Sep 30 21:56:32 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [ALERT]    (253285) : Current worker (253287) exited with code 143 (Terminated)
Sep 30 21:56:32 compute-1 neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3[253281]: [WARNING]  (253285) : All workers exited. Exiting... (0)
Sep 30 21:56:32 compute-1 systemd[1]: libpod-09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440.scope: Deactivated successfully.
Sep 30 21:56:32 compute-1 podman[253542]: 2025-09-30 21:56:32.085916985 +0000 UTC m=+0.051702236 container died 09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:56:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440-userdata-shm.mount: Deactivated successfully.
Sep 30 21:56:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-41e6f4d5650ebd259a5be64b854a79e5a4d801a2d59a06c0fc443a0c3fd4b204-merged.mount: Deactivated successfully.
Sep 30 21:56:32 compute-1 podman[253542]: 2025-09-30 21:56:32.126826875 +0000 UTC m=+0.092612136 container cleanup 09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.133 2 INFO nova.virt.libvirt.driver [-] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Instance destroyed successfully.
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.133 2 DEBUG nova.objects.instance [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:56:32 compute-1 systemd[1]: libpod-conmon-09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440.scope: Deactivated successfully.
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.159 2 DEBUG nova.virt.libvirt.vif [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:55:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1107065721',display_name='tempest-TestGettingAddress-server-1107065721',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1107065721',id=186,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOeOS3mBAobKMFVZmSXwhD5pR+hgUSxOEvW8JkTGv9SYHzB8LEBLx4hPSb24o9wi6QTC8NprCVLaWawzMJ4uLeG8TVdIhpgisBZvVv4nk53bbVF0NImbuMXtGSJ50iCNCQ==',key_name='tempest-TestGettingAddress-416719936',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:56:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-p8eiyq0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:56:07Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.160 2 DEBUG nova.network.os_vif_util [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.161 2 DEBUG nova.network.os_vif_util [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:bc:03,bridge_name='br-int',has_traffic_filtering=True,id=ebdbea5f-7191-4743-b858-8baa349733aa,network=Network(412a19ab-c94f-46ed-9c4e-c69fc7962be3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebdbea5f-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.161 2 DEBUG os_vif [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:bc:03,bridge_name='br-int',has_traffic_filtering=True,id=ebdbea5f-7191-4743-b858-8baa349733aa,network=Network(412a19ab-c94f-46ed-9c4e-c69fc7962be3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebdbea5f-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.163 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebdbea5f-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.171 2 INFO os_vif [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:bc:03,bridge_name='br-int',has_traffic_filtering=True,id=ebdbea5f-7191-4743-b858-8baa349733aa,network=Network(412a19ab-c94f-46ed-9c4e-c69fc7962be3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebdbea5f-71')
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.172 2 INFO nova.virt.libvirt.driver [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Deleting instance files /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7_del
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.174 2 INFO nova.virt.libvirt.driver [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Deletion of /var/lib/nova/instances/a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7_del complete
Sep 30 21:56:32 compute-1 podman[253588]: 2025-09-30 21:56:32.203785227 +0000 UTC m=+0.048292365 container remove 09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.210 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[92138a56-bea6-40b3-a3a5-d3592227c9e7]: (4, ('Tue Sep 30 09:56:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3 (09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440)\n09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440\nTue Sep 30 09:56:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3 (09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440)\n09f94b5859fbc6073e400fc6e073d6ca179b2081c0a4d122e1edf72bb014f440\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.212 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7c601a-8b19-4899-a027-ad7f6a021af9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.215 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap412a19ab-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:32 compute-1 kernel: tap412a19ab-c0: left promiscuous mode
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.231 2 DEBUG nova.compute.manager [req-1eee0968-39e5-480a-81e4-9286b6ae1e68 req-8c2387ee-ebde-4218-ad27-b06898e2f589 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-vif-unplugged-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.232 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9ace22a8-5c41-41b3-9a07-949c45acfd5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.232 2 DEBUG oslo_concurrency.lockutils [req-1eee0968-39e5-480a-81e4-9286b6ae1e68 req-8c2387ee-ebde-4218-ad27-b06898e2f589 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.233 2 DEBUG oslo_concurrency.lockutils [req-1eee0968-39e5-480a-81e4-9286b6ae1e68 req-8c2387ee-ebde-4218-ad27-b06898e2f589 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.233 2 DEBUG oslo_concurrency.lockutils [req-1eee0968-39e5-480a-81e4-9286b6ae1e68 req-8c2387ee-ebde-4218-ad27-b06898e2f589 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.233 2 DEBUG nova.compute.manager [req-1eee0968-39e5-480a-81e4-9286b6ae1e68 req-8c2387ee-ebde-4218-ad27-b06898e2f589 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] No waiting events found dispatching network-vif-unplugged-ebdbea5f-7191-4743-b858-8baa349733aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.234 2 DEBUG nova.compute.manager [req-1eee0968-39e5-480a-81e4-9286b6ae1e68 req-8c2387ee-ebde-4218-ad27-b06898e2f589 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-vif-unplugged-ebdbea5f-7191-4743-b858-8baa349733aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.263 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[c2aab52e-e0dd-4784-a0b9-03e0bf3a5b51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.265 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a92acd-c879-4ffb-8b79-8c4206fbf550]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.280 2 INFO nova.compute.manager [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.280 2 DEBUG oslo.service.loopingcall [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.281 2 DEBUG nova.compute.manager [-] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:56:32 compute-1 nova_compute[192795]: 2025-09-30 21:56:32.281 2 DEBUG nova.network.neutron [-] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.290 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7771caf1-8794-4590-82bb-6d1f29f03e3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606136, 'reachable_time': 30289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253603, 'error': None, 'target': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.293 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:56:32 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:32.293 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a48d8e-47a5-463e-a89b-0beda813c52b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:32 compute-1 systemd[1]: run-netns-ovnmeta\x2d412a19ab\x2dc94f\x2d46ed\x2d9c4e\x2dc69fc7962be3.mount: Deactivated successfully.
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.425 2 DEBUG nova.network.neutron [-] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.452 2 INFO nova.compute.manager [-] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Took 1.17 seconds to deallocate network for instance.
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.538 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.539 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.607 2 DEBUG nova.compute.provider_tree [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.619 2 DEBUG nova.scheduler.client.report [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.638 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.672 2 INFO nova.scheduler.client.report [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.759 2 DEBUG oslo_concurrency.lockutils [None req-742a3ea7-a01e-4f52-b61b-ea05c608062f 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:33 compute-1 nova_compute[192795]: 2025-09-30 21:56:33.851 2 DEBUG nova.compute.manager [req-ba7d9d81-85f3-4162-922d-dc1c0758362c req-035c516e-537e-42f6-8f2f-4bf1d9584d0b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-vif-deleted-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.327 2 DEBUG nova.compute.manager [req-8c120ca4-07cc-482f-8745-abf34fe07582 req-66e2db65-d5d4-4ed6-acd5-2b74fdb2d540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received event network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.327 2 DEBUG oslo_concurrency.lockutils [req-8c120ca4-07cc-482f-8745-abf34fe07582 req-66e2db65-d5d4-4ed6-acd5-2b74fdb2d540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.328 2 DEBUG oslo_concurrency.lockutils [req-8c120ca4-07cc-482f-8745-abf34fe07582 req-66e2db65-d5d4-4ed6-acd5-2b74fdb2d540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.328 2 DEBUG oslo_concurrency.lockutils [req-8c120ca4-07cc-482f-8745-abf34fe07582 req-66e2db65-d5d4-4ed6-acd5-2b74fdb2d540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.328 2 DEBUG nova.compute.manager [req-8c120ca4-07cc-482f-8745-abf34fe07582 req-66e2db65-d5d4-4ed6-acd5-2b74fdb2d540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] No waiting events found dispatching network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.329 2 WARNING nova.compute.manager [req-8c120ca4-07cc-482f-8745-abf34fe07582 req-66e2db65-d5d4-4ed6-acd5-2b74fdb2d540 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Received unexpected event network-vif-plugged-ebdbea5f-7191-4743-b858-8baa349733aa for instance with vm_state deleted and task_state None.
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.376 2 DEBUG nova.network.neutron [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updated VIF entry in instance network info cache for port ebdbea5f-7191-4743-b858-8baa349733aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.377 2 DEBUG nova.network.neutron [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Updating instance_info_cache with network_info: [{"id": "ebdbea5f-7191-4743-b858-8baa349733aa", "address": "fa:16:3e:f9:bc:03", "network": {"id": "412a19ab-c94f-46ed-9c4e-c69fc7962be3", "bridge": "br-int", "label": "tempest-network-smoke--1387088870", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef9:bc03", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebdbea5f-71", "ovs_interfaceid": "ebdbea5f-7191-4743-b858-8baa349733aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.426 2 DEBUG oslo_concurrency.lockutils [req-cac19e9b-28c9-49de-8c8f-18f9fc4c414a req-3da38747-8c9b-4233-ab45-09d05ec2b246 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:34 compute-1 nova_compute[192795]: 2025-09-30 21:56:34.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:36 compute-1 podman[253605]: 2025-09-30 21:56:36.219662094 +0000 UTC m=+0.060905919 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Sep 30 21:56:36 compute-1 podman[253606]: 2025-09-30 21:56:36.248975848 +0000 UTC m=+0.072583908 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:56:36 compute-1 podman[253607]: 2025-09-30 21:56:36.249853962 +0000 UTC m=+0.073357969 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:56:37 compute-1 nova_compute[192795]: 2025-09-30 21:56:37.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:38 compute-1 sshd-session[253667]: Invalid user admin from 167.71.248.239 port 43358
Sep 30 21:56:38 compute-1 sshd-session[253667]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:56:38 compute-1 sshd-session[253667]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 21:56:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:38.717 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:38.718 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:56:38.718 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:39 compute-1 nova_compute[192795]: 2025-09-30 21:56:39.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:40 compute-1 sshd-session[253667]: Failed password for invalid user admin from 167.71.248.239 port 43358 ssh2
Sep 30 21:56:40 compute-1 nova_compute[192795]: 2025-09-30 21:56:40.823 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269385.8218558, 70bc4ef2-d80e-49af-b20a-6240f762b81e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:56:40 compute-1 nova_compute[192795]: 2025-09-30 21:56:40.824 2 INFO nova.compute.manager [-] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] VM Stopped (Lifecycle Event)
Sep 30 21:56:40 compute-1 nova_compute[192795]: 2025-09-30 21:56:40.852 2 DEBUG nova.compute.manager [None req-ccb0e7fd-1a12-4f0e-ad29-e229ea4533c4 - - - - - -] [instance: 70bc4ef2-d80e-49af-b20a-6240f762b81e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:40 compute-1 sshd-session[253667]: Connection closed by invalid user admin 167.71.248.239 port 43358 [preauth]
Sep 30 21:56:42 compute-1 nova_compute[192795]: 2025-09-30 21:56:42.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.031 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:56:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-1 nova_compute[192795]: 2025-09-30 21:56:44.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:46 compute-1 podman[253669]: 2025-09-30 21:56:46.228116353 +0000 UTC m=+0.063524658 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3)
Sep 30 21:56:47 compute-1 nova_compute[192795]: 2025-09-30 21:56:47.131 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269392.1290405, a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:56:47 compute-1 nova_compute[192795]: 2025-09-30 21:56:47.132 2 INFO nova.compute.manager [-] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] VM Stopped (Lifecycle Event)
Sep 30 21:56:47 compute-1 nova_compute[192795]: 2025-09-30 21:56:47.162 2 DEBUG nova.compute.manager [None req-2df01053-8c89-4d73-8015-6a567503b34a - - - - - -] [instance: a3753ac3-14bf-4b78-9d09-d0fe47e3c2f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:47 compute-1 nova_compute[192795]: 2025-09-30 21:56:47.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:49 compute-1 nova_compute[192795]: 2025-09-30 21:56:49.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:52 compute-1 nova_compute[192795]: 2025-09-30 21:56:52.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:52 compute-1 podman[253691]: 2025-09-30 21:56:52.216360999 +0000 UTC m=+0.054883430 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:56:52 compute-1 podman[253689]: 2025-09-30 21:56:52.216471241 +0000 UTC m=+0.061993097 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:56:52 compute-1 podman[253690]: 2025-09-30 21:56:52.240980389 +0000 UTC m=+0.081888584 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:56:54 compute-1 nova_compute[192795]: 2025-09-30 21:56:54.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:57 compute-1 nova_compute[192795]: 2025-09-30 21:56:57.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:59 compute-1 nova_compute[192795]: 2025-09-30 21:56:59.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:01.857 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:df:1d 10.100.0.2 2001:db8::f816:3eff:fef3:df1d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef3:df1d/64', 'neutron:device_id': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14f43e9d-ff95-45ca-8ef3-d794e65d228a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=54acfaa1-75af-46fc-b755-5aabbfb79138) old=Port_Binding(mac=['fa:16:3e:f3:df:1d 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:57:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:01.858 103861 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 54acfaa1-75af-46fc-b755-5aabbfb79138 in datapath b5e8390b-42ff-40d7-bb46-05b4a7f0a027 updated
Sep 30 21:57:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:01.859 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5e8390b-42ff-40d7-bb46-05b4a7f0a027, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:57:01 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:01.863 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[693da744-4e06-407d-ab93-ed42d9fbea99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:02 compute-1 nova_compute[192795]: 2025-09-30 21:57:02.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:02 compute-1 podman[253758]: 2025-09-30 21:57:02.228231429 +0000 UTC m=+0.072028924 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:57:03 compute-1 nova_compute[192795]: 2025-09-30 21:57:03.713 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:04 compute-1 nova_compute[192795]: 2025-09-30 21:57:04.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:07 compute-1 nova_compute[192795]: 2025-09-30 21:57:07.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:07 compute-1 podman[253780]: 2025-09-30 21:57:07.222261755 +0000 UTC m=+0.053462313 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:57:07 compute-1 podman[253779]: 2025-09-30 21:57:07.229483365 +0000 UTC m=+0.064363020 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Sep 30 21:57:07 compute-1 podman[253781]: 2025-09-30 21:57:07.240117365 +0000 UTC m=+0.061525364 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:57:07 compute-1 nova_compute[192795]: 2025-09-30 21:57:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:07 compute-1 nova_compute[192795]: 2025-09-30 21:57:07.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:57:08 compute-1 nova_compute[192795]: 2025-09-30 21:57:08.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:09 compute-1 nova_compute[192795]: 2025-09-30 21:57:09.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.723 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.724 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.884 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.885 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5688MB free_disk=73.29682922363281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.885 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.886 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.969 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.970 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:57:10 compute-1 nova_compute[192795]: 2025-09-30 21:57:10.994 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.008 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.009 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.033 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.072 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.102 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.181 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.234 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.234 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.577 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.578 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.594 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.709 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.710 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.717 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.717 2 INFO nova.compute.claims [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Claim successful on node compute-1.ctlplane.example.com
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.836 2 DEBUG nova.compute.provider_tree [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.854 2 DEBUG nova.scheduler.client.report [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.879 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.880 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.931 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.932 2 DEBUG nova.network.neutron [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.956 2 INFO nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:57:11 compute-1 nova_compute[192795]: 2025-09-30 21:57:11.980 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.112 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.114 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.115 2 INFO nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Creating image(s)
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.115 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.116 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.117 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.139 2 DEBUG nova.policy [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.143 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.241 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.242 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.242 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.253 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.311 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.312 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.352 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.353 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.354 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.410 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.411 2 DEBUG nova.virt.disk.api [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.412 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.469 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.471 2 DEBUG nova.virt.disk.api [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.471 2 DEBUG nova.objects.instance [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 6e909358-f157-4878-9c45-00e8c263d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.494 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.494 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Ensure instance console log exists: /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.495 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.495 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.496 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:12 compute-1 nova_compute[192795]: 2025-09-30 21:57:12.851 2 DEBUG nova.network.neutron [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Successfully created port: c6151fb1-ab92-4620-9c1b-de715e136554 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:57:14 compute-1 nova_compute[192795]: 2025-09-30 21:57:14.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.209 2 DEBUG nova.network.neutron [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Successfully updated port: c6151fb1-ab92-4620-9c1b-de715e136554 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.228 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.228 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.229 2 DEBUG nova.network.neutron [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.235 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.236 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.334 2 DEBUG nova.compute.manager [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-changed-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.335 2 DEBUG nova.compute.manager [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Refreshing instance network info cache due to event network-changed-c6151fb1-ab92-4620-9c1b-de715e136554. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:57:16 compute-1 nova_compute[192795]: 2025-09-30 21:57:16.335 2 DEBUG oslo_concurrency.lockutils [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:57:17 compute-1 nova_compute[192795]: 2025-09-30 21:57:17.108 2 DEBUG nova.network.neutron [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:57:17 compute-1 nova_compute[192795]: 2025-09-30 21:57:17.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:17 compute-1 podman[253855]: 2025-09-30 21:57:17.227046627 +0000 UTC m=+0.069203868 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:57:17 compute-1 nova_compute[192795]: 2025-09-30 21:57:17.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.363 2 DEBUG nova.network.neutron [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updating instance_info_cache with network_info: [{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.391 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.392 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Instance network_info: |[{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.393 2 DEBUG oslo_concurrency.lockutils [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.393 2 DEBUG nova.network.neutron [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Refreshing network info cache for port c6151fb1-ab92-4620-9c1b-de715e136554 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.398 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Start _get_guest_xml network_info=[{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_options': None, 'encryption_format': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.407 2 WARNING nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.413 2 DEBUG nova.virt.libvirt.host [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.414 2 DEBUG nova.virt.libvirt.host [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.423 2 DEBUG nova.virt.libvirt.host [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.424 2 DEBUG nova.virt.libvirt.host [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.426 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.426 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.426 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.427 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.427 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.427 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.427 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.428 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.428 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.428 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.429 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.430 2 DEBUG nova.virt.hardware [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.434 2 DEBUG nova.virt.libvirt.vif [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-556145375',display_name='tempest-TestGettingAddress-server-556145375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-556145375',id=187,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAUT2tGw/YfTDpIGp84ZfbyD7IpRQ7QUJyyF+bEV3JHk/9CkDID8KoS9k6snQG6VCxf8dtLMx/iInIy7Iaf6LMNj8QmQ0p01A6OrJ5TUZj/qleflyrwQYPj6f1blncJ8qQ==',key_name='tempest-TestGettingAddress-822226956',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-gs7xl9uc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:57:12Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=6e909358-f157-4878-9c45-00e8c263d2d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.434 2 DEBUG nova.network.os_vif_util [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.435 2 DEBUG nova.network.os_vif_util [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:4e:27,bridge_name='br-int',has_traffic_filtering=True,id=c6151fb1-ab92-4620-9c1b-de715e136554,network=Network(b5e8390b-42ff-40d7-bb46-05b4a7f0a027),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6151fb1-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.436 2 DEBUG nova.objects.instance [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e909358-f157-4878-9c45-00e8c263d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.452 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <uuid>6e909358-f157-4878-9c45-00e8c263d2d3</uuid>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <name>instance-000000bb</name>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <memory>131072</memory>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <vcpu>1</vcpu>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <metadata>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <nova:name>tempest-TestGettingAddress-server-556145375</nova:name>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <nova:creationTime>2025-09-30 21:57:19</nova:creationTime>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <nova:flavor name="m1.nano">
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:memory>128</nova:memory>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:disk>1</nova:disk>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:swap>0</nova:swap>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       </nova:flavor>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <nova:owner>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       </nova:owner>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <nova:ports>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         <nova:port uuid="c6151fb1-ab92-4620-9c1b-de715e136554">
Sep 30 21:57:19 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe94:4e27" ipVersion="6"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:         </nova:port>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       </nova:ports>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </nova:instance>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   </metadata>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <sysinfo type="smbios">
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <system>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <entry name="serial">6e909358-f157-4878-9c45-00e8c263d2d3</entry>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <entry name="uuid">6e909358-f157-4878-9c45-00e8c263d2d3</entry>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </system>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   </sysinfo>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <os>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <boot dev="hd"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <smbios mode="sysinfo"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   </os>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <features>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <acpi/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <apic/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <vmcoreinfo/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   </features>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <clock offset="utc">
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <timer name="hpet" present="no"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   </clock>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <cpu mode="custom" match="exact">
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <model>Nehalem</model>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   </cpu>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   <devices>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <disk type="file" device="disk">
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <target dev="vda" bus="virtio"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <disk type="file" device="cdrom">
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <source file="/var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk.config"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <target dev="sda" bus="sata"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </disk>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <interface type="ethernet">
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <mac address="fa:16:3e:94:4e:27"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <mtu size="1442"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <target dev="tapc6151fb1-ab"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </interface>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <serial type="pty">
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <log file="/var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/console.log" append="off"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </serial>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <video>
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <model type="virtio"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </video>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <input type="tablet" bus="usb"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <rng model="virtio">
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </rng>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <controller type="usb" index="0"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     <memballoon model="virtio">
Sep 30 21:57:19 compute-1 nova_compute[192795]:       <stats period="10"/>
Sep 30 21:57:19 compute-1 nova_compute[192795]:     </memballoon>
Sep 30 21:57:19 compute-1 nova_compute[192795]:   </devices>
Sep 30 21:57:19 compute-1 nova_compute[192795]: </domain>
Sep 30 21:57:19 compute-1 nova_compute[192795]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.454 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Preparing to wait for external event network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.455 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.455 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.455 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.457 2 DEBUG nova.virt.libvirt.vif [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-556145375',display_name='tempest-TestGettingAddress-server-556145375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-556145375',id=187,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAUT2tGw/YfTDpIGp84ZfbyD7IpRQ7QUJyyF+bEV3JHk/9CkDID8KoS9k6snQG6VCxf8dtLMx/iInIy7Iaf6LMNj8QmQ0p01A6OrJ5TUZj/qleflyrwQYPj6f1blncJ8qQ==',key_name='tempest-TestGettingAddress-822226956',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-gs7xl9uc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:57:12Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=6e909358-f157-4878-9c45-00e8c263d2d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.457 2 DEBUG nova.network.os_vif_util [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.459 2 DEBUG nova.network.os_vif_util [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:4e:27,bridge_name='br-int',has_traffic_filtering=True,id=c6151fb1-ab92-4620-9c1b-de715e136554,network=Network(b5e8390b-42ff-40d7-bb46-05b4a7f0a027),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6151fb1-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.459 2 DEBUG os_vif [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:4e:27,bridge_name='br-int',has_traffic_filtering=True,id=c6151fb1-ab92-4620-9c1b-de715e136554,network=Network(b5e8390b-42ff-40d7-bb46-05b4a7f0a027),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6151fb1-ab') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.462 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.466 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6151fb1-ab, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.467 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6151fb1-ab, col_values=(('external_ids', {'iface-id': 'c6151fb1-ab92-4620-9c1b-de715e136554', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:4e:27', 'vm-uuid': '6e909358-f157-4878-9c45-00e8c263d2d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:19 compute-1 NetworkManager[51724]: <info>  [1759269439.5248] manager: (tapc6151fb1-ab): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/392)
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.533 2 INFO os_vif [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:4e:27,bridge_name='br-int',has_traffic_filtering=True,id=c6151fb1-ab92-4620-9c1b-de715e136554,network=Network(b5e8390b-42ff-40d7-bb46-05b4a7f0a027),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6151fb1-ab')
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.586 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.586 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.587 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:94:4e:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:57:19 compute-1 nova_compute[192795]: 2025-09-30 21:57:19.588 2 INFO nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Using config drive
Sep 30 21:57:20 compute-1 nova_compute[192795]: 2025-09-30 21:57:20.460 2 INFO nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Creating config drive at /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk.config
Sep 30 21:57:20 compute-1 nova_compute[192795]: 2025-09-30 21:57:20.465 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo_02i4re execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:57:20 compute-1 nova_compute[192795]: 2025-09-30 21:57:20.596 2 DEBUG oslo_concurrency.processutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo_02i4re" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:57:20 compute-1 kernel: tapc6151fb1-ab: entered promiscuous mode
Sep 30 21:57:20 compute-1 NetworkManager[51724]: <info>  [1759269440.6858] manager: (tapc6151fb1-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/393)
Sep 30 21:57:20 compute-1 ovn_controller[94902]: 2025-09-30T21:57:20Z|00784|binding|INFO|Claiming lport c6151fb1-ab92-4620-9c1b-de715e136554 for this chassis.
Sep 30 21:57:20 compute-1 ovn_controller[94902]: 2025-09-30T21:57:20Z|00785|binding|INFO|c6151fb1-ab92-4620-9c1b-de715e136554: Claiming fa:16:3e:94:4e:27 10.100.0.7 2001:db8::f816:3eff:fe94:4e27
Sep 30 21:57:20 compute-1 nova_compute[192795]: 2025-09-30 21:57:20.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.744 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:4e:27 10.100.0.7 2001:db8::f816:3eff:fe94:4e27'], port_security=['fa:16:3e:94:4e:27 10.100.0.7 2001:db8::f816:3eff:fe94:4e27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8::f816:3eff:fe94:4e27/64', 'neutron:device_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4503c00f-33ef-43f3-bcf5-155f3209f13f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14f43e9d-ff95-45ca-8ef3-d794e65d228a, chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c6151fb1-ab92-4620-9c1b-de715e136554) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.746 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c6151fb1-ab92-4620-9c1b-de715e136554 in datapath b5e8390b-42ff-40d7-bb46-05b4a7f0a027 bound to our chassis
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.748 103861 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5e8390b-42ff-40d7-bb46-05b4a7f0a027
Sep 30 21:57:20 compute-1 systemd-udevd[253893]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.766 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[924154f7-a516-40b6-961a-6754ff40dc09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.767 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5e8390b-41 in ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.770 220603 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5e8390b-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.770 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[65dfc3b7-e78b-40f8-aca0-36070dc53d3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 systemd-machined[152783]: New machine qemu-87-instance-000000bb.
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.771 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[008d3d73-9d0b-4654-852a-48007d2fee1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 NetworkManager[51724]: <info>  [1759269440.7794] device (tapc6151fb1-ab): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:57:20 compute-1 NetworkManager[51724]: <info>  [1759269440.7805] device (tapc6151fb1-ab): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.784 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa274b1-a059-4d0a-9a19-55cd1ddc21ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 systemd[1]: Started Virtual Machine qemu-87-instance-000000bb.
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.814 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1be7ee-d227-4682-b1cc-7d28b9e56897]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 nova_compute[192795]: 2025-09-30 21:57:20.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:20 compute-1 ovn_controller[94902]: 2025-09-30T21:57:20Z|00786|binding|INFO|Setting lport c6151fb1-ab92-4620-9c1b-de715e136554 ovn-installed in OVS
Sep 30 21:57:20 compute-1 ovn_controller[94902]: 2025-09-30T21:57:20Z|00787|binding|INFO|Setting lport c6151fb1-ab92-4620-9c1b-de715e136554 up in Southbound
Sep 30 21:57:20 compute-1 nova_compute[192795]: 2025-09-30 21:57:20.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.855 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cc88ab-5ba1-49ed-947e-6aff24bfc05b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 NetworkManager[51724]: <info>  [1759269440.8626] manager: (tapb5e8390b-40): new Veth device (/org/freedesktop/NetworkManager/Devices/394)
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.861 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[782ea38c-e9a1-4ec9-b735-3ee726d5645f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.902 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbbcb35-3c35-44b1-808e-a771acdd83e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.906 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe49725-6d50-477e-8d57-9bbea25b6bdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 NetworkManager[51724]: <info>  [1759269440.9303] device (tapb5e8390b-40): carrier: link connected
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.938 220668 DEBUG oslo.privsep.daemon [-] privsep: reply[4911ad57-c12b-4add-8b21-cad76d608e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.961 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[0772d4d6-8ec9-46bd-bed1-55de1ee50e9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5e8390b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:df:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613660, 'reachable_time': 30411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253927, 'error': None, 'target': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:20 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:20.981 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[7d2b550b-60c4-43d4-944b-5d19279818fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef3:df1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613660, 'tstamp': 613660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253928, 'error': None, 'target': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.011 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7cb3d0-8ea4-4f5e-808e-aeb1ab738ec3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5e8390b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:df:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613660, 'reachable_time': 30411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253929, 'error': None, 'target': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.046 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[5c73e067-53ab-4e91-8e37-b4c075b66d22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.133 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cd2883-8a07-4a1b-8d22-7b08cf33b792]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.134 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5e8390b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.135 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.135 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5e8390b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:21 compute-1 kernel: tapb5e8390b-40: entered promiscuous mode
Sep 30 21:57:21 compute-1 NetworkManager[51724]: <info>  [1759269441.1383] manager: (tapb5e8390b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/395)
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.141 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5e8390b-40, col_values=(('external_ids', {'iface-id': '54acfaa1-75af-46fc-b755-5aabbfb79138'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:21 compute-1 ovn_controller[94902]: 2025-09-30T21:57:21Z|00788|binding|INFO|Releasing lport 54acfaa1-75af-46fc-b755-5aabbfb79138 from this chassis (sb_readonly=0)
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.158 103861 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5e8390b-42ff-40d7-bb46-05b4a7f0a027.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5e8390b-42ff-40d7-bb46-05b4a7f0a027.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.160 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9aef121b-abaa-43b5-b5cc-2bc0cd1756db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.161 103861 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: global
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     log         /dev/log local0 debug
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     log-tag     haproxy-metadata-proxy-b5e8390b-42ff-40d7-bb46-05b4a7f0a027
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     user        root
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     group       root
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     maxconn     1024
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     pidfile     /var/lib/neutron/external/pids/b5e8390b-42ff-40d7-bb46-05b4a7f0a027.pid.haproxy
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     daemon
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: defaults
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     log global
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     mode http
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     option httplog
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     option dontlognull
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     option http-server-close
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     option forwardfor
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     retries                 3
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     timeout http-request    30s
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     timeout connect         30s
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     timeout client          32s
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     timeout server          32s
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     timeout http-keep-alive 30s
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: listen listener
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     bind 169.254.169.254:80
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:     http-request add-header X-OVN-Network-ID b5e8390b-42ff-40d7-bb46-05b4a7f0a027
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.161 103861 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'env', 'PROCESS_TAG=haproxy-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5e8390b-42ff-40d7-bb46-05b4a7f0a027.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.417 2 DEBUG nova.compute.manager [req-1dcba1f2-ae56-4e9f-b462-5cd06af50db1 req-d3cddafe-ada1-4bf1-a930-c97513309277 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.418 2 DEBUG oslo_concurrency.lockutils [req-1dcba1f2-ae56-4e9f-b462-5cd06af50db1 req-d3cddafe-ada1-4bf1-a930-c97513309277 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.419 2 DEBUG oslo_concurrency.lockutils [req-1dcba1f2-ae56-4e9f-b462-5cd06af50db1 req-d3cddafe-ada1-4bf1-a930-c97513309277 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.419 2 DEBUG oslo_concurrency.lockutils [req-1dcba1f2-ae56-4e9f-b462-5cd06af50db1 req-d3cddafe-ada1-4bf1-a930-c97513309277 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.419 2 DEBUG nova.compute.manager [req-1dcba1f2-ae56-4e9f-b462-5cd06af50db1 req-d3cddafe-ada1-4bf1-a930-c97513309277 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Processing event network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.535 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:57:21 compute-1 podman[253968]: 2025-09-30 21:57:21.582147992 +0000 UTC m=+0.078651347 container create 3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:57:21 compute-1 podman[253968]: 2025-09-30 21:57:21.530210441 +0000 UTC m=+0.026713846 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:57:21 compute-1 systemd[1]: Started libpod-conmon-3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade.scope.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.639 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.643 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269441.6387498, 6e909358-f157-4878-9c45-00e8c263d2d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:57:21 compute-1 systemd[1]: Started libcrun container.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.644 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] VM Started (Lifecycle Event)
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.647 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:57:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd8849ca82d5c347ff865c8d43192b81e3ed87d3f6ad18405e8b55a2cd1af64b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.651 2 INFO nova.virt.libvirt.driver [-] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Instance spawned successfully.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.652 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:57:21 compute-1 podman[253968]: 2025-09-30 21:57:21.663844529 +0000 UTC m=+0.160347914 container init 3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.667 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:57:21 compute-1 podman[253968]: 2025-09-30 21:57:21.671974074 +0000 UTC m=+0.168477419 container start 3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.678 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.682 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.682 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.682 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.683 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.683 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.683 2 DEBUG nova.virt.libvirt.driver [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:57:21 compute-1 neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027[253982]: [NOTICE]   (253986) : New worker (253988) forked
Sep 30 21:57:21 compute-1 neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027[253982]: [NOTICE]   (253986) : Loading success.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.710 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.711 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269441.6389084, 6e909358-f157-4878-9c45-00e8c263d2d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.711 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] VM Paused (Lifecycle Event)
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.732 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.736 2 DEBUG nova.virt.driver [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] Emitting event <LifecycleEvent: 1759269441.6443546, 6e909358-f157-4878-9c45-00e8c263d2d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.737 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] VM Resumed (Lifecycle Event)
Sep 30 21:57:21 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:21.745 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.748 2 INFO nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Took 9.64 seconds to spawn the instance on the hypervisor.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.749 2 DEBUG nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.752 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.757 2 DEBUG nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.786 2 INFO nova.compute.manager [None req-71b5620b-1476-4b6c-8752-3556840c4868 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.825 2 INFO nova.compute.manager [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Took 10.16 seconds to build instance.
Sep 30 21:57:21 compute-1 nova_compute[192795]: 2025-09-30 21:57:21.862 2 DEBUG oslo_concurrency.lockutils [None req-ae9d4b9e-5fdb-4f4d-b4bc-52bf2728f707 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:22 compute-1 nova_compute[192795]: 2025-09-30 21:57:22.266 2 DEBUG nova.network.neutron [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updated VIF entry in instance network info cache for port c6151fb1-ab92-4620-9c1b-de715e136554. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:57:22 compute-1 nova_compute[192795]: 2025-09-30 21:57:22.267 2 DEBUG nova.network.neutron [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updating instance_info_cache with network_info: [{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:57:22 compute-1 nova_compute[192795]: 2025-09-30 21:57:22.284 2 DEBUG oslo_concurrency.lockutils [req-538cc0b7-6965-424b-898c-ef49eed19371 req-5bb30390-3b03-4882-bd0e-0758eda46dce dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:57:22 compute-1 nova_compute[192795]: 2025-09-30 21:57:22.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:23 compute-1 podman[253999]: 2025-09-30 21:57:23.233359512 +0000 UTC m=+0.071505109 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:57:23 compute-1 podman[253997]: 2025-09-30 21:57:23.245752069 +0000 UTC m=+0.083710402 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:57:23 compute-1 podman[253998]: 2025-09-30 21:57:23.288664982 +0000 UTC m=+0.128632447 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:57:23 compute-1 nova_compute[192795]: 2025-09-30 21:57:23.511 2 DEBUG nova.compute.manager [req-d5bace57-b3ee-43e0-8a50-85580cc62eb1 req-0117a08d-46f1-4f5c-bb2b-07de34ca4523 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:57:23 compute-1 nova_compute[192795]: 2025-09-30 21:57:23.511 2 DEBUG oslo_concurrency.lockutils [req-d5bace57-b3ee-43e0-8a50-85580cc62eb1 req-0117a08d-46f1-4f5c-bb2b-07de34ca4523 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:23 compute-1 nova_compute[192795]: 2025-09-30 21:57:23.512 2 DEBUG oslo_concurrency.lockutils [req-d5bace57-b3ee-43e0-8a50-85580cc62eb1 req-0117a08d-46f1-4f5c-bb2b-07de34ca4523 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:23 compute-1 nova_compute[192795]: 2025-09-30 21:57:23.512 2 DEBUG oslo_concurrency.lockutils [req-d5bace57-b3ee-43e0-8a50-85580cc62eb1 req-0117a08d-46f1-4f5c-bb2b-07de34ca4523 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:23 compute-1 nova_compute[192795]: 2025-09-30 21:57:23.512 2 DEBUG nova.compute.manager [req-d5bace57-b3ee-43e0-8a50-85580cc62eb1 req-0117a08d-46f1-4f5c-bb2b-07de34ca4523 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] No waiting events found dispatching network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:57:23 compute-1 nova_compute[192795]: 2025-09-30 21:57:23.512 2 WARNING nova.compute.manager [req-d5bace57-b3ee-43e0-8a50-85580cc62eb1 req-0117a08d-46f1-4f5c-bb2b-07de34ca4523 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received unexpected event network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 for instance with vm_state active and task_state None.
Sep 30 21:57:24 compute-1 nova_compute[192795]: 2025-09-30 21:57:24.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:24 compute-1 nova_compute[192795]: 2025-09-30 21:57:24.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:24 compute-1 ovn_controller[94902]: 2025-09-30T21:57:24Z|00789|binding|INFO|Releasing lport 54acfaa1-75af-46fc-b755-5aabbfb79138 from this chassis (sb_readonly=0)
Sep 30 21:57:24 compute-1 nova_compute[192795]: 2025-09-30 21:57:24.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:24 compute-1 NetworkManager[51724]: <info>  [1759269444.9248] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/396)
Sep 30 21:57:24 compute-1 NetworkManager[51724]: <info>  [1759269444.9256] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Sep 30 21:57:24 compute-1 ovn_controller[94902]: 2025-09-30T21:57:24Z|00790|binding|INFO|Releasing lport 54acfaa1-75af-46fc-b755-5aabbfb79138 from this chassis (sb_readonly=0)
Sep 30 21:57:24 compute-1 nova_compute[192795]: 2025-09-30 21:57:24.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:24 compute-1 nova_compute[192795]: 2025-09-30 21:57:24.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.414 2 DEBUG nova.compute.manager [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-changed-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.414 2 DEBUG nova.compute.manager [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Refreshing instance network info cache due to event network-changed-c6151fb1-ab92-4620-9c1b-de715e136554. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.415 2 DEBUG oslo_concurrency.lockutils [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.415 2 DEBUG oslo_concurrency.lockutils [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.415 2 DEBUG nova.network.neutron [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Refreshing network info cache for port c6151fb1-ab92-4620-9c1b-de715e136554 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:57:25 compute-1 nova_compute[192795]: 2025-09-30 21:57:25.882 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:57:26 compute-1 nova_compute[192795]: 2025-09-30 21:57:26.769 2 DEBUG nova.network.neutron [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updated VIF entry in instance network info cache for port c6151fb1-ab92-4620-9c1b-de715e136554. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:57:26 compute-1 nova_compute[192795]: 2025-09-30 21:57:26.769 2 DEBUG nova.network.neutron [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updating instance_info_cache with network_info: [{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:57:26 compute-1 nova_compute[192795]: 2025-09-30 21:57:26.790 2 DEBUG oslo_concurrency.lockutils [req-80e1f6e6-1be1-494e-8a33-a2dcc9596459 req-bf58a592-b700-4e3f-a648-9306da119c8f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:57:26 compute-1 nova_compute[192795]: 2025-09-30 21:57:26.791 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:57:26 compute-1 nova_compute[192795]: 2025-09-30 21:57:26.791 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:57:26 compute-1 nova_compute[192795]: 2025-09-30 21:57:26.791 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6e909358-f157-4878-9c45-00e8c263d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:57:29 compute-1 nova_compute[192795]: 2025-09-30 21:57:29.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:29 compute-1 nova_compute[192795]: 2025-09-30 21:57:29.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:29 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:29.748 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:31 compute-1 nova_compute[192795]: 2025-09-30 21:57:31.347 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updating instance_info_cache with network_info: [{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:57:31 compute-1 nova_compute[192795]: 2025-09-30 21:57:31.378 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:57:31 compute-1 nova_compute[192795]: 2025-09-30 21:57:31.378 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:57:33 compute-1 podman[254081]: 2025-09-30 21:57:33.257178316 +0000 UTC m=+0.082989962 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:57:34 compute-1 nova_compute[192795]: 2025-09-30 21:57:34.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:57:34 compute-1 nova_compute[192795]: 2025-09-30 21:57:34.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:57:34 compute-1 nova_compute[192795]: 2025-09-30 21:57:34.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:57:34 compute-1 nova_compute[192795]: 2025-09-30 21:57:34.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:57:34 compute-1 nova_compute[192795]: 2025-09-30 21:57:34.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:34 compute-1 nova_compute[192795]: 2025-09-30 21:57:34.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:57:35 compute-1 ovn_controller[94902]: 2025-09-30T21:57:35Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:94:4e:27 10.100.0.7
Sep 30 21:57:35 compute-1 ovn_controller[94902]: 2025-09-30T21:57:35Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:94:4e:27 10.100.0.7
Sep 30 21:57:38 compute-1 podman[254103]: 2025-09-30 21:57:38.210388135 +0000 UTC m=+0.048707437 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:57:38 compute-1 podman[254102]: 2025-09-30 21:57:38.21320424 +0000 UTC m=+0.055161368 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:57:38 compute-1 podman[254101]: 2025-09-30 21:57:38.219611108 +0000 UTC m=+0.065024157 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Sep 30 21:57:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:38.718 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:38.719 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:57:38.719 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:39 compute-1 nova_compute[192795]: 2025-09-30 21:57:39.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:44 compute-1 nova_compute[192795]: 2025-09-30 21:57:44.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:48 compute-1 podman[254162]: 2025-09-30 21:57:48.253356976 +0000 UTC m=+0.090317306 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:57:49 compute-1 nova_compute[192795]: 2025-09-30 21:57:49.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:57:49 compute-1 nova_compute[192795]: 2025-09-30 21:57:49.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:49 compute-1 nova_compute[192795]: 2025-09-30 21:57:49.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:57:49 compute-1 nova_compute[192795]: 2025-09-30 21:57:49.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:57:49 compute-1 nova_compute[192795]: 2025-09-30 21:57:49.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:57:49 compute-1 nova_compute[192795]: 2025-09-30 21:57:49.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:54 compute-1 podman[254182]: 2025-09-30 21:57:54.230313303 +0000 UTC m=+0.070713857 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:57:54 compute-1 podman[254184]: 2025-09-30 21:57:54.235404128 +0000 UTC m=+0.061956707 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:57:54 compute-1 podman[254183]: 2025-09-30 21:57:54.270159236 +0000 UTC m=+0.099534969 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 21:57:54 compute-1 nova_compute[192795]: 2025-09-30 21:57:54.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:54 compute-1 ovn_controller[94902]: 2025-09-30T21:57:54Z|00791|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 21:57:59 compute-1 nova_compute[192795]: 2025-09-30 21:57:59.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:57:59 compute-1 nova_compute[192795]: 2025-09-30 21:57:59.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:04 compute-1 podman[254252]: 2025-09-30 21:58:04.247361508 +0000 UTC m=+0.086849164 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:58:04 compute-1 nova_compute[192795]: 2025-09-30 21:58:04.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:07 compute-1 nova_compute[192795]: 2025-09-30 21:58:07.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:07 compute-1 nova_compute[192795]: 2025-09-30 21:58:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:07 compute-1 nova_compute[192795]: 2025-09-30 21:58:07.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:58:08 compute-1 nova_compute[192795]: 2025-09-30 21:58:08.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:09 compute-1 podman[254273]: 2025-09-30 21:58:09.224818037 +0000 UTC m=+0.071327364 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Sep 30 21:58:09 compute-1 podman[254274]: 2025-09-30 21:58:09.238717204 +0000 UTC m=+0.070784480 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:58:09 compute-1 podman[254275]: 2025-09-30 21:58:09.249120908 +0000 UTC m=+0.087392458 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 21:58:09 compute-1 nova_compute[192795]: 2025-09-30 21:58:09.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:09 compute-1 nova_compute[192795]: 2025-09-30 21:58:09.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:09 compute-1 nova_compute[192795]: 2025-09-30 21:58:09.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:58:09 compute-1 nova_compute[192795]: 2025-09-30 21:58:09.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:58:09 compute-1 nova_compute[192795]: 2025-09-30 21:58:09.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:58:09 compute-1 nova_compute[192795]: 2025-09-30 21:58:09.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:10 compute-1 sshd-session[254337]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 21:58:10 compute-1 sshd-session[254337]: Connection reset by 45.140.17.97 port 56568
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.731 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.731 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.731 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.732 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.838 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.936 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:58:11 compute-1 nova_compute[192795]: 2025-09-30 21:58:11.938 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.028 2 DEBUG oslo_concurrency.processutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.218 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.219 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5520MB free_disk=73.26776123046875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.219 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.220 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:12.226 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:58:12 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:12.227 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.378 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Instance 6e909358-f157-4878-9c45-00e8c263d2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.379 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.379 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.425 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.443 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.471 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:58:12 compute-1 nova_compute[192795]: 2025-09-30 21:58:12.472 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:14 compute-1 nova_compute[192795]: 2025-09-30 21:58:14.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:16 compute-1 nova_compute[192795]: 2025-09-30 21:58:16.472 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:17 compute-1 nova_compute[192795]: 2025-09-30 21:58:17.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:18 compute-1 nova_compute[192795]: 2025-09-30 21:58:18.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:19 compute-1 podman[254345]: 2025-09-30 21:58:19.23716549 +0000 UTC m=+0.074814187 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:58:19 compute-1 nova_compute[192795]: 2025-09-30 21:58:19.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:19 compute-1 nova_compute[192795]: 2025-09-30 21:58:19.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:19 compute-1 nova_compute[192795]: 2025-09-30 21:58:19.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 21:58:19 compute-1 nova_compute[192795]: 2025-09-30 21:58:19.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:58:19 compute-1 nova_compute[192795]: 2025-09-30 21:58:19.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:19 compute-1 nova_compute[192795]: 2025-09-30 21:58:19.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:58:22 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:22.229 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:58:24 compute-1 nova_compute[192795]: 2025-09-30 21:58:24.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:24 compute-1 nova_compute[192795]: 2025-09-30 21:58:24.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:24 compute-1 nova_compute[192795]: 2025-09-30 21:58:24.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:25 compute-1 podman[254368]: 2025-09-30 21:58:25.255231106 +0000 UTC m=+0.071388746 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:58:25 compute-1 podman[254366]: 2025-09-30 21:58:25.258478512 +0000 UTC m=+0.084407940 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:58:25 compute-1 podman[254367]: 2025-09-30 21:58:25.287056247 +0000 UTC m=+0.115374908 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:58:27 compute-1 nova_compute[192795]: 2025-09-30 21:58:27.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:27 compute-1 nova_compute[192795]: 2025-09-30 21:58:27.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:58:27 compute-1 nova_compute[192795]: 2025-09-30 21:58:27.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:58:27 compute-1 nova_compute[192795]: 2025-09-30 21:58:27.960 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:58:27 compute-1 nova_compute[192795]: 2025-09-30 21:58:27.961 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquired lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:58:27 compute-1 nova_compute[192795]: 2025-09-30 21:58:27.961 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:58:27 compute-1 nova_compute[192795]: 2025-09-30 21:58:27.962 2 DEBUG nova.objects.instance [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6e909358-f157-4878-9c45-00e8c263d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:58:29 compute-1 nova_compute[192795]: 2025-09-30 21:58:29.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:29 compute-1 nova_compute[192795]: 2025-09-30 21:58:29.902 2 DEBUG nova.network.neutron [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updating instance_info_cache with network_info: [{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:58:29 compute-1 nova_compute[192795]: 2025-09-30 21:58:29.924 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Releasing lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:58:29 compute-1 nova_compute[192795]: 2025-09-30 21:58:29.924 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:58:30 compute-1 nova_compute[192795]: 2025-09-30 21:58:30.919 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:34 compute-1 nova_compute[192795]: 2025-09-30 21:58:34.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:58:35 compute-1 podman[254435]: 2025-09-30 21:58:35.239138618 +0000 UTC m=+0.073745108 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:58:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:38.722 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:38.724 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:38.725 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:39 compute-1 nova_compute[192795]: 2025-09-30 21:58:39.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:40 compute-1 podman[254470]: 2025-09-30 21:58:40.243177805 +0000 UTC m=+0.075394591 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6)
Sep 30 21:58:40 compute-1 podman[254472]: 2025-09-30 21:58:40.258235263 +0000 UTC m=+0.070248106 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent)
Sep 30 21:58:40 compute-1 podman[254471]: 2025-09-30 21:58:40.305550382 +0000 UTC m=+0.121906180 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:58:43 compute-1 nova_compute[192795]: 2025-09-30 21:58:43.876 2 DEBUG nova.compute.manager [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-changed-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:58:43 compute-1 nova_compute[192795]: 2025-09-30 21:58:43.876 2 DEBUG nova.compute.manager [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Refreshing instance network info cache due to event network-changed-c6151fb1-ab92-4620-9c1b-de715e136554. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:58:43 compute-1 nova_compute[192795]: 2025-09-30 21:58:43.876 2 DEBUG oslo_concurrency.lockutils [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:58:43 compute-1 nova_compute[192795]: 2025-09-30 21:58:43.877 2 DEBUG oslo_concurrency.lockutils [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:58:43 compute-1 nova_compute[192795]: 2025-09-30 21:58:43.877 2 DEBUG nova.network.neutron [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Refreshing network info cache for port c6151fb1-ab92-4620-9c1b-de715e136554 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.040 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'name': 'tempest-TestGettingAddress-server-556145375', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000bb', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'hostId': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.041 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.047 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.048 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.048 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.049 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.049 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.064 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/memory.usage volume: 42.85546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '932be27c-86b4-46f2-be60-990c39c81db6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.85546875, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'timestamp': '2025-09-30T21:58:44.041926', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'instance-000000bb', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a1f73c1a-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.797333255, 'message_signature': 'c3aa96d40507e2a313b7e17ef4ab7e789b673018ce4ebf639e123bc8d5d4c257'}]}, 'timestamp': '2025-09-30 21:58:44.065644', '_unique_id': '5c537776c3ba4d5e8956ad44c29d860b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.067 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.068 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.069 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.069 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>]
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.069 2 INFO nova.compute.manager [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Terminating instance
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.072 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6e909358-f157-4878-9c45-00e8c263d2d3 / tapc6151fb1-ab inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.072 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.incoming.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5874bc4c-9e57-489f-b2fa-991b7d00386a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.069601', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a1f867e8-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': '72f2459bba3dc1d0e651fad5ceda5ad85aab072c819359e7f1f4d3457452153e'}]}, 'timestamp': '2025-09-30 21:58:44.073364', '_unique_id': '937fc2e6dfef44a99dbb825c7fa2ff29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.074 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.076 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.076 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>]
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.076 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1cf8778-9cc1-454c-b907-d41337916d88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.076832', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a1f9059a-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': 'c79e4dd0e17ed6c37e974624ddc9be56163a8e8734f3b0b5a846f5053aaecc52'}]}, 'timestamp': '2025-09-30 21:58:44.077281', '_unique_id': 'cd48c487560e4f78990e8d4cd83b903c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.078 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.080 2 DEBUG nova.compute.manager [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:58:44 compute-1 kernel: tapc6151fb1-ab (unregistering): left promiscuous mode
Sep 30 21:58:44 compute-1 NetworkManager[51724]: <info>  [1759269524.1067] device (tapc6151fb1-ab): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 ovn_controller[94902]: 2025-09-30T21:58:44Z|00792|binding|INFO|Releasing lport c6151fb1-ab92-4620-9c1b-de715e136554 from this chassis (sb_readonly=0)
Sep 30 21:58:44 compute-1 ovn_controller[94902]: 2025-09-30T21:58:44Z|00793|binding|INFO|Setting lport c6151fb1-ab92-4620-9c1b-de715e136554 down in Southbound
Sep 30 21:58:44 compute-1 ovn_controller[94902]: 2025-09-30T21:58:44Z|00794|binding|INFO|Removing iface tapc6151fb1-ab ovn-installed in OVS
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.134 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:4e:27 10.100.0.7 2001:db8::f816:3eff:fe94:4e27'], port_security=['fa:16:3e:94:4e:27 10.100.0.7 2001:db8::f816:3eff:fe94:4e27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8::f816:3eff:fe94:4e27/64', 'neutron:device_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4503c00f-33ef-43f3-bcf5-155f3209f13f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14f43e9d-ff95-45ca-8ef3-d794e65d228a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>], logical_port=c6151fb1-ab92-4620-9c1b-de715e136554) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f966f0e19d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.135 103861 INFO neutron.agent.ovn.metadata.agent [-] Port c6151fb1-ab92-4620-9c1b-de715e136554 in datapath b5e8390b-42ff-40d7-bb46-05b4a7f0a027 unbound from our chassis
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.137 103861 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5e8390b-42ff-40d7-bb46-05b4a7f0a027, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.139 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1f2b71-c665-400a-86ff-faa9947b22b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.140 103861 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027 namespace which is not needed anymore
Sep 30 21:58:44 compute-1 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Sep 30 21:58:44 compute-1 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000bb.scope: Consumed 16.336s CPU time.
Sep 30 21:58:44 compute-1 systemd-machined[152783]: Machine qemu-87-instance-000000bb terminated.
Sep 30 21:58:44 compute-1 virtqemud[192217]: Unable to read from monitor: Connection reset by peer
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: libvirt: QEMU Driver error : Unable to read from monitor: Connection reset by peer
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.290 12 WARNING ceilometer.compute.virt.libvirt.inspector [-] Error from libvirt while checking blockStats, This may not be harmful, but please check : Unable to read from monitor: Connection reset by peer: libvirt.libvirtError: Unable to read from monitor: Connection reset by peer
Sep 30 21:58:44 compute-1 neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027[253982]: [NOTICE]   (253986) : haproxy version is 2.8.14-c23fe91
Sep 30 21:58:44 compute-1 neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027[253982]: [NOTICE]   (253986) : path to executable is /usr/sbin/haproxy
Sep 30 21:58:44 compute-1 neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027[253982]: [WARNING]  (253986) : Exiting Master process...
Sep 30 21:58:44 compute-1 neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027[253982]: [ALERT]    (253986) : Current worker (253988) exited with code 143 (Terminated)
Sep 30 21:58:44 compute-1 neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027[253982]: [WARNING]  (253986) : All workers exited. Exiting... (0)
Sep 30 21:58:44 compute-1 systemd[1]: libpod-3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade.scope: Deactivated successfully.
Sep 30 21:58:44 compute-1 podman[254559]: 2025-09-30 21:58:44.335709726 +0000 UTC m=+0.069855645 container died 3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:58:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade-userdata-shm.mount: Deactivated successfully.
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: libvirt: Domain Config error : Requested operation is not valid: domain is not running
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.369 12 WARNING ceilometer.compute.virt.libvirt.inspector [-] Error from libvirt while checking blockStats, This may not be harmful, but please check : Requested operation is not valid: domain is not running: libvirt.libvirtError: Requested operation is not valid: domain is not running
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.370 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.370 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/cpu volume: 12380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-cd8849ca82d5c347ff865c8d43192b81e3ed87d3f6ad18405e8b55a2cd1af64b-merged.mount: Deactivated successfully.
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17e76431-1fae-4c8e-8682-d1386c18e331', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12380000000, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'timestamp': '2025-09-30T21:58:44.370735', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'instance-000000bb', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a225ed80-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.797333255, 'message_signature': '4c46081338d574d5cc843e057935d6d57c7c1ef8c6cfdc20dad53d1198c20308'}]}, 'timestamp': '2025-09-30 21:58:44.371884', '_unique_id': 'c96495511b1c46d1882c2e5cfe593ec3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.373 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.376 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.379 12 DEBUG ceilometer.compute.pollsters [-] Instance 6e909358-f157-4878-9c45-00e8c263d2d3 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-000000bb, id=6e909358-f157-4878-9c45-00e8c263d2d3>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.379 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.379 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>]
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.380 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.380 12 DEBUG ceilometer.compute.pollsters [-] Instance 6e909358-f157-4878-9c45-00e8c263d2d3 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-000000bb, id=6e909358-f157-4878-9c45-00e8c263d2d3>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.381 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.381 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.379 2 INFO nova.virt.libvirt.driver [-] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Instance destroyed successfully.
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.380 2 DEBUG nova.objects.instance [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 6e909358-f157-4878-9c45-00e8c263d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fb9a8e0-158f-47c9-98cd-a342dfb27852', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.381093', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a227731c-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': 'ce60f8bbd9550f0711077ccee3932f94059991033f78da5dedd1df48d59eb48c'}]}, 'timestamp': '2025-09-30 21:58:44.381611', '_unique_id': 'fadd0ceeafe14d2c8deeae7c32c9110b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.383 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.incoming.bytes volume: 21836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76e9c34b-dbef-4af9-a7c4-7516659f2fa4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 21836, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.383930', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a227dbf4-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': 'edbd00ec3a0920102892ff595f127bfeb1824d337e0461a4166994daa09591cc'}]}, 'timestamp': '2025-09-30 21:58:44.384164', '_unique_id': '668daab970574355a25ec0bf8a989f76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.384 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '619cbb74-e26d-4f0d-91ef-785cbe1d6227', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.385225', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a2281114-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': 'a0941279264bd6e837d210d7c74689a2fb9cf19d105a45fe77b109122750c482'}]}, 'timestamp': '2025-09-30 21:58:44.385520', '_unique_id': '0bbf3258ae914b9f89d4c6be0a7d1995'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.385 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.387 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.387 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.387 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.387 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.outgoing.bytes volume: 19750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '495683d3-8362-4039-a1a9-0238b23f90b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19750, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.387398', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a2286344-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': '07c21cbda399c4949c515d3807a2d852e71809926078f5f827592da6a673ad5c'}]}, 'timestamp': '2025-09-30 21:58:44.387624', '_unique_id': '94d59956496244618e75e2f98b4a74de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-556145375>]
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.388 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 podman[254559]: 2025-09-30 21:58:44.390040401 +0000 UTC m=+0.124186330 container cleanup 3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54e66ebc-361a-4417-b7ff-3a53490bc9c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.389043', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a228a3ae-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': '3b218abc2242857177c4279ef46d594641d2d8dff67b40a66b48c9a0bf7c30a9'}]}, 'timestamp': '2025-09-30 21:58:44.389278', '_unique_id': 'e143c7cd11e94e809dcad4ea285e34b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.389 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.390 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.390 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.outgoing.packets volume: 127 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19dfaa2a-a038-49ab-b2eb-215f1f60e0c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 127, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.390677', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a228e42c-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': '5974a6453794e18b6cdeb3296c95f21c5100238f7b8be88256f497e0e4ba79fa'}]}, 'timestamp': '2025-09-30 21:58:44.390965', '_unique_id': 'dfa9e9de31e64b91a48198538d4680f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.391 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.392 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.392 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.393 12 DEBUG ceilometer.compute.pollsters [-] Instance 6e909358-f157-4878-9c45-00e8c263d2d3 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-000000bb, id=6e909358-f157-4878-9c45-00e8c263d2d3>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.394 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.394 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb8b6de0-9774-4755-8939-5dfdf64f0cc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.394176', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a2297216-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': '7218a553c3c9fd0e067c60f1939d459f88abc0e7ace0f11545db06f8baf2d713'}]}, 'timestamp': '2025-09-30 21:58:44.394614', '_unique_id': '9e95e5c1b433486a8059ae58c46e06ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.395 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.396 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.396 12 DEBUG ceilometer.compute.pollsters [-] 6e909358-f157-4878-9c45-00e8c263d2d3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c597686-fd0e-4457-835a-90a0731ba916', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-000000bb-6e909358-f157-4878-9c45-00e8c263d2d3-tapc6151fb1-ab', 'timestamp': '2025-09-30T21:58:44.396663', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-556145375', 'name': 'tapc6151fb1-ab', 'instance_id': '6e909358-f157-4878-9c45-00e8c263d2d3', 'instance_type': 'm1.nano', 'host': 'dc939b1dec339b5508b6792df14da4a9fd3652e856be1b5af51ecd62', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:4e:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6151fb1-ab'}, 'message_id': 'a229cf0e-9e48-11f0-984a-fa163e8033fc', 'monotonic_time': 6219.802398658, 'message_signature': 'e356c526f3e59b4939c62741dd999e751714aad728085ce6b24e8bb5aae766bf'}]}, 'timestamp': '2025-09-30 21:58:44.396969', '_unique_id': '80dd948512514b43919bb76390ba6d33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:58:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 21:58:44.397 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.400 2 DEBUG nova.virt.libvirt.vif [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-556145375',display_name='tempest-TestGettingAddress-server-556145375',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-556145375',id=187,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAUT2tGw/YfTDpIGp84ZfbyD7IpRQ7QUJyyF+bEV3JHk/9CkDID8KoS9k6snQG6VCxf8dtLMx/iInIy7Iaf6LMNj8QmQ0p01A6OrJ5TUZj/qleflyrwQYPj6f1blncJ8qQ==',key_name='tempest-TestGettingAddress-822226956',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:57:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-gs7xl9uc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:57:21Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=6e909358-f157-4878-9c45-00e8c263d2d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.401 2 DEBUG nova.network.os_vif_util [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.403 2 DEBUG nova.network.os_vif_util [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:4e:27,bridge_name='br-int',has_traffic_filtering=True,id=c6151fb1-ab92-4620-9c1b-de715e136554,network=Network(b5e8390b-42ff-40d7-bb46-05b4a7f0a027),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6151fb1-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.403 2 DEBUG os_vif [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:4e:27,bridge_name='br-int',has_traffic_filtering=True,id=c6151fb1-ab92-4620-9c1b-de715e136554,network=Network(b5e8390b-42ff-40d7-bb46-05b4a7f0a027),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6151fb1-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.405 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6151fb1-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.411 2 INFO os_vif [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:4e:27,bridge_name='br-int',has_traffic_filtering=True,id=c6151fb1-ab92-4620-9c1b-de715e136554,network=Network(b5e8390b-42ff-40d7-bb46-05b4a7f0a027),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6151fb1-ab')
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.412 2 INFO nova.virt.libvirt.driver [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Deleting instance files /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3_del
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.413 2 INFO nova.virt.libvirt.driver [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Deletion of /var/lib/nova/instances/6e909358-f157-4878-9c45-00e8c263d2d3_del complete
Sep 30 21:58:44 compute-1 systemd[1]: libpod-conmon-3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade.scope: Deactivated successfully.
Sep 30 21:58:44 compute-1 podman[254605]: 2025-09-30 21:58:44.468633206 +0000 UTC m=+0.042500113 container remove 3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.477 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf5957a-002c-4d34-9e84-b7a2ad4fa470]: (4, ('Tue Sep 30 09:58:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027 (3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade)\n3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade\nTue Sep 30 09:58:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027 (3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade)\n3f06a8cf9f561a86e6fffd6c53af3237c328f7eec371dc8e4a903bc4a052eade\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.479 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[87f45e6b-eb66-42a7-b524-0a85cc5538c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.480 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5e8390b-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 kernel: tapb5e8390b-40: left promiscuous mode
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.494 2 INFO nova.compute.manager [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.496 2 DEBUG oslo.service.loopingcall [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.498 2 DEBUG nova.compute.manager [-] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.498 2 DEBUG nova.network.neutron [-] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.498 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[1411b640-6519-4ae3-b4c9-8c3d0bdbaf01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.526 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[947b0bfc-c9e8-416d-93bc-591e3b5ae4c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.528 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[9f13042e-02c2-4969-95dc-1e33e973b075]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.562 220603 DEBUG oslo.privsep.daemon [-] privsep: reply[d47cdd1d-5947-4012-87cb-d63d37b6ab0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613652, 'reachable_time': 35039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254620, 'error': None, 'target': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.566 103975 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:58:44 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:58:44.567 103975 DEBUG oslo.privsep.daemon [-] privsep: reply[2996c8b5-c432-42d3-a1c6-c77abeb68a4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:58:44 compute-1 systemd[1]: run-netns-ovnmeta\x2db5e8390b\x2d42ff\x2d40d7\x2dbb46\x2d05b4a7f0a027.mount: Deactivated successfully.
Sep 30 21:58:44 compute-1 nova_compute[192795]: 2025-09-30 21:58:44.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.009 2 DEBUG nova.compute.manager [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-vif-unplugged-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.009 2 DEBUG oslo_concurrency.lockutils [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.010 2 DEBUG oslo_concurrency.lockutils [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.010 2 DEBUG oslo_concurrency.lockutils [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.011 2 DEBUG nova.compute.manager [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] No waiting events found dispatching network-vif-unplugged-c6151fb1-ab92-4620-9c1b-de715e136554 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.011 2 DEBUG nova.compute.manager [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-vif-unplugged-c6151fb1-ab92-4620-9c1b-de715e136554 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.012 2 DEBUG nova.compute.manager [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.012 2 DEBUG oslo_concurrency.lockutils [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.012 2 DEBUG oslo_concurrency.lockutils [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.013 2 DEBUG oslo_concurrency.lockutils [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.013 2 DEBUG nova.compute.manager [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] No waiting events found dispatching network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.014 2 WARNING nova.compute.manager [req-f07563d4-a5e1-4dbc-a6a3-0d93d4c7221f req-468e3c02-59f5-4908-89a6-d2a857e3df29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received unexpected event network-vif-plugged-c6151fb1-ab92-4620-9c1b-de715e136554 for instance with vm_state active and task_state deleting.
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.390 2 DEBUG nova.network.neutron [-] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.417 2 INFO nova.compute.manager [-] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Took 1.92 seconds to deallocate network for instance.
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.618 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.619 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.719 2 DEBUG nova.compute.provider_tree [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.737 2 DEBUG nova.scheduler.client.report [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.760 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.790 2 INFO nova.scheduler.client.report [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 6e909358-f157-4878-9c45-00e8c263d2d3
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.872 2 DEBUG oslo_concurrency.lockutils [None req-ce9d1f02-4393-4cb9-a122-5529181519d5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "6e909358-f157-4878-9c45-00e8c263d2d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.986 2 DEBUG nova.network.neutron [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updated VIF entry in instance network info cache for port c6151fb1-ab92-4620-9c1b-de715e136554. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:58:46 compute-1 nova_compute[192795]: 2025-09-30 21:58:46.986 2 DEBUG nova.network.neutron [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Updating instance_info_cache with network_info: [{"id": "c6151fb1-ab92-4620-9c1b-de715e136554", "address": "fa:16:3e:94:4e:27", "network": {"id": "b5e8390b-42ff-40d7-bb46-05b4a7f0a027", "bridge": "br-int", "label": "tempest-network-smoke--1736259770", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe94:4e27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6151fb1-ab", "ovs_interfaceid": "c6151fb1-ab92-4620-9c1b-de715e136554", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:58:47 compute-1 nova_compute[192795]: 2025-09-30 21:58:47.014 2 DEBUG oslo_concurrency.lockutils [req-ad7abc6e-e054-4895-9a4b-0526f6ed612b req-97316e0e-2df1-4f8f-a66f-468b414d7b9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-6e909358-f157-4878-9c45-00e8c263d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:58:48 compute-1 nova_compute[192795]: 2025-09-30 21:58:48.163 2 DEBUG nova.compute.manager [req-0d9a09ab-590b-4dc8-a68d-96a4af9d2679 req-cf8f28e9-8a53-4199-9a58-a150201831d4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Received event network-vif-deleted-c6151fb1-ab92-4620-9c1b-de715e136554 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:58:49 compute-1 nova_compute[192795]: 2025-09-30 21:58:49.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:49 compute-1 nova_compute[192795]: 2025-09-30 21:58:49.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:50 compute-1 podman[254621]: 2025-09-30 21:58:50.266502366 +0000 UTC m=+0.096927850 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 21:58:53 compute-1 nova_compute[192795]: 2025-09-30 21:58:53.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:53 compute-1 nova_compute[192795]: 2025-09-30 21:58:53.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:54 compute-1 nova_compute[192795]: 2025-09-30 21:58:54.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:54 compute-1 nova_compute[192795]: 2025-09-30 21:58:54.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:56 compute-1 podman[254643]: 2025-09-30 21:58:56.22851945 +0000 UTC m=+0.065753157 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:58:56 compute-1 podman[254645]: 2025-09-30 21:58:56.246166547 +0000 UTC m=+0.070036401 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:58:56 compute-1 podman[254644]: 2025-09-30 21:58:56.285344161 +0000 UTC m=+0.110468688 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:58:59 compute-1 nova_compute[192795]: 2025-09-30 21:58:59.372 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269524.3708458, 6e909358-f157-4878-9c45-00e8c263d2d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:58:59 compute-1 nova_compute[192795]: 2025-09-30 21:58:59.372 2 INFO nova.compute.manager [-] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] VM Stopped (Lifecycle Event)
Sep 30 21:58:59 compute-1 nova_compute[192795]: 2025-09-30 21:58:59.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:59 compute-1 nova_compute[192795]: 2025-09-30 21:58:59.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:01 compute-1 nova_compute[192795]: 2025-09-30 21:59:01.146 2 DEBUG nova.compute.manager [None req-ae5a4db1-7f70-4b11-94ae-886e50fc048f - - - - - -] [instance: 6e909358-f157-4878-9c45-00e8c263d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:59:04 compute-1 nova_compute[192795]: 2025-09-30 21:59:04.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:04 compute-1 nova_compute[192795]: 2025-09-30 21:59:04.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:04 compute-1 nova_compute[192795]: 2025-09-30 21:59:04.713 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:06 compute-1 podman[254714]: 2025-09-30 21:59:06.209238219 +0000 UTC m=+0.057168231 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:59:07 compute-1 nova_compute[192795]: 2025-09-30 21:59:07.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:07 compute-1 nova_compute[192795]: 2025-09-30 21:59:07.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:59:09 compute-1 nova_compute[192795]: 2025-09-30 21:59:09.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:09 compute-1 nova_compute[192795]: 2025-09-30 21:59:09.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:09 compute-1 nova_compute[192795]: 2025-09-30 21:59:09.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:11 compute-1 podman[254734]: 2025-09-30 21:59:11.212701761 +0000 UTC m=+0.057478238 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:59:11 compute-1 podman[254736]: 2025-09-30 21:59:11.238402321 +0000 UTC m=+0.076636495 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:59:11 compute-1 podman[254735]: 2025-09-30 21:59:11.239059638 +0000 UTC m=+0.081506173 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.734 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.735 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.735 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.735 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.886 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.887 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5697MB free_disk=73.29683303833008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.887 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.888 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.958 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:59:12 compute-1 nova_compute[192795]: 2025-09-30 21:59:12.958 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:59:13 compute-1 nova_compute[192795]: 2025-09-30 21:59:13.006 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:59:13 compute-1 nova_compute[192795]: 2025-09-30 21:59:13.022 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:59:13 compute-1 nova_compute[192795]: 2025-09-30 21:59:13.057 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:59:13 compute-1 nova_compute[192795]: 2025-09-30 21:59:13.057 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:59:14 compute-1 nova_compute[192795]: 2025-09-30 21:59:14.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:14 compute-1 nova_compute[192795]: 2025-09-30 21:59:14.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:16 compute-1 nova_compute[192795]: 2025-09-30 21:59:16.058 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:17 compute-1 nova_compute[192795]: 2025-09-30 21:59:17.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:19 compute-1 nova_compute[192795]: 2025-09-30 21:59:19.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:19 compute-1 nova_compute[192795]: 2025-09-30 21:59:19.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:19 compute-1 nova_compute[192795]: 2025-09-30 21:59:19.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:21 compute-1 podman[254794]: 2025-09-30 21:59:21.232369786 +0000 UTC m=+0.067680227 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:59:24 compute-1 nova_compute[192795]: 2025-09-30 21:59:24.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:24 compute-1 nova_compute[192795]: 2025-09-30 21:59:24.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:24 compute-1 ovn_controller[94902]: 2025-09-30T21:59:24Z|00795|memory_trim|INFO|Detected inactivity (last active 30023 ms ago): trimming memory
Sep 30 21:59:25 compute-1 nova_compute[192795]: 2025-09-30 21:59:25.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:27 compute-1 podman[254815]: 2025-09-30 21:59:27.230255788 +0000 UTC m=+0.066600149 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:59:27 compute-1 podman[254822]: 2025-09-30 21:59:27.253607835 +0000 UTC m=+0.066644571 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:59:27 compute-1 podman[254816]: 2025-09-30 21:59:27.273474799 +0000 UTC m=+0.101191382 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:59:28 compute-1 nova_compute[192795]: 2025-09-30 21:59:28.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:28 compute-1 nova_compute[192795]: 2025-09-30 21:59:28.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:59:28 compute-1 nova_compute[192795]: 2025-09-30 21:59:28.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:59:28 compute-1 nova_compute[192795]: 2025-09-30 21:59:28.713 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:59:29 compute-1 nova_compute[192795]: 2025-09-30 21:59:29.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:29 compute-1 nova_compute[192795]: 2025-09-30 21:59:29.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:34 compute-1 nova_compute[192795]: 2025-09-30 21:59:34.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:34 compute-1 nova_compute[192795]: 2025-09-30 21:59:34.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:37 compute-1 podman[254884]: 2025-09-30 21:59:37.23734961 +0000 UTC m=+0.084467291 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:59:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:59:38.723 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:59:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:59:38.723 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:59:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 21:59:38.724 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:59:39 compute-1 nova_compute[192795]: 2025-09-30 21:59:39.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:39 compute-1 nova_compute[192795]: 2025-09-30 21:59:39.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:42 compute-1 podman[254907]: 2025-09-30 21:59:42.211230575 +0000 UTC m=+0.050069633 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:59:42 compute-1 podman[254905]: 2025-09-30 21:59:42.211232515 +0000 UTC m=+0.058221568 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9)
Sep 30 21:59:42 compute-1 podman[254906]: 2025-09-30 21:59:42.211696067 +0000 UTC m=+0.054971022 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:59:44 compute-1 nova_compute[192795]: 2025-09-30 21:59:44.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:44 compute-1 nova_compute[192795]: 2025-09-30 21:59:44.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:49 compute-1 nova_compute[192795]: 2025-09-30 21:59:49.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:49 compute-1 nova_compute[192795]: 2025-09-30 21:59:49.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:52 compute-1 podman[254967]: 2025-09-30 21:59:52.208255363 +0000 UTC m=+0.052392965 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=iscsid, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:59:54 compute-1 nova_compute[192795]: 2025-09-30 21:59:54.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:54 compute-1 nova_compute[192795]: 2025-09-30 21:59:54.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:55 compute-1 sshd-session[254988]: Accepted publickey for zuul from 192.168.122.10 port 46154 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:59:55 compute-1 systemd-logind[793]: New session 68 of user zuul.
Sep 30 21:59:55 compute-1 systemd[1]: Started Session 68 of User zuul.
Sep 30 21:59:55 compute-1 sshd-session[254988]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:59:55 compute-1 sudo[254992]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 21:59:55 compute-1 sudo[254992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:59:58 compute-1 podman[255132]: 2025-09-30 21:59:58.22225048 +0000 UTC m=+0.054864419 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:59:58 compute-1 podman[255130]: 2025-09-30 21:59:58.229124782 +0000 UTC m=+0.067092533 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:59:58 compute-1 podman[255131]: 2025-09-30 21:59:58.28133005 +0000 UTC m=+0.118327655 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:59:59 compute-1 nova_compute[192795]: 2025-09-30 21:59:59.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:59 compute-1 nova_compute[192795]: 2025-09-30 21:59:59.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:00 compute-1 ovs-vsctl[255230]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 22:00:01 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 255016 (sos)
Sep 30 22:00:01 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Sep 30 22:00:01 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Sep 30 22:00:01 compute-1 virtqemud[192217]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 22:00:01 compute-1 virtqemud[192217]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 22:00:01 compute-1 virtqemud[192217]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 22:00:02 compute-1 kernel: block vda: the capability attribute has been deprecated.
Sep 30 22:00:03 compute-1 crontab[255649]: (root) LIST (root)
Sep 30 22:00:04 compute-1 nova_compute[192795]: 2025-09-30 22:00:04.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:04 compute-1 nova_compute[192795]: 2025-09-30 22:00:04.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:05 compute-1 systemd[1]: Starting Hostname Service...
Sep 30 22:00:05 compute-1 systemd[1]: Started Hostname Service.
Sep 30 22:00:06 compute-1 nova_compute[192795]: 2025-09-30 22:00:06.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:06 compute-1 nova_compute[192795]: 2025-09-30 22:00:06.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:06 compute-1 nova_compute[192795]: 2025-09-30 22:00:06.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 22:00:08 compute-1 podman[255900]: 2025-09-30 22:00:08.12619348 +0000 UTC m=+0.087187164 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Sep 30 22:00:09 compute-1 nova_compute[192795]: 2025-09-30 22:00:09.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:09 compute-1 nova_compute[192795]: 2025-09-30 22:00:09.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:09 compute-1 nova_compute[192795]: 2025-09-30 22:00:09.719 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:09 compute-1 nova_compute[192795]: 2025-09-30 22:00:09.719 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:09 compute-1 nova_compute[192795]: 2025-09-30 22:00:09.719 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:00:13 compute-1 ovs-appctl[256730]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:00:13 compute-1 ovs-appctl[256736]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:00:13 compute-1 ovs-appctl[256740]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:00:13 compute-1 podman[256751]: 2025-09-30 22:00:13.249668183 +0000 UTC m=+0.077713794 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:00:13 compute-1 podman[256746]: 2025-09-30 22:00:13.257193061 +0000 UTC m=+0.080374603 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter)
Sep 30 22:00:13 compute-1 podman[256753]: 2025-09-30 22:00:13.277181859 +0000 UTC m=+0.104364906 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.733 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.733 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.733 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.734 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.915 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.916 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5392MB free_disk=72.90235137939453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.916 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:00:13 compute-1 nova_compute[192795]: 2025-09-30 22:00:13.916 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.006 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.006 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.029 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.044 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.062 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.062 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:14 compute-1 nova_compute[192795]: 2025-09-30 22:00:14.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:16 compute-1 nova_compute[192795]: 2025-09-30 22:00:16.063 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:17 compute-1 nova_compute[192795]: 2025-09-30 22:00:17.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:19 compute-1 nova_compute[192795]: 2025-09-30 22:00:19.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:19 compute-1 nova_compute[192795]: 2025-09-30 22:00:19.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:19 compute-1 nova_compute[192795]: 2025-09-30 22:00:19.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:22 compute-1 virtqemud[192217]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 22:00:22 compute-1 podman[258295]: 2025-09-30 22:00:22.331088193 +0000 UTC m=+0.079690295 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 22:00:24 compute-1 nova_compute[192795]: 2025-09-30 22:00:24.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:24 compute-1 systemd[1]: Starting Time & Date Service...
Sep 30 22:00:24 compute-1 nova_compute[192795]: 2025-09-30 22:00:24.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:24 compute-1 nova_compute[192795]: 2025-09-30 22:00:24.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 22:00:24 compute-1 nova_compute[192795]: 2025-09-30 22:00:24.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:24 compute-1 nova_compute[192795]: 2025-09-30 22:00:24.731 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 22:00:24 compute-1 systemd[1]: Started Time & Date Service.
Sep 30 22:00:25 compute-1 nova_compute[192795]: 2025-09-30 22:00:25.732 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:28 compute-1 podman[258488]: 2025-09-30 22:00:28.483188367 +0000 UTC m=+0.083557568 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:00:28 compute-1 podman[258486]: 2025-09-30 22:00:28.488051334 +0000 UTC m=+0.103310009 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 22:00:28 compute-1 podman[258487]: 2025-09-30 22:00:28.542713248 +0000 UTC m=+0.151923192 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250923)
Sep 30 22:00:29 compute-1 nova_compute[192795]: 2025-09-30 22:00:29.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:29 compute-1 nova_compute[192795]: 2025-09-30 22:00:29.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:29 compute-1 nova_compute[192795]: 2025-09-30 22:00:29.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:00:29 compute-1 nova_compute[192795]: 2025-09-30 22:00:29.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:00:29 compute-1 nova_compute[192795]: 2025-09-30 22:00:29.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:29 compute-1 nova_compute[192795]: 2025-09-30 22:00:29.725 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:00:33 compute-1 nova_compute[192795]: 2025-09-30 22:00:33.726 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:34 compute-1 nova_compute[192795]: 2025-09-30 22:00:34.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:34 compute-1 nova_compute[192795]: 2025-09-30 22:00:34.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:38 compute-1 podman[258553]: 2025-09-30 22:00:38.280586382 +0000 UTC m=+0.087008408 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, 
tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 22:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:00:38.724 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:00:38.726 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:00:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:00:38.726 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:00:39 compute-1 nova_compute[192795]: 2025-09-30 22:00:39.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:39 compute-1 nova_compute[192795]: 2025-09-30 22:00:39.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:43 compute-1 podman[258574]: 2025-09-30 22:00:43.883660289 +0000 UTC m=+0.074223291 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:00:43 compute-1 podman[258573]: 2025-09-30 22:00:43.892094212 +0000 UTC m=+0.082885239 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Sep 30 22:00:43 compute-1 podman[258575]: 2025-09-30 22:00:43.90185311 +0000 UTC m=+0.072037604 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:00:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:44 compute-1 nova_compute[192795]: 2025-09-30 22:00:44.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:44 compute-1 nova_compute[192795]: 2025-09-30 22:00:44.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:45 compute-1 nova_compute[192795]: 2025-09-30 22:00:45.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:49 compute-1 nova_compute[192795]: 2025-09-30 22:00:49.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:49 compute-1 nova_compute[192795]: 2025-09-30 22:00:49.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:50 compute-1 sudo[254992]: pam_unix(sudo:session): session closed for user root
Sep 30 22:00:50 compute-1 sshd-session[254991]: Received disconnect from 192.168.122.10 port 46154:11: disconnected by user
Sep 30 22:00:50 compute-1 sshd-session[254991]: Disconnected from user zuul 192.168.122.10 port 46154
Sep 30 22:00:50 compute-1 sshd-session[254988]: pam_unix(sshd:session): session closed for user zuul
Sep 30 22:00:50 compute-1 systemd[1]: session-68.scope: Deactivated successfully.
Sep 30 22:00:50 compute-1 systemd[1]: session-68.scope: Consumed 1min 31.733s CPU time, 704.7M memory peak, read 222.3M from disk, written 22.4M to disk.
Sep 30 22:00:50 compute-1 systemd-logind[793]: Session 68 logged out. Waiting for processes to exit.
Sep 30 22:00:50 compute-1 systemd-logind[793]: Removed session 68.
Sep 30 22:00:50 compute-1 sshd-session[258633]: Accepted publickey for zuul from 192.168.122.10 port 60642 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 22:00:50 compute-1 systemd-logind[793]: New session 69 of user zuul.
Sep 30 22:00:50 compute-1 systemd[1]: Started Session 69 of User zuul.
Sep 30 22:00:50 compute-1 sshd-session[258633]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 22:00:50 compute-1 sudo[258637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2025-09-30-ydelksh.tar.xz
Sep 30 22:00:50 compute-1 sudo[258637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 22:00:50 compute-1 sudo[258637]: pam_unix(sudo:session): session closed for user root
Sep 30 22:00:50 compute-1 sshd-session[258636]: Received disconnect from 192.168.122.10 port 60642:11: disconnected by user
Sep 30 22:00:50 compute-1 sshd-session[258636]: Disconnected from user zuul 192.168.122.10 port 60642
Sep 30 22:00:50 compute-1 sshd-session[258633]: pam_unix(sshd:session): session closed for user zuul
Sep 30 22:00:50 compute-1 systemd[1]: session-69.scope: Deactivated successfully.
Sep 30 22:00:50 compute-1 systemd-logind[793]: Session 69 logged out. Waiting for processes to exit.
Sep 30 22:00:50 compute-1 systemd-logind[793]: Removed session 69.
Sep 30 22:00:51 compute-1 sshd-session[258662]: Accepted publickey for zuul from 192.168.122.10 port 60648 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 22:00:51 compute-1 systemd-logind[793]: New session 70 of user zuul.
Sep 30 22:00:51 compute-1 systemd[1]: Started Session 70 of User zuul.
Sep 30 22:00:51 compute-1 sshd-session[258662]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 22:00:51 compute-1 sudo[258666]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Sep 30 22:00:51 compute-1 sudo[258666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 22:00:51 compute-1 sudo[258666]: pam_unix(sudo:session): session closed for user root
Sep 30 22:00:51 compute-1 sshd-session[258665]: Received disconnect from 192.168.122.10 port 60648:11: disconnected by user
Sep 30 22:00:51 compute-1 sshd-session[258665]: Disconnected from user zuul 192.168.122.10 port 60648
Sep 30 22:00:51 compute-1 sshd-session[258662]: pam_unix(sshd:session): session closed for user zuul
Sep 30 22:00:51 compute-1 systemd[1]: session-70.scope: Deactivated successfully.
Sep 30 22:00:51 compute-1 systemd-logind[793]: Session 70 logged out. Waiting for processes to exit.
Sep 30 22:00:51 compute-1 systemd-logind[793]: Removed session 70.
Sep 30 22:00:53 compute-1 podman[258691]: 2025-09-30 22:00:53.264519146 +0000 UTC m=+0.084660707 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:00:54 compute-1 nova_compute[192795]: 2025-09-30 22:00:54.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:54 compute-1 nova_compute[192795]: 2025-09-30 22:00:54.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:54 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 22:00:54 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 22:00:59 compute-1 podman[258715]: 2025-09-30 22:00:59.218798685 +0000 UTC m=+0.063776094 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 22:00:59 compute-1 podman[258717]: 2025-09-30 22:00:59.234418818 +0000 UTC m=+0.075505914 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:00:59 compute-1 podman[258716]: 2025-09-30 22:00:59.266796353 +0000 UTC m=+0.110752435 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 22:00:59 compute-1 nova_compute[192795]: 2025-09-30 22:00:59.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:59 compute-1 nova_compute[192795]: 2025-09-30 22:00:59.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:01 compute-1 CROND[258786]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 22:01:01 compute-1 run-parts[258789]: (/etc/cron.hourly) starting 0anacron
Sep 30 22:01:01 compute-1 run-parts[258795]: (/etc/cron.hourly) finished 0anacron
Sep 30 22:01:01 compute-1 CROND[258785]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 22:01:04 compute-1 nova_compute[192795]: 2025-09-30 22:01:04.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:04 compute-1 nova_compute[192795]: 2025-09-30 22:01:04.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:06 compute-1 nova_compute[192795]: 2025-09-30 22:01:06.735 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:09 compute-1 podman[258796]: 2025-09-30 22:01:09.237737471 +0000 UTC m=+0.076146011 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:01:09 compute-1 nova_compute[192795]: 2025-09-30 22:01:09.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:09 compute-1 nova_compute[192795]: 2025-09-30 22:01:09.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:10 compute-1 nova_compute[192795]: 2025-09-30 22:01:10.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:10 compute-1 nova_compute[192795]: 2025-09-30 22:01:10.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:01:11 compute-1 nova_compute[192795]: 2025-09-30 22:01:11.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.605 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.724 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.725 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.887 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.888 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5675MB free_disk=73.29690170288086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.888 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:01:13 compute-1 nova_compute[192795]: 2025-09-30 22:01:13.888 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.131 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.132 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:01:14 compute-1 podman[258818]: 2025-09-30 22:01:14.222116373 +0000 UTC m=+0.061631529 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:01:14 compute-1 podman[258817]: 2025-09-30 22:01:14.222286137 +0000 UTC m=+0.065691106 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Sep 30 22:01:14 compute-1 podman[258819]: 2025-09-30 22:01:14.247173844 +0000 UTC m=+0.085346204 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.595 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.616 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.649 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.649 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:01:14 compute-1 nova_compute[192795]: 2025-09-30 22:01:14.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:16 compute-1 nova_compute[192795]: 2025-09-30 22:01:16.649 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:17 compute-1 nova_compute[192795]: 2025-09-30 22:01:17.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:19 compute-1 nova_compute[192795]: 2025-09-30 22:01:19.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:19 compute-1 nova_compute[192795]: 2025-09-30 22:01:19.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:21 compute-1 nova_compute[192795]: 2025-09-30 22:01:21.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:24 compute-1 podman[258877]: 2025-09-30 22:01:24.230103501 +0000 UTC m=+0.065314086 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 22:01:24 compute-1 nova_compute[192795]: 2025-09-30 22:01:24.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:24 compute-1 nova_compute[192795]: 2025-09-30 22:01:24.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:25 compute-1 nova_compute[192795]: 2025-09-30 22:01:25.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:29 compute-1 nova_compute[192795]: 2025-09-30 22:01:29.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:29 compute-1 nova_compute[192795]: 2025-09-30 22:01:29.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:30 compute-1 podman[258897]: 2025-09-30 22:01:30.256443795 +0000 UTC m=+0.095673447 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250923, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 22:01:30 compute-1 podman[258902]: 2025-09-30 22:01:30.262229227 +0000 UTC m=+0.080288750 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 22:01:30 compute-1 podman[258898]: 2025-09-30 22:01:30.286460397 +0000 UTC m=+0.114235837 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:01:30 compute-1 nova_compute[192795]: 2025-09-30 22:01:30.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:30 compute-1 nova_compute[192795]: 2025-09-30 22:01:30.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:01:30 compute-1 nova_compute[192795]: 2025-09-30 22:01:30.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:01:30 compute-1 nova_compute[192795]: 2025-09-30 22:01:30.757 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:01:34 compute-1 nova_compute[192795]: 2025-09-30 22:01:34.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:34 compute-1 nova_compute[192795]: 2025-09-30 22:01:34.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:01:38.725 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:01:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:01:38.726 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:01:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:01:38.726 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:01:39 compute-1 nova_compute[192795]: 2025-09-30 22:01:39.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:39 compute-1 nova_compute[192795]: 2025-09-30 22:01:39.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:40 compute-1 podman[258964]: 2025-09-30 22:01:40.000206534 +0000 UTC m=+0.082419628 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:01:44 compute-1 nova_compute[192795]: 2025-09-30 22:01:44.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:44 compute-1 nova_compute[192795]: 2025-09-30 22:01:44.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:45 compute-1 podman[258987]: 2025-09-30 22:01:45.245504474 +0000 UTC m=+0.068534121 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Sep 30 22:01:45 compute-1 podman[258986]: 2025-09-30 22:01:45.247587759 +0000 UTC m=+0.069581638 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:01:45 compute-1 podman[258985]: 2025-09-30 22:01:45.264791723 +0000 UTC m=+0.091827595 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 22:01:49 compute-1 nova_compute[192795]: 2025-09-30 22:01:49.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:49 compute-1 nova_compute[192795]: 2025-09-30 22:01:49.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:54 compute-1 nova_compute[192795]: 2025-09-30 22:01:54.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:54 compute-1 nova_compute[192795]: 2025-09-30 22:01:54.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:55 compute-1 podman[259046]: 2025-09-30 22:01:55.2277075 +0000 UTC m=+0.071063407 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid)
Sep 30 22:01:59 compute-1 nova_compute[192795]: 2025-09-30 22:01:59.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:59 compute-1 nova_compute[192795]: 2025-09-30 22:01:59.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:01 compute-1 podman[259066]: 2025-09-30 22:02:01.213673627 +0000 UTC m=+0.054270114 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Sep 30 22:02:01 compute-1 podman[259067]: 2025-09-30 22:02:01.245922659 +0000 UTC m=+0.082935871 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 22:02:01 compute-1 podman[259068]: 2025-09-30 22:02:01.247181972 +0000 UTC m=+0.078811262 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 22:02:04 compute-1 nova_compute[192795]: 2025-09-30 22:02:04.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:04 compute-1 nova_compute[192795]: 2025-09-30 22:02:04.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:05 compute-1 sshd-session[259135]: Invalid user admin from 78.128.112.74 port 49150
Sep 30 22:02:05 compute-1 sshd-session[259135]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 22:02:05 compute-1 sshd-session[259135]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=78.128.112.74
Sep 30 22:02:07 compute-1 sshd-session[259135]: Failed password for invalid user admin from 78.128.112.74 port 49150 ssh2
Sep 30 22:02:07 compute-1 sshd-session[259135]: Connection closed by invalid user admin 78.128.112.74 port 49150 [preauth]
Sep 30 22:02:08 compute-1 nova_compute[192795]: 2025-09-30 22:02:08.753 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:09 compute-1 nova_compute[192795]: 2025-09-30 22:02:09.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:09 compute-1 nova_compute[192795]: 2025-09-30 22:02:09.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:10 compute-1 podman[259137]: 2025-09-30 22:02:10.242963762 +0000 UTC m=+0.082241482 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 22:02:12 compute-1 nova_compute[192795]: 2025-09-30 22:02:12.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:12 compute-1 nova_compute[192795]: 2025-09-30 22:02:12.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:12 compute-1 nova_compute[192795]: 2025-09-30 22:02:12.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:02:14 compute-1 nova_compute[192795]: 2025-09-30 22:02:14.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:14 compute-1 nova_compute[192795]: 2025-09-30 22:02:14.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.759 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.759 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.760 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.760 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.918 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.919 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5665MB free_disk=73.29687118530273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.919 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:02:15 compute-1 nova_compute[192795]: 2025-09-30 22:02:15.919 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.132 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.133 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.157 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.224 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.225 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 22:02:16 compute-1 podman[259158]: 2025-09-30 22:02:16.232487803 +0000 UTC m=+0.056557615 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:02:16 compute-1 podman[259157]: 2025-09-30 22:02:16.243648858 +0000 UTC m=+0.070851112 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.243 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 22:02:16 compute-1 podman[259159]: 2025-09-30 22:02:16.251381612 +0000 UTC m=+0.064520185 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.273 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.308 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.355 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.357 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:02:16 compute-1 nova_compute[192795]: 2025-09-30 22:02:16.357 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:02:17 compute-1 nova_compute[192795]: 2025-09-30 22:02:17.358 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:19 compute-1 nova_compute[192795]: 2025-09-30 22:02:19.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:19 compute-1 nova_compute[192795]: 2025-09-30 22:02:19.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:19 compute-1 nova_compute[192795]: 2025-09-30 22:02:19.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:21 compute-1 nova_compute[192795]: 2025-09-30 22:02:21.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:24 compute-1 nova_compute[192795]: 2025-09-30 22:02:24.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:24 compute-1 nova_compute[192795]: 2025-09-30 22:02:24.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:26 compute-1 podman[259216]: 2025-09-30 22:02:26.213404124 +0000 UTC m=+0.059915523 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 22:02:27 compute-1 nova_compute[192795]: 2025-09-30 22:02:27.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:29 compute-1 nova_compute[192795]: 2025-09-30 22:02:29.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:29 compute-1 nova_compute[192795]: 2025-09-30 22:02:29.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:30 compute-1 nova_compute[192795]: 2025-09-30 22:02:30.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:30 compute-1 nova_compute[192795]: 2025-09-30 22:02:30.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:02:30 compute-1 nova_compute[192795]: 2025-09-30 22:02:30.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:02:30 compute-1 nova_compute[192795]: 2025-09-30 22:02:30.863 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:02:32 compute-1 podman[259240]: 2025-09-30 22:02:32.207184637 +0000 UTC m=+0.046189330 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:02:32 compute-1 podman[259238]: 2025-09-30 22:02:32.20768502 +0000 UTC m=+0.053021391 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 22:02:32 compute-1 podman[259239]: 2025-09-30 22:02:32.236310677 +0000 UTC m=+0.078858244 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, org.label-schema.build-date=20250923)
Sep 30 22:02:33 compute-1 nova_compute[192795]: 2025-09-30 22:02:33.859 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:34 compute-1 nova_compute[192795]: 2025-09-30 22:02:34.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:34 compute-1 nova_compute[192795]: 2025-09-30 22:02:34.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:02:38.726 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:02:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:02:38.729 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:02:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:02:38.729 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:02:39 compute-1 nova_compute[192795]: 2025-09-30 22:02:39.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:39 compute-1 nova_compute[192795]: 2025-09-30 22:02:39.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:41 compute-1 podman[259306]: 2025-09-30 22:02:41.273428018 +0000 UTC m=+0.105443055 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:02:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:44 compute-1 nova_compute[192795]: 2025-09-30 22:02:44.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:44 compute-1 nova_compute[192795]: 2025-09-30 22:02:44.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:47 compute-1 podman[259326]: 2025-09-30 22:02:47.216023879 +0000 UTC m=+0.061481883 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Sep 30 22:02:47 compute-1 podman[259327]: 2025-09-30 22:02:47.221186236 +0000 UTC m=+0.055985620 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:02:47 compute-1 podman[259328]: 2025-09-30 22:02:47.221166795 +0000 UTC m=+0.055241539 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Sep 30 22:02:49 compute-1 nova_compute[192795]: 2025-09-30 22:02:49.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:49 compute-1 nova_compute[192795]: 2025-09-30 22:02:49.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:54 compute-1 nova_compute[192795]: 2025-09-30 22:02:54.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:54 compute-1 nova_compute[192795]: 2025-09-30 22:02:54.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:57 compute-1 podman[259390]: 2025-09-30 22:02:57.219319736 +0000 UTC m=+0.064923476 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid)
Sep 30 22:02:59 compute-1 nova_compute[192795]: 2025-09-30 22:02:59.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:59 compute-1 nova_compute[192795]: 2025-09-30 22:02:59.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:03 compute-1 podman[259412]: 2025-09-30 22:03:03.247560882 +0000 UTC m=+0.071934054 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 22:03:03 compute-1 podman[259410]: 2025-09-30 22:03:03.252844355 +0000 UTC m=+0.084323498 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 22:03:03 compute-1 podman[259411]: 2025-09-30 22:03:03.362368829 +0000 UTC m=+0.187000158 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:03:04 compute-1 nova_compute[192795]: 2025-09-30 22:03:04.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:04 compute-1 nova_compute[192795]: 2025-09-30 22:03:04.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:09 compute-1 nova_compute[192795]: 2025-09-30 22:03:09.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:09 compute-1 nova_compute[192795]: 2025-09-30 22:03:09.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:10 compute-1 nova_compute[192795]: 2025-09-30 22:03:10.723 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:12 compute-1 podman[259478]: 2025-09-30 22:03:12.226390627 +0000 UTC m=+0.070394884 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 22:03:14 compute-1 nova_compute[192795]: 2025-09-30 22:03:14.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:14 compute-1 nova_compute[192795]: 2025-09-30 22:03:14.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:14 compute-1 nova_compute[192795]: 2025-09-30 22:03:14.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:14 compute-1 nova_compute[192795]: 2025-09-30 22:03:14.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:03:14 compute-1 nova_compute[192795]: 2025-09-30 22:03:14.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.742 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.742 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.743 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.743 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.951 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.952 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5671MB free_disk=73.29689025878906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.953 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:03:15 compute-1 nova_compute[192795]: 2025-09-30 22:03:15.953 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:03:16 compute-1 nova_compute[192795]: 2025-09-30 22:03:16.090 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:03:16 compute-1 nova_compute[192795]: 2025-09-30 22:03:16.091 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:03:16 compute-1 nova_compute[192795]: 2025-09-30 22:03:16.119 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:03:16 compute-1 nova_compute[192795]: 2025-09-30 22:03:16.215 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:03:16 compute-1 nova_compute[192795]: 2025-09-30 22:03:16.217 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:03:16 compute-1 nova_compute[192795]: 2025-09-30 22:03:16.217 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:03:18 compute-1 nova_compute[192795]: 2025-09-30 22:03:18.218 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:18 compute-1 podman[259501]: 2025-09-30 22:03:18.23550132 +0000 UTC m=+0.061509315 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 22:03:18 compute-1 podman[259499]: 2025-09-30 22:03:18.244309187 +0000 UTC m=+0.068867913 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Sep 30 22:03:18 compute-1 podman[259500]: 2025-09-30 22:03:18.256847314 +0000 UTC m=+0.086130237 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 22:03:19 compute-1 nova_compute[192795]: 2025-09-30 22:03:19.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:19 compute-1 nova_compute[192795]: 2025-09-30 22:03:19.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:21 compute-1 nova_compute[192795]: 2025-09-30 22:03:21.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:21 compute-1 nova_compute[192795]: 2025-09-30 22:03:21.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:24 compute-1 nova_compute[192795]: 2025-09-30 22:03:24.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:24 compute-1 nova_compute[192795]: 2025-09-30 22:03:24.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:28 compute-1 podman[259563]: 2025-09-30 22:03:28.248392376 +0000 UTC m=+0.079503049 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 22:03:29 compute-1 nova_compute[192795]: 2025-09-30 22:03:29.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:29 compute-1 nova_compute[192795]: 2025-09-30 22:03:29.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:29 compute-1 nova_compute[192795]: 2025-09-30 22:03:29.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:30 compute-1 nova_compute[192795]: 2025-09-30 22:03:30.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:30 compute-1 nova_compute[192795]: 2025-09-30 22:03:30.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:03:30 compute-1 nova_compute[192795]: 2025-09-30 22:03:30.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:03:30 compute-1 nova_compute[192795]: 2025-09-30 22:03:30.720 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:03:34 compute-1 podman[259584]: 2025-09-30 22:03:34.232031406 +0000 UTC m=+0.073040149 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:03:34 compute-1 podman[259586]: 2025-09-30 22:03:34.23259095 +0000 UTC m=+0.062943438 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:03:34 compute-1 podman[259585]: 2025-09-30 22:03:34.273398904 +0000 UTC m=+0.108345160 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 22:03:34 compute-1 nova_compute[192795]: 2025-09-30 22:03:34.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:34 compute-1 nova_compute[192795]: 2025-09-30 22:03:34.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:03:38.727 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:03:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:03:38.728 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:03:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:03:38.728 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:03:39 compute-1 nova_compute[192795]: 2025-09-30 22:03:39.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:39 compute-1 nova_compute[192795]: 2025-09-30 22:03:39.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:43 compute-1 podman[259652]: 2025-09-30 22:03:43.248646116 +0000 UTC m=+0.080673427 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 22:03:44 compute-1 nova_compute[192795]: 2025-09-30 22:03:44.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:44 compute-1 nova_compute[192795]: 2025-09-30 22:03:44.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:49 compute-1 podman[259675]: 2025-09-30 22:03:49.282594283 +0000 UTC m=+0.102126840 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:03:49 compute-1 podman[259674]: 2025-09-30 22:03:49.283005934 +0000 UTC m=+0.106189265 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 22:03:49 compute-1 podman[259673]: 2025-09-30 22:03:49.291539605 +0000 UTC m=+0.121060610 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 22:03:49 compute-1 nova_compute[192795]: 2025-09-30 22:03:49.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:49 compute-1 nova_compute[192795]: 2025-09-30 22:03:49.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:54 compute-1 nova_compute[192795]: 2025-09-30 22:03:54.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:54 compute-1 nova_compute[192795]: 2025-09-30 22:03:54.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:59 compute-1 podman[259733]: 2025-09-30 22:03:59.21226611 +0000 UTC m=+0.054222444 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:03:59 compute-1 nova_compute[192795]: 2025-09-30 22:03:59.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:59 compute-1 nova_compute[192795]: 2025-09-30 22:03:59.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:04 compute-1 nova_compute[192795]: 2025-09-30 22:04:04.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:04 compute-1 nova_compute[192795]: 2025-09-30 22:04:04.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:05 compute-1 podman[259755]: 2025-09-30 22:04:05.248551218 +0000 UTC m=+0.067661779 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:04:05 compute-1 podman[259753]: 2025-09-30 22:04:05.270188108 +0000 UTC m=+0.095396057 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 22:04:05 compute-1 podman[259754]: 2025-09-30 22:04:05.298438707 +0000 UTC m=+0.120678379 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:04:06 compute-1 sshd-session[259818]: Invalid user admin from 167.71.248.239 port 55566
Sep 30 22:04:06 compute-1 sshd-session[259818]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 22:04:06 compute-1 sshd-session[259818]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=167.71.248.239
Sep 30 22:04:09 compute-1 sshd-session[259818]: Failed password for invalid user admin from 167.71.248.239 port 55566 ssh2
Sep 30 22:04:09 compute-1 nova_compute[192795]: 2025-09-30 22:04:09.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:09 compute-1 nova_compute[192795]: 2025-09-30 22:04:09.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:10 compute-1 nova_compute[192795]: 2025-09-30 22:04:10.715 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:11 compute-1 sshd-session[259818]: Connection closed by invalid user admin 167.71.248.239 port 55566 [preauth]
Sep 30 22:04:14 compute-1 podman[259820]: 2025-09-30 22:04:14.232357371 +0000 UTC m=+0.067339841 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923)
Sep 30 22:04:14 compute-1 nova_compute[192795]: 2025-09-30 22:04:14.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:14 compute-1 nova_compute[192795]: 2025-09-30 22:04:14.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:14 compute-1 nova_compute[192795]: 2025-09-30 22:04:14.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:14 compute-1 nova_compute[192795]: 2025-09-30 22:04:14.692 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:04:14 compute-1 nova_compute[192795]: 2025-09-30 22:04:14.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.725 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.726 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.726 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.726 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.898 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.899 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5666MB free_disk=73.29687118530273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.899 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:04:15 compute-1 nova_compute[192795]: 2025-09-30 22:04:15.899 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:04:16 compute-1 nova_compute[192795]: 2025-09-30 22:04:16.207 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:04:16 compute-1 nova_compute[192795]: 2025-09-30 22:04:16.207 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:04:16 compute-1 nova_compute[192795]: 2025-09-30 22:04:16.276 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:04:16 compute-1 nova_compute[192795]: 2025-09-30 22:04:16.307 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:04:16 compute-1 nova_compute[192795]: 2025-09-30 22:04:16.308 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:04:16 compute-1 nova_compute[192795]: 2025-09-30 22:04:16.308 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:04:19 compute-1 nova_compute[192795]: 2025-09-30 22:04:19.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:19 compute-1 nova_compute[192795]: 2025-09-30 22:04:19.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:20 compute-1 podman[259841]: 2025-09-30 22:04:20.229795215 +0000 UTC m=+0.071557521 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm)
Sep 30 22:04:20 compute-1 podman[259842]: 2025-09-30 22:04:20.248662753 +0000 UTC m=+0.073168593 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:04:20 compute-1 podman[259843]: 2025-09-30 22:04:20.262047818 +0000 UTC m=+0.091200598 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Sep 30 22:04:20 compute-1 nova_compute[192795]: 2025-09-30 22:04:20.308 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:21 compute-1 nova_compute[192795]: 2025-09-30 22:04:21.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:23 compute-1 nova_compute[192795]: 2025-09-30 22:04:23.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:24 compute-1 nova_compute[192795]: 2025-09-30 22:04:24.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:24 compute-1 nova_compute[192795]: 2025-09-30 22:04:24.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:29 compute-1 nova_compute[192795]: 2025-09-30 22:04:29.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:29 compute-1 nova_compute[192795]: 2025-09-30 22:04:29.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:29 compute-1 nova_compute[192795]: 2025-09-30 22:04:29.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:30 compute-1 podman[259908]: 2025-09-30 22:04:30.255914625 +0000 UTC m=+0.089643067 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:04:31 compute-1 nova_compute[192795]: 2025-09-30 22:04:31.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:31 compute-1 nova_compute[192795]: 2025-09-30 22:04:31.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:04:31 compute-1 nova_compute[192795]: 2025-09-30 22:04:31.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:04:31 compute-1 nova_compute[192795]: 2025-09-30 22:04:31.767 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:04:32 compute-1 nova_compute[192795]: 2025-09-30 22:04:32.276 2 DEBUG oslo_concurrency.processutils [None req-6b86cd02-5ccd-4f89-9da4-d90bf5d34515 9765353c43d34d7a870f0c35895bd320 5fb5d4b07ed54e6cb716f880185e34d5 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 22:04:32 compute-1 nova_compute[192795]: 2025-09-30 22:04:32.303 2 DEBUG oslo_concurrency.processutils [None req-6b86cd02-5ccd-4f89-9da4-d90bf5d34515 9765353c43d34d7a870f0c35895bd320 5fb5d4b07ed54e6cb716f880185e34d5 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 22:04:34 compute-1 nova_compute[192795]: 2025-09-30 22:04:34.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:34 compute-1 nova_compute[192795]: 2025-09-30 22:04:34.762 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:34 compute-1 nova_compute[192795]: 2025-09-30 22:04:34.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:36 compute-1 podman[259931]: 2025-09-30 22:04:36.22977883 +0000 UTC m=+0.058262516 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 22:04:36 compute-1 podman[259929]: 2025-09-30 22:04:36.240552258 +0000 UTC m=+0.084883644 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 22:04:36 compute-1 podman[259930]: 2025-09-30 22:04:36.289658677 +0000 UTC m=+0.119649652 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 22:04:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:04:37.890 103861 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 22:04:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:04:37.892 103861 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 22:04:37 compute-1 nova_compute[192795]: 2025-09-30 22:04:37.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:37 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:04:37.893 103861 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=78438f8f-1ac2-4393-90b7-0b62e0665947, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 22:04:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:04:38.729 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:04:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:04:38.729 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:04:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:04:38.729 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:04:39 compute-1 nova_compute[192795]: 2025-09-30 22:04:39.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:39 compute-1 nova_compute[192795]: 2025-09-30 22:04:39.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.041 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.041 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.041 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.041 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.041 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.042 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:04:44.042 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-1 nova_compute[192795]: 2025-09-30 22:04:44.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:44 compute-1 nova_compute[192795]: 2025-09-30 22:04:44.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:45 compute-1 podman[259997]: 2025-09-30 22:04:45.268249265 +0000 UTC m=+0.094269327 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Sep 30 22:04:49 compute-1 nova_compute[192795]: 2025-09-30 22:04:49.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:49 compute-1 nova_compute[192795]: 2025-09-30 22:04:49.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:51 compute-1 podman[260018]: 2025-09-30 22:04:51.21970403 +0000 UTC m=+0.059621541 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 22:04:51 compute-1 podman[260019]: 2025-09-30 22:04:51.228628672 +0000 UTC m=+0.067012053 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 22:04:51 compute-1 podman[260017]: 2025-09-30 22:04:51.234502983 +0000 UTC m=+0.075977784 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Sep 30 22:04:54 compute-1 nova_compute[192795]: 2025-09-30 22:04:54.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:54 compute-1 nova_compute[192795]: 2025-09-30 22:04:54.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:59 compute-1 nova_compute[192795]: 2025-09-30 22:04:59.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:59 compute-1 nova_compute[192795]: 2025-09-30 22:04:59.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:01 compute-1 anacron[105422]: Job `cron.monthly' started
Sep 30 22:05:01 compute-1 anacron[105422]: Job `cron.monthly' terminated
Sep 30 22:05:01 compute-1 anacron[105422]: Normal exit (3 jobs run)
Sep 30 22:05:01 compute-1 podman[260080]: 2025-09-30 22:05:01.169502577 +0000 UTC m=+0.063073011 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 22:05:04 compute-1 nova_compute[192795]: 2025-09-30 22:05:04.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:04 compute-1 nova_compute[192795]: 2025-09-30 22:05:04.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:07 compute-1 podman[260103]: 2025-09-30 22:05:07.220461943 +0000 UTC m=+0.055228487 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:05:07 compute-1 podman[260101]: 2025-09-30 22:05:07.234953339 +0000 UTC m=+0.075472392 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:05:07 compute-1 podman[260102]: 2025-09-30 22:05:07.28646676 +0000 UTC m=+0.117563629 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, io.buildah.version=1.41.3)
Sep 30 22:05:08 compute-1 nova_compute[192795]: 2025-09-30 22:05:08.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:08 compute-1 nova_compute[192795]: 2025-09-30 22:05:08.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 22:05:09 compute-1 nova_compute[192795]: 2025-09-30 22:05:09.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:09 compute-1 nova_compute[192795]: 2025-09-30 22:05:09.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:12 compute-1 nova_compute[192795]: 2025-09-30 22:05:12.730 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:14 compute-1 nova_compute[192795]: 2025-09-30 22:05:14.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:14 compute-1 nova_compute[192795]: 2025-09-30 22:05:14.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:16 compute-1 podman[260173]: 2025-09-30 22:05:16.257727947 +0000 UTC m=+0.087928743 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:05:16 compute-1 nova_compute[192795]: 2025-09-30 22:05:16.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:16 compute-1 nova_compute[192795]: 2025-09-30 22:05:16.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:16 compute-1 nova_compute[192795]: 2025-09-30 22:05:16.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:05:17 compute-1 nova_compute[192795]: 2025-09-30 22:05:17.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.288 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.289 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.289 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.290 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.481 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.482 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5682MB free_disk=73.29140090942383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.482 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.483 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.550 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.550 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.592 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.608 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.610 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:05:18 compute-1 nova_compute[192795]: 2025-09-30 22:05:18.610 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:05:19 compute-1 nova_compute[192795]: 2025-09-30 22:05:19.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:19 compute-1 nova_compute[192795]: 2025-09-30 22:05:19.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:21 compute-1 nova_compute[192795]: 2025-09-30 22:05:21.610 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:21 compute-1 nova_compute[192795]: 2025-09-30 22:05:21.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:22 compute-1 podman[260194]: 2025-09-30 22:05:22.246534469 +0000 UTC m=+0.072187986 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:05:22 compute-1 podman[260193]: 2025-09-30 22:05:22.262551364 +0000 UTC m=+0.088506120 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Sep 30 22:05:22 compute-1 podman[260195]: 2025-09-30 22:05:22.287386395 +0000 UTC m=+0.104830490 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 22:05:23 compute-1 nova_compute[192795]: 2025-09-30 22:05:23.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:24 compute-1 nova_compute[192795]: 2025-09-30 22:05:24.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:24 compute-1 nova_compute[192795]: 2025-09-30 22:05:24.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:29 compute-1 nova_compute[192795]: 2025-09-30 22:05:29.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:29 compute-1 nova_compute[192795]: 2025-09-30 22:05:29.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:30 compute-1 nova_compute[192795]: 2025-09-30 22:05:30.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:31 compute-1 nova_compute[192795]: 2025-09-30 22:05:31.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:31 compute-1 nova_compute[192795]: 2025-09-30 22:05:31.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:05:31 compute-1 nova_compute[192795]: 2025-09-30 22:05:31.695 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:05:31 compute-1 nova_compute[192795]: 2025-09-30 22:05:31.735 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:05:32 compute-1 podman[260256]: 2025-09-30 22:05:32.267549307 +0000 UTC m=+0.092533972 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 22:05:34 compute-1 nova_compute[192795]: 2025-09-30 22:05:34.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:34 compute-1 nova_compute[192795]: 2025-09-30 22:05:34.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 22:05:34 compute-1 nova_compute[192795]: 2025-09-30 22:05:34.738 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 22:05:34 compute-1 nova_compute[192795]: 2025-09-30 22:05:34.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:34 compute-1 nova_compute[192795]: 2025-09-30 22:05:34.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:38 compute-1 podman[260279]: 2025-09-30 22:05:38.240349895 +0000 UTC m=+0.073640663 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:05:38 compute-1 podman[260281]: 2025-09-30 22:05:38.241248999 +0000 UTC m=+0.064874857 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:05:38 compute-1 podman[260280]: 2025-09-30 22:05:38.30049158 +0000 UTC m=+0.121310796 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Sep 30 22:05:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:05:38.730 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:05:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:05:38.731 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:05:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:05:38.732 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:05:39 compute-1 nova_compute[192795]: 2025-09-30 22:05:39.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:39 compute-1 nova_compute[192795]: 2025-09-30 22:05:39.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:44 compute-1 nova_compute[192795]: 2025-09-30 22:05:44.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:44 compute-1 nova_compute[192795]: 2025-09-30 22:05:44.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:47 compute-1 podman[260348]: 2025-09-30 22:05:47.266275526 +0000 UTC m=+0.101332982 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 22:05:49 compute-1 nova_compute[192795]: 2025-09-30 22:05:49.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:49 compute-1 nova_compute[192795]: 2025-09-30 22:05:49.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:53 compute-1 podman[260370]: 2025-09-30 22:05:53.232343628 +0000 UTC m=+0.063099841 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 22:05:53 compute-1 podman[260369]: 2025-09-30 22:05:53.257400456 +0000 UTC m=+0.087976935 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container)
Sep 30 22:05:53 compute-1 podman[260371]: 2025-09-30 22:05:53.26146284 +0000 UTC m=+0.092321676 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 22:05:53 compute-1 nova_compute[192795]: 2025-09-30 22:05:53.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:54 compute-1 nova_compute[192795]: 2025-09-30 22:05:54.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:55 compute-1 nova_compute[192795]: 2025-09-30 22:05:55.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:59 compute-1 nova_compute[192795]: 2025-09-30 22:05:59.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:00 compute-1 nova_compute[192795]: 2025-09-30 22:06:00.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:03 compute-1 podman[260433]: 2025-09-30 22:06:03.226687726 +0000 UTC m=+0.068938353 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 22:06:04 compute-1 nova_compute[192795]: 2025-09-30 22:06:04.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:05 compute-1 nova_compute[192795]: 2025-09-30 22:06:05.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:09 compute-1 podman[260454]: 2025-09-30 22:06:09.203041946 +0000 UTC m=+0.048158345 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=multipathd)
Sep 30 22:06:09 compute-1 podman[260456]: 2025-09-30 22:06:09.238325877 +0000 UTC m=+0.070765469 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:06:09 compute-1 podman[260455]: 2025-09-30 22:06:09.280281832 +0000 UTC m=+0.119645443 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:06:09 compute-1 nova_compute[192795]: 2025-09-30 22:06:09.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:10 compute-1 nova_compute[192795]: 2025-09-30 22:06:10.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:12 compute-1 nova_compute[192795]: 2025-09-30 22:06:12.704 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:14 compute-1 nova_compute[192795]: 2025-09-30 22:06:14.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:15 compute-1 nova_compute[192795]: 2025-09-30 22:06:15.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:17 compute-1 nova_compute[192795]: 2025-09-30 22:06:17.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:17 compute-1 nova_compute[192795]: 2025-09-30 22:06:17.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:17 compute-1 nova_compute[192795]: 2025-09-30 22:06:17.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:06:18 compute-1 podman[260522]: 2025-09-30 22:06:18.235010633 +0000 UTC m=+0.066481191 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.725 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.726 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.726 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.726 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.918 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.919 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5668MB free_disk=73.29141998291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.919 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:06:18 compute-1 nova_compute[192795]: 2025-09-30 22:06:18.920 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:06:19 compute-1 nova_compute[192795]: 2025-09-30 22:06:19.006 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:06:19 compute-1 nova_compute[192795]: 2025-09-30 22:06:19.007 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:06:19 compute-1 nova_compute[192795]: 2025-09-30 22:06:19.127 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:06:19 compute-1 nova_compute[192795]: 2025-09-30 22:06:19.145 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:06:19 compute-1 nova_compute[192795]: 2025-09-30 22:06:19.146 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:06:19 compute-1 nova_compute[192795]: 2025-09-30 22:06:19.147 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:06:19 compute-1 nova_compute[192795]: 2025-09-30 22:06:19.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:20 compute-1 nova_compute[192795]: 2025-09-30 22:06:20.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:22 compute-1 nova_compute[192795]: 2025-09-30 22:06:22.147 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:23 compute-1 nova_compute[192795]: 2025-09-30 22:06:23.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:24 compute-1 podman[260544]: 2025-09-30 22:06:24.228897305 +0000 UTC m=+0.063004489 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 22:06:24 compute-1 podman[260543]: 2025-09-30 22:06:24.229168882 +0000 UTC m=+0.068858700 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:06:24 compute-1 podman[260542]: 2025-09-30 22:06:24.239232622 +0000 UTC m=+0.078241043 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 22:06:24 compute-1 nova_compute[192795]: 2025-09-30 22:06:24.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:24 compute-1 nova_compute[192795]: 2025-09-30 22:06:24.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:25 compute-1 nova_compute[192795]: 2025-09-30 22:06:25.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:29 compute-1 nova_compute[192795]: 2025-09-30 22:06:29.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:30 compute-1 nova_compute[192795]: 2025-09-30 22:06:30.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:30 compute-1 nova_compute[192795]: 2025-09-30 22:06:30.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:32 compute-1 nova_compute[192795]: 2025-09-30 22:06:32.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:32 compute-1 nova_compute[192795]: 2025-09-30 22:06:32.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:06:32 compute-1 nova_compute[192795]: 2025-09-30 22:06:32.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:06:32 compute-1 nova_compute[192795]: 2025-09-30 22:06:32.713 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:06:34 compute-1 podman[260602]: 2025-09-30 22:06:34.240906211 +0000 UTC m=+0.075770270 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, container_name=iscsid)
Sep 30 22:06:34 compute-1 nova_compute[192795]: 2025-09-30 22:06:34.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:35 compute-1 nova_compute[192795]: 2025-09-30 22:06:35.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:37 compute-1 nova_compute[192795]: 2025-09-30 22:06:37.709 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:06:38.732 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:06:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:06:38.732 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:06:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:06:38.732 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:06:39 compute-1 nova_compute[192795]: 2025-09-30 22:06:39.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:40 compute-1 nova_compute[192795]: 2025-09-30 22:06:40.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:40 compute-1 podman[260623]: 2025-09-30 22:06:40.210260149 +0000 UTC m=+0.055778663 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:06:40 compute-1 podman[260625]: 2025-09-30 22:06:40.215128694 +0000 UTC m=+0.053756150 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 22:06:40 compute-1 podman[260624]: 2025-09-30 22:06:40.246179578 +0000 UTC m=+0.087322639 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:06:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-1 nova_compute[192795]: 2025-09-30 22:06:44.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:45 compute-1 nova_compute[192795]: 2025-09-30 22:06:45.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:49 compute-1 podman[260692]: 2025-09-30 22:06:49.220387241 +0000 UTC m=+0.065003991 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 22:06:49 compute-1 nova_compute[192795]: 2025-09-30 22:06:49.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:50 compute-1 nova_compute[192795]: 2025-09-30 22:06:50.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:54 compute-1 nova_compute[192795]: 2025-09-30 22:06:54.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:55 compute-1 nova_compute[192795]: 2025-09-30 22:06:55.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:55 compute-1 podman[260715]: 2025-09-30 22:06:55.221808666 +0000 UTC m=+0.052824455 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:06:55 compute-1 podman[260714]: 2025-09-30 22:06:55.238101298 +0000 UTC m=+0.069310443 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=)
Sep 30 22:06:55 compute-1 podman[260716]: 2025-09-30 22:06:55.244423141 +0000 UTC m=+0.065629347 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Sep 30 22:06:59 compute-1 nova_compute[192795]: 2025-09-30 22:06:59.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:00 compute-1 nova_compute[192795]: 2025-09-30 22:07:00.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:04 compute-1 nova_compute[192795]: 2025-09-30 22:07:04.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:05 compute-1 nova_compute[192795]: 2025-09-30 22:07:05.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:05 compute-1 podman[260776]: 2025-09-30 22:07:05.212658854 +0000 UTC m=+0.055323651 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:07:09 compute-1 nova_compute[192795]: 2025-09-30 22:07:09.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:10 compute-1 nova_compute[192795]: 2025-09-30 22:07:10.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:11 compute-1 podman[260798]: 2025-09-30 22:07:11.209538605 +0000 UTC m=+0.048075073 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:07:11 compute-1 podman[260796]: 2025-09-30 22:07:11.209621037 +0000 UTC m=+0.055336001 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd)
Sep 30 22:07:11 compute-1 podman[260797]: 2025-09-30 22:07:11.244280083 +0000 UTC m=+0.086954668 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:07:14 compute-1 nova_compute[192795]: 2025-09-30 22:07:14.705 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:14 compute-1 nova_compute[192795]: 2025-09-30 22:07:14.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:15 compute-1 nova_compute[192795]: 2025-09-30 22:07:15.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:18 compute-1 nova_compute[192795]: 2025-09-30 22:07:18.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:19 compute-1 nova_compute[192795]: 2025-09-30 22:07:19.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:19 compute-1 nova_compute[192795]: 2025-09-30 22:07:19.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:07:19 compute-1 nova_compute[192795]: 2025-09-30 22:07:19.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:20 compute-1 podman[260864]: 2025-09-30 22:07:20.213750974 +0000 UTC m=+0.059906519 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.790 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.790 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.790 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.790 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.924 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.925 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5682MB free_disk=73.29140090942383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.925 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:07:20 compute-1 nova_compute[192795]: 2025-09-30 22:07:20.926 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.121 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.122 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.267 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing inventories for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.286 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating ProviderTree inventory for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.287 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Updating inventory in ProviderTree for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.303 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing aggregate associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.361 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Refreshing trait associations for resource provider e551d5b4-e9f6-409e-b2a1-508a20c11333, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.392 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.417 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.418 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:07:21 compute-1 nova_compute[192795]: 2025-09-30 22:07:21.418 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:07:24 compute-1 nova_compute[192795]: 2025-09-30 22:07:24.419 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:24 compute-1 nova_compute[192795]: 2025-09-30 22:07:24.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:25 compute-1 nova_compute[192795]: 2025-09-30 22:07:25.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:25 compute-1 nova_compute[192795]: 2025-09-30 22:07:25.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:26 compute-1 podman[260886]: 2025-09-30 22:07:26.212180184 +0000 UTC m=+0.054299774 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:07:26 compute-1 podman[260885]: 2025-09-30 22:07:26.219325079 +0000 UTC m=+0.065506814 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container)
Sep 30 22:07:26 compute-1 podman[260887]: 2025-09-30 22:07:26.236107052 +0000 UTC m=+0.075302947 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 22:07:26 compute-1 nova_compute[192795]: 2025-09-30 22:07:26.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:29 compute-1 nova_compute[192795]: 2025-09-30 22:07:29.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:30 compute-1 nova_compute[192795]: 2025-09-30 22:07:30.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:30 compute-1 nova_compute[192795]: 2025-09-30 22:07:30.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:34 compute-1 nova_compute[192795]: 2025-09-30 22:07:34.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:34 compute-1 nova_compute[192795]: 2025-09-30 22:07:34.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:07:34 compute-1 nova_compute[192795]: 2025-09-30 22:07:34.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:07:34 compute-1 nova_compute[192795]: 2025-09-30 22:07:34.726 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:07:34 compute-1 nova_compute[192795]: 2025-09-30 22:07:34.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:35 compute-1 nova_compute[192795]: 2025-09-30 22:07:35.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:36 compute-1 podman[260946]: 2025-09-30 22:07:36.215326188 +0000 UTC m=+0.061804939 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 22:07:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:07:38.733 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:07:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:07:38.733 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:07:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:07:38.733 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:07:39 compute-1 nova_compute[192795]: 2025-09-30 22:07:39.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:40 compute-1 nova_compute[192795]: 2025-09-30 22:07:40.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:42 compute-1 podman[260968]: 2025-09-30 22:07:42.206250765 +0000 UTC m=+0.047011316 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:07:42 compute-1 podman[260966]: 2025-09-30 22:07:42.214825426 +0000 UTC m=+0.060335900 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:07:42 compute-1 podman[260967]: 2025-09-30 22:07:42.247133911 +0000 UTC m=+0.089505663 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:07:44 compute-1 nova_compute[192795]: 2025-09-30 22:07:44.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:45 compute-1 nova_compute[192795]: 2025-09-30 22:07:45.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:49 compute-1 nova_compute[192795]: 2025-09-30 22:07:49.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:50 compute-1 nova_compute[192795]: 2025-09-30 22:07:50.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:51 compute-1 podman[261030]: 2025-09-30 22:07:51.215623328 +0000 UTC m=+0.058198365 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Sep 30 22:07:54 compute-1 nova_compute[192795]: 2025-09-30 22:07:54.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:55 compute-1 nova_compute[192795]: 2025-09-30 22:07:55.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:57 compute-1 podman[261051]: 2025-09-30 22:07:57.219229992 +0000 UTC m=+0.055230189 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 22:07:57 compute-1 podman[261050]: 2025-09-30 22:07:57.222736022 +0000 UTC m=+0.063082822 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Sep 30 22:07:57 compute-1 podman[261052]: 2025-09-30 22:07:57.23001429 +0000 UTC m=+0.057345354 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:07:59 compute-1 nova_compute[192795]: 2025-09-30 22:07:59.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:00 compute-1 nova_compute[192795]: 2025-09-30 22:08:00.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:04 compute-1 nova_compute[192795]: 2025-09-30 22:08:04.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:05 compute-1 nova_compute[192795]: 2025-09-30 22:08:05.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:07 compute-1 podman[261110]: 2025-09-30 22:08:07.214893563 +0000 UTC m=+0.058139014 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 22:08:10 compute-1 nova_compute[192795]: 2025-09-30 22:08:10.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:10 compute-1 nova_compute[192795]: 2025-09-30 22:08:10.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:13 compute-1 podman[261130]: 2025-09-30 22:08:13.237370143 +0000 UTC m=+0.076276901 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd)
Sep 30 22:08:13 compute-1 podman[261132]: 2025-09-30 22:08:13.253630234 +0000 UTC m=+0.071992172 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 22:08:13 compute-1 podman[261131]: 2025-09-30 22:08:13.317429832 +0000 UTC m=+0.146676192 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller)
Sep 30 22:08:14 compute-1 nova_compute[192795]: 2025-09-30 22:08:14.721 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:15 compute-1 nova_compute[192795]: 2025-09-30 22:08:15.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:15 compute-1 nova_compute[192795]: 2025-09-30 22:08:15.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.721 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.722 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.722 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.722 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.875 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.876 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5674MB free_disk=73.29141998291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.876 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:08:20 compute-1 nova_compute[192795]: 2025-09-30 22:08:20.876 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:08:21 compute-1 nova_compute[192795]: 2025-09-30 22:08:21.356 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:08:21 compute-1 nova_compute[192795]: 2025-09-30 22:08:21.357 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:08:21 compute-1 nova_compute[192795]: 2025-09-30 22:08:21.441 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:08:21 compute-1 nova_compute[192795]: 2025-09-30 22:08:21.559 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:08:21 compute-1 nova_compute[192795]: 2025-09-30 22:08:21.562 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:08:21 compute-1 nova_compute[192795]: 2025-09-30 22:08:21.562 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:08:22 compute-1 podman[261200]: 2025-09-30 22:08:22.230463985 +0000 UTC m=+0.069252971 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 22:08:25 compute-1 nova_compute[192795]: 2025-09-30 22:08:25.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:25 compute-1 nova_compute[192795]: 2025-09-30 22:08:25.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:26 compute-1 nova_compute[192795]: 2025-09-30 22:08:26.563 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:26 compute-1 nova_compute[192795]: 2025-09-30 22:08:26.564 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:26 compute-1 nova_compute[192795]: 2025-09-30 22:08:26.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:28 compute-1 podman[261221]: 2025-09-30 22:08:28.21128295 +0000 UTC m=+0.051502622 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:08:28 compute-1 podman[261220]: 2025-09-30 22:08:28.211327331 +0000 UTC m=+0.056280816 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Sep 30 22:08:28 compute-1 podman[261222]: 2025-09-30 22:08:28.238153914 +0000 UTC m=+0.074497096 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 22:08:30 compute-1 nova_compute[192795]: 2025-09-30 22:08:30.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:30 compute-1 nova_compute[192795]: 2025-09-30 22:08:30.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:30 compute-1 nova_compute[192795]: 2025-09-30 22:08:30.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:34 compute-1 nova_compute[192795]: 2025-09-30 22:08:34.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:34 compute-1 nova_compute[192795]: 2025-09-30 22:08:34.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:08:34 compute-1 nova_compute[192795]: 2025-09-30 22:08:34.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:08:34 compute-1 nova_compute[192795]: 2025-09-30 22:08:34.706 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:08:35 compute-1 nova_compute[192795]: 2025-09-30 22:08:35.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:35 compute-1 nova_compute[192795]: 2025-09-30 22:08:35.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:38 compute-1 podman[261281]: 2025-09-30 22:08:38.202411183 +0000 UTC m=+0.050222448 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:08:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:08:38.734 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:08:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:08:38.734 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:08:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:08:38.735 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:08:39 compute-1 nova_compute[192795]: 2025-09-30 22:08:39.702 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:40 compute-1 nova_compute[192795]: 2025-09-30 22:08:40.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:40 compute-1 nova_compute[192795]: 2025-09-30 22:08:40.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:08:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-1 podman[261304]: 2025-09-30 22:08:44.205900745 +0000 UTC m=+0.044792329 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:08:44 compute-1 podman[261302]: 2025-09-30 22:08:44.213755268 +0000 UTC m=+0.056720297 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:08:44 compute-1 podman[261303]: 2025-09-30 22:08:44.241131505 +0000 UTC m=+0.083734934 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 22:08:45 compute-1 nova_compute[192795]: 2025-09-30 22:08:45.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:45 compute-1 nova_compute[192795]: 2025-09-30 22:08:45.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:08:50 compute-1 nova_compute[192795]: 2025-09-30 22:08:50.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:53 compute-1 podman[261373]: 2025-09-30 22:08:53.211265072 +0000 UTC m=+0.058616625 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:08:55 compute-1 nova_compute[192795]: 2025-09-30 22:08:55.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:08:55 compute-1 nova_compute[192795]: 2025-09-30 22:08:55.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:08:55 compute-1 nova_compute[192795]: 2025-09-30 22:08:55.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:08:55 compute-1 nova_compute[192795]: 2025-09-30 22:08:55.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:08:55 compute-1 nova_compute[192795]: 2025-09-30 22:08:55.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:55 compute-1 nova_compute[192795]: 2025-09-30 22:08:55.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:08:59 compute-1 podman[261394]: 2025-09-30 22:08:59.234342481 +0000 UTC m=+0.071887810 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 22:08:59 compute-1 podman[261395]: 2025-09-30 22:08:59.23588699 +0000 UTC m=+0.070290167 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:08:59 compute-1 podman[261393]: 2025-09-30 22:08:59.262211851 +0000 UTC m=+0.105013615 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9)
Sep 30 22:09:00 compute-1 nova_compute[192795]: 2025-09-30 22:09:00.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:00 compute-1 nova_compute[192795]: 2025-09-30 22:09:00.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:00 compute-1 nova_compute[192795]: 2025-09-30 22:09:00.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:09:00 compute-1 nova_compute[192795]: 2025-09-30 22:09:00.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:00 compute-1 nova_compute[192795]: 2025-09-30 22:09:00.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:00 compute-1 nova_compute[192795]: 2025-09-30 22:09:00.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:05 compute-1 nova_compute[192795]: 2025-09-30 22:09:05.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:05 compute-1 nova_compute[192795]: 2025-09-30 22:09:05.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:05 compute-1 nova_compute[192795]: 2025-09-30 22:09:05.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:09:05 compute-1 nova_compute[192795]: 2025-09-30 22:09:05.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:05 compute-1 nova_compute[192795]: 2025-09-30 22:09:05.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:05 compute-1 nova_compute[192795]: 2025-09-30 22:09:05.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:09 compute-1 podman[261456]: 2025-09-30 22:09:09.22448659 +0000 UTC m=+0.065721910 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 22:09:10 compute-1 nova_compute[192795]: 2025-09-30 22:09:10.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:12 compute-1 unix_chkpwd[261479]: password check failed for user (root)
Sep 30 22:09:12 compute-1 sshd-session[261477]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=164.92.202.181  user=root
Sep 30 22:09:14 compute-1 sshd-session[261477]: Failed password for root from 164.92.202.181 port 57656 ssh2
Sep 30 22:09:15 compute-1 nova_compute[192795]: 2025-09-30 22:09:15.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:15 compute-1 podman[261480]: 2025-09-30 22:09:15.222711924 +0000 UTC m=+0.059231593 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 22:09:15 compute-1 podman[261482]: 2025-09-30 22:09:15.229338466 +0000 UTC m=+0.058355420 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 22:09:15 compute-1 podman[261481]: 2025-09-30 22:09:15.262113962 +0000 UTC m=+0.095456218 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 22:09:15 compute-1 sshd-session[261477]: Received disconnect from 164.92.202.181 port 57656:11:  [preauth]
Sep 30 22:09:15 compute-1 sshd-session[261477]: Disconnected from authenticating user root 164.92.202.181 port 57656 [preauth]
Sep 30 22:09:16 compute-1 nova_compute[192795]: 2025-09-30 22:09:16.713 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.694 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.747 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.747 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.748 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.748 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.903 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.904 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5673MB free_disk=73.29141998291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.904 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.905 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.960 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.960 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:09:20 compute-1 nova_compute[192795]: 2025-09-30 22:09:20.980 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:09:21 compute-1 nova_compute[192795]: 2025-09-30 22:09:21.009 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:09:21 compute-1 nova_compute[192795]: 2025-09-30 22:09:21.011 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:09:21 compute-1 nova_compute[192795]: 2025-09-30 22:09:21.011 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:09:23 compute-1 nova_compute[192795]: 2025-09-30 22:09:23.011 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:24 compute-1 podman[261547]: 2025-09-30 22:09:24.209539784 +0000 UTC m=+0.055747192 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:09:25 compute-1 nova_compute[192795]: 2025-09-30 22:09:25.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:25 compute-1 nova_compute[192795]: 2025-09-30 22:09:25.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:25 compute-1 nova_compute[192795]: 2025-09-30 22:09:25.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:09:25 compute-1 nova_compute[192795]: 2025-09-30 22:09:25.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:25 compute-1 nova_compute[192795]: 2025-09-30 22:09:25.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:25 compute-1 nova_compute[192795]: 2025-09-30 22:09:25.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:25 compute-1 nova_compute[192795]: 2025-09-30 22:09:25.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:27 compute-1 nova_compute[192795]: 2025-09-30 22:09:27.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:27 compute-1 nova_compute[192795]: 2025-09-30 22:09:27.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:28 compute-1 nova_compute[192795]: 2025-09-30 22:09:28.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:30 compute-1 nova_compute[192795]: 2025-09-30 22:09:30.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:30 compute-1 nova_compute[192795]: 2025-09-30 22:09:30.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:30 compute-1 nova_compute[192795]: 2025-09-30 22:09:30.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:09:30 compute-1 nova_compute[192795]: 2025-09-30 22:09:30.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:30 compute-1 nova_compute[192795]: 2025-09-30 22:09:30.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:30 compute-1 nova_compute[192795]: 2025-09-30 22:09:30.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:30 compute-1 podman[261568]: 2025-09-30 22:09:30.221172537 +0000 UTC m=+0.058493163 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 22:09:30 compute-1 podman[261567]: 2025-09-30 22:09:30.221416443 +0000 UTC m=+0.063339917 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Sep 30 22:09:30 compute-1 podman[261569]: 2025-09-30 22:09:30.236209756 +0000 UTC m=+0.069117268 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 22:09:30 compute-1 nova_compute[192795]: 2025-09-30 22:09:30.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:35 compute-1 nova_compute[192795]: 2025-09-30 22:09:35.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:35 compute-1 nova_compute[192795]: 2025-09-30 22:09:35.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:35 compute-1 nova_compute[192795]: 2025-09-30 22:09:35.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:35 compute-1 nova_compute[192795]: 2025-09-30 22:09:35.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:09:35 compute-1 nova_compute[192795]: 2025-09-30 22:09:35.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:09:35 compute-1 nova_compute[192795]: 2025-09-30 22:09:35.716 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:09:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:09:38.735 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:09:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:09:38.736 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:09:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:09:38.736 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:09:40 compute-1 nova_compute[192795]: 2025-09-30 22:09:40.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:40 compute-1 nova_compute[192795]: 2025-09-30 22:09:40.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:40 compute-1 nova_compute[192795]: 2025-09-30 22:09:40.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:09:40 compute-1 nova_compute[192795]: 2025-09-30 22:09:40.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:40 compute-1 nova_compute[192795]: 2025-09-30 22:09:40.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:40 compute-1 nova_compute[192795]: 2025-09-30 22:09:40.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:09:40 compute-1 podman[261627]: 2025-09-30 22:09:40.226385975 +0000 UTC m=+0.070720969 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:09:45 compute-1 nova_compute[192795]: 2025-09-30 22:09:45.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:46 compute-1 podman[261651]: 2025-09-30 22:09:46.228654204 +0000 UTC m=+0.055965117 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:09:46 compute-1 podman[261649]: 2025-09-30 22:09:46.261441321 +0000 UTC m=+0.102370296 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 22:09:46 compute-1 podman[261650]: 2025-09-30 22:09:46.267435656 +0000 UTC m=+0.098983419 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Sep 30 22:09:50 compute-1 nova_compute[192795]: 2025-09-30 22:09:50.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:55 compute-1 nova_compute[192795]: 2025-09-30 22:09:55.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:09:55 compute-1 podman[261717]: 2025-09-30 22:09:55.224609721 +0000 UTC m=+0.062736003 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 22:10:00 compute-1 nova_compute[192795]: 2025-09-30 22:10:00.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:10:00 compute-1 nova_compute[192795]: 2025-09-30 22:10:00.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:00 compute-1 nova_compute[192795]: 2025-09-30 22:10:00.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:10:00 compute-1 nova_compute[192795]: 2025-09-30 22:10:00.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:00 compute-1 nova_compute[192795]: 2025-09-30 22:10:00.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:00 compute-1 nova_compute[192795]: 2025-09-30 22:10:00.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:01 compute-1 podman[261739]: 2025-09-30 22:10:01.218879142 +0000 UTC m=+0.050568677 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 22:10:01 compute-1 podman[261737]: 2025-09-30 22:10:01.242599585 +0000 UTC m=+0.078819798 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 22:10:01 compute-1 podman[261738]: 2025-09-30 22:10:01.250068418 +0000 UTC m=+0.076378334 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:10:05 compute-1 nova_compute[192795]: 2025-09-30 22:10:05.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:08 compute-1 sshd-session[261802]: Invalid user config from 80.94.95.116 port 50948
Sep 30 22:10:08 compute-1 sshd-session[261802]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 22:10:08 compute-1 sshd-session[261802]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116
Sep 30 22:10:09 compute-1 nova_compute[192795]: 2025-09-30 22:10:09.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:09 compute-1 nova_compute[192795]: 2025-09-30 22:10:09.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 22:10:10 compute-1 nova_compute[192795]: 2025-09-30 22:10:10.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:10:10 compute-1 nova_compute[192795]: 2025-09-30 22:10:10.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:10 compute-1 nova_compute[192795]: 2025-09-30 22:10:10.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:10:10 compute-1 nova_compute[192795]: 2025-09-30 22:10:10.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:10 compute-1 nova_compute[192795]: 2025-09-30 22:10:10.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:10 compute-1 nova_compute[192795]: 2025-09-30 22:10:10.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:11 compute-1 podman[261804]: 2025-09-30 22:10:11.206127517 +0000 UTC m=+0.052492588 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 22:10:11 compute-1 sshd-session[261802]: Failed password for invalid user config from 80.94.95.116 port 50948 ssh2
Sep 30 22:10:13 compute-1 sshd-session[261802]: Connection closed by invalid user config 80.94.95.116 port 50948 [preauth]
Sep 30 22:10:15 compute-1 nova_compute[192795]: 2025-09-30 22:10:15.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:17 compute-1 podman[261824]: 2025-09-30 22:10:17.251584293 +0000 UTC m=+0.083408617 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 22:10:17 compute-1 podman[261826]: 2025-09-30 22:10:17.25999662 +0000 UTC m=+0.075013720 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 22:10:17 compute-1 podman[261825]: 2025-09-30 22:10:17.261720274 +0000 UTC m=+0.088071307 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:10:17 compute-1 nova_compute[192795]: 2025-09-30 22:10:17.709 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:20 compute-1 nova_compute[192795]: 2025-09-30 22:10:20.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:10:20 compute-1 nova_compute[192795]: 2025-09-30 22:10:20.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:20 compute-1 nova_compute[192795]: 2025-09-30 22:10:20.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:10:20 compute-1 nova_compute[192795]: 2025-09-30 22:10:20.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:20 compute-1 nova_compute[192795]: 2025-09-30 22:10:20.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:20 compute-1 nova_compute[192795]: 2025-09-30 22:10:20.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:21 compute-1 nova_compute[192795]: 2025-09-30 22:10:21.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:21 compute-1 nova_compute[192795]: 2025-09-30 22:10:21.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.731 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.731 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.731 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.732 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.902 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.903 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5675MB free_disk=73.29141998291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.903 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.903 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.984 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:10:22 compute-1 nova_compute[192795]: 2025-09-30 22:10:22.984 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:10:23 compute-1 nova_compute[192795]: 2025-09-30 22:10:23.011 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:10:23 compute-1 nova_compute[192795]: 2025-09-30 22:10:23.028 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:10:23 compute-1 nova_compute[192795]: 2025-09-30 22:10:23.030 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:10:23 compute-1 nova_compute[192795]: 2025-09-30 22:10:23.030 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:10:24 compute-1 nova_compute[192795]: 2025-09-30 22:10:24.030 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:25 compute-1 nova_compute[192795]: 2025-09-30 22:10:25.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:10:25 compute-1 nova_compute[192795]: 2025-09-30 22:10:25.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:10:25 compute-1 nova_compute[192795]: 2025-09-30 22:10:25.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:10:25 compute-1 nova_compute[192795]: 2025-09-30 22:10:25.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:25 compute-1 nova_compute[192795]: 2025-09-30 22:10:25.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:25 compute-1 nova_compute[192795]: 2025-09-30 22:10:25.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:25 compute-1 nova_compute[192795]: 2025-09-30 22:10:25.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:26 compute-1 podman[261891]: 2025-09-30 22:10:26.215137443 +0000 UTC m=+0.063335096 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:10:27 compute-1 nova_compute[192795]: 2025-09-30 22:10:27.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:30 compute-1 nova_compute[192795]: 2025-09-30 22:10:30.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:10:30 compute-1 nova_compute[192795]: 2025-09-30 22:10:30.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:31 compute-1 nova_compute[192795]: 2025-09-30 22:10:31.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:32 compute-1 podman[261912]: 2025-09-30 22:10:32.217834335 +0000 UTC m=+0.052241510 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:10:32 compute-1 podman[261911]: 2025-09-30 22:10:32.248516906 +0000 UTC m=+0.090893804 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm)
Sep 30 22:10:32 compute-1 podman[261913]: 2025-09-30 22:10:32.25952602 +0000 UTC m=+0.078840171 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 22:10:35 compute-1 nova_compute[192795]: 2025-09-30 22:10:35.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:10:35 compute-1 nova_compute[192795]: 2025-09-30 22:10:35.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:35 compute-1 nova_compute[192795]: 2025-09-30 22:10:35.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:10:35 compute-1 nova_compute[192795]: 2025-09-30 22:10:35.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:35 compute-1 nova_compute[192795]: 2025-09-30 22:10:35.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:10:35 compute-1 nova_compute[192795]: 2025-09-30 22:10:35.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:37 compute-1 nova_compute[192795]: 2025-09-30 22:10:37.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:37 compute-1 nova_compute[192795]: 2025-09-30 22:10:37.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:10:37 compute-1 nova_compute[192795]: 2025-09-30 22:10:37.694 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:10:37 compute-1 nova_compute[192795]: 2025-09-30 22:10:37.715 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:10:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:10:38.736 103861 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:10:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:10:38.737 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:10:38 compute-1 ovn_metadata_agent[103856]: 2025-09-30 22:10:38.737 103861 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:10:40 compute-1 nova_compute[192795]: 2025-09-30 22:10:40.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:42 compute-1 podman[261976]: 2025-09-30 22:10:42.25703091 +0000 UTC m=+0.081111362 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid)
Sep 30 22:10:43 compute-1 nova_compute[192795]: 2025-09-30 22:10:43.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:43 compute-1 nova_compute[192795]: 2025-09-30 22:10:43.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 22:10:43 compute-1 nova_compute[192795]: 2025-09-30 22:10:43.716 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.038 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 ceilometer_agent_compute[203603]: 2025-09-30 22:10:44.040 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-1 nova_compute[192795]: 2025-09-30 22:10:44.712 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:45 compute-1 nova_compute[192795]: 2025-09-30 22:10:45.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:48 compute-1 podman[262000]: 2025-09-30 22:10:48.264621892 +0000 UTC m=+0.082450127 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:10:48 compute-1 podman[261999]: 2025-09-30 22:10:48.297718888 +0000 UTC m=+0.111288289 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 22:10:48 compute-1 podman[261998]: 2025-09-30 22:10:48.305357243 +0000 UTC m=+0.126731104 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 22:10:50 compute-1 nova_compute[192795]: 2025-09-30 22:10:50.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:55 compute-1 nova_compute[192795]: 2025-09-30 22:10:55.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:56 compute-1 nova_compute[192795]: 2025-09-30 22:10:56.693 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:57 compute-1 podman[262069]: 2025-09-30 22:10:57.235335131 +0000 UTC m=+0.070611170 container health_status 5687e840c6349d3057b198384a362b3d1b31117b51bd44509143b7dd7cfe06c4 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Sep 30 22:11:00 compute-1 nova_compute[192795]: 2025-09-30 22:11:00.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:00 compute-1 nova_compute[192795]: 2025-09-30 22:11:00.655 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:11:03 compute-1 podman[262093]: 2025-09-30 22:11:03.256249659 +0000 UTC m=+0.080634799 container health_status 77f8342058597df4b970bce682ee986765c161eb55b94fe536375bb8e84dee4b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:11:03 compute-1 podman[262092]: 2025-09-30 22:11:03.259187508 +0000 UTC m=+0.095210419 container health_status 432462e5a87db576f68bbda1f17c4c209c5428a3aa4a0854cbcf46f6e3872a49 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 22:11:03 compute-1 podman[262094]: 2025-09-30 22:11:03.282420069 +0000 UTC m=+0.098405484 container health_status ee002b21b01eab798acf5758512e0ce037539f62cbdb0e2316f281e897536b89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 22:11:05 compute-1 nova_compute[192795]: 2025-09-30 22:11:05.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:07 compute-1 podman[207048]: time="2025-09-30T22:11:07Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 22:11:07 compute-1 podman[207048]: @ - - [30/Sep/2025:22:11:07 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 25330 "" "Go-http-client/1.1"
Sep 30 22:11:09 compute-1 sshd-session[262156]: Accepted publickey for zuul from 192.168.122.10 port 56700 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 22:11:09 compute-1 systemd-logind[793]: New session 71 of user zuul.
Sep 30 22:11:09 compute-1 systemd[1]: Started Session 71 of User zuul.
Sep 30 22:11:09 compute-1 sshd-session[262156]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 22:11:09 compute-1 sudo[262160]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 22:11:09 compute-1 sudo[262160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 22:11:10 compute-1 nova_compute[192795]: 2025-09-30 22:11:10.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:13 compute-1 podman[262303]: 2025-09-30 22:11:13.229278985 +0000 UTC m=+0.064482458 container health_status bcdf7de313e64e81ac28f638877cae05c4582d800a15dc98cdbd7b04ac3e64fa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid)
Sep 30 22:11:14 compute-1 ovs-vsctl[262349]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 22:11:15 compute-1 virtqemud[192217]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 22:11:15 compute-1 virtqemud[192217]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 22:11:15 compute-1 virtqemud[192217]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 22:11:15 compute-1 nova_compute[192795]: 2025-09-30 22:11:15.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:11:15 compute-1 nova_compute[192795]: 2025-09-30 22:11:15.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 22:11:15 compute-1 nova_compute[192795]: 2025-09-30 22:11:15.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5008 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Sep 30 22:11:15 compute-1 nova_compute[192795]: 2025-09-30 22:11:15.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:11:15 compute-1 nova_compute[192795]: 2025-09-30 22:11:15.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:15 compute-1 nova_compute[192795]: 2025-09-30 22:11:15.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 22:11:16 compute-1 crontab[262762]: (root) LIST (root)
Sep 30 22:11:18 compute-1 systemd[1]: Starting Hostname Service...
Sep 30 22:11:18 compute-1 podman[262895]: 2025-09-30 22:11:18.788355923 +0000 UTC m=+0.086345273 container health_status a47f49a611811ffe2e99b8173b597c9801f6fe5b0f67750b7e0f887feffa45bc (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:11:18 compute-1 podman[262891]: 2025-09-30 22:11:18.817991936 +0000 UTC m=+0.120817915 container health_status 37ed7307dd768bdac00849efdaa627498b8374c1d33df29a916ef18c69acf804 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Sep 30 22:11:18 compute-1 systemd[1]: Started Hostname Service.
Sep 30 22:11:18 compute-1 podman[262893]: 2025-09-30 22:11:18.844554077 +0000 UTC m=+0.148719852 container health_status 4a3f5fd27d99c764312b702c5cb6d86d90d2f6cd89ceed17af3710ef28d07489 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:11:19 compute-1 nova_compute[192795]: 2025-09-30 22:11:19.691 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:11:20 compute-1 nova_compute[192795]: 2025-09-30 22:11:20.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:21 compute-1 nova_compute[192795]: 2025-09-30 22:11:21.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:11:21 compute-1 nova_compute[192795]: 2025-09-30 22:11:21.693 2 DEBUG nova.compute.manager [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.692 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.723 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.723 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.723 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.723 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.894 2 WARNING nova.virt.libvirt.driver [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.895 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5434MB free_disk=72.96331024169922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.895 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.896 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.987 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:11:22 compute-1 nova_compute[192795]: 2025-09-30 22:11:22.987 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:11:23 compute-1 nova_compute[192795]: 2025-09-30 22:11:23.013 2 DEBUG nova.compute.provider_tree [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed in ProviderTree for provider: e551d5b4-e9f6-409e-b2a1-508a20c11333 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:11:23 compute-1 nova_compute[192795]: 2025-09-30 22:11:23.027 2 DEBUG nova.scheduler.client.report [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Inventory has not changed for provider e551d5b4-e9f6-409e-b2a1-508a20c11333 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:11:23 compute-1 nova_compute[192795]: 2025-09-30 22:11:23.052 2 DEBUG nova.compute.resource_tracker [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:11:23 compute-1 nova_compute[192795]: 2025-09-30 22:11:23.052 2 DEBUG oslo_concurrency.lockutils [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:11:25 compute-1 nova_compute[192795]: 2025-09-30 22:11:25.052 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:11:25 compute-1 nova_compute[192795]: 2025-09-30 22:11:25.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:25 compute-1 nova_compute[192795]: 2025-09-30 22:11:25.695 2 DEBUG oslo_service.periodic_task [None req-9cf95456-689d-4329-8b2d-c152894d5231 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:11:26 compute-1 ovs-appctl[263893]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:11:26 compute-1 ovs-appctl[263897]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:11:26 compute-1 ovs-appctl[263902]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
